CONTRARIA

The Trouble With Enterprise Software

Has enterprise software become too complex to be effective?

Cynthia Rettig

This article, originally published on the MIT Sloan Management Review website, stirred up a good deal of discussion in the blogosphere, a sampling of which is included in the following pages.
Technology has always been about hope. Since the beginning of the industrial revolution, businesses have embraced new technologies enthusiastically, and their optimism has been rewarded with improved processes, lower costs and reduced
workforces. As the pace of technological
innovation has intensified over the past two decades,
businesses have come to expect that the next
new thing will inevitably bring them larger market
opportunities and bigger profits. Software, a technology
so invisible and obscure to most of us that it
appears to work like magic, especially lends itself to
this kind of open-ended hope.
Software promises evolutions, revolutions and
even transformations in how companies do business.
The triumphant vision many buy into is that
enterprise software in large organizations is fully
integrated and intelligently controls infinitely complex
business processes while remaining flexible
enough to adapt to changing business needs. This
vision of software lies at the core of what Thomas
Friedman in The World Is Flat calls “the Wal-Mart
Symphony in multiple movements — with no finale.
It just plays over and over 24/7/365.”1 Whole
systems march in lock step, providing synchronized,
fully coordinated supply chains, production
lines and services, just like a world class orchestra.
From online web orders through fulfillment, delivery,
billing and customer service — the entire
enterprise, organized end to end — that has been
the promise. The age of smart machines would
seem to be upon us.
Or is it? While a few companies like Wal-Mart
Stores Inc. have achieved something close to that
ideal, the way most large organizations actually
process information belies that glorious vision and
reveals a looking-glass world, where everything is
in fact the opposite of what one might expect. Back
office systems — including both software applications
and the data they process — are a variegated
patchwork of systems, containing 50 or more databases
and hundreds of separate software programs
installed over decades and interconnected by idiosyncratic,
Byzantine and poorly documented
customized processes. To manage this growing
complexity, IT departments have grown substantially:
As a percentage of total investment, IT rose
from 2.6% to 3.5% between 1970 and 1980.2 By
1990 IT consumed 9%, and by 1999 a whopping
22% of total investment went to IT. Growth in IT
spending has fallen off, but it is nonetheless surprising
to hear that today’s IT departments spend
70% to 80% of their budgets just trying to keep existing
systems running.
According to a multiyear study of over 400 companies
by MIT researchers Jeanne Ross, Peter Weill
and David Robertson,3 IT departments tend not to
be innovative leaders within organizations, but
rather conservative forces, viewed by business executives
as cost sinks and liabilities. In many
companies, it takes the IT department one to two
years to implement a new strategic initiative, which is hardly the agility companies are striving for. The
research shows the typical IT structure is so dense
and extensive that it’s often a miracle that it works
at all. The researchers observe: “Legacy systems
cobbled together to respond to each new business
initiative create rigidity and excessive costs. Every
change becomes a risky, expensive venture.”
The Proliferation of Complexity
How did this happen? James W. Cortada, who has
written extensively about the information economy,
points out that as work became more complex
and specialized over the 20th century, the use of
data — numbers and facts — as fodder for more
and more analysis and fact-based decision making
intensified. And digital technology “was perfect for
this kind of world.”4 Of course, digital technology
not only supported that complexity but also played
a large part in actually creating it, weaving a continuous
web of unending data. “More computers
are better than fewer” remains a key belief of American
business, Cortada says. “There are no limits to
how much is good.” Management became accustomed
to the idea that buying more computers and
more software would continue to cut costs and improve
operations.
But there are limits, some of which are inherent
in the nature of software itself. Software is code,
lines and lines of code that runs sequentially. Building
software programs entails accumulating more
and more code. Much of the seemingly boundless
complexity of enterprise software is founded on
conditional branching (if-then statements) and a
hierarchy of interacting objects, all of which manipulate
information in a logical succession of
small steps. Each step contains explicit instructions.
To build software, programmers routinely
break down processes into discrete steps, effectively
systematizing and standardizing how work is done.
An entire sequence of such instructions works
more like a calculator than a “thinking machine.”
Thus the so-called intelligence of digital technology
arises not through magic, nor, in more
contemporary terms, through some emergent or
self-organizing principle, as some would believe.
The result is not greater than the sum of the parts.
Rather, it’s more akin to Adam Smith’s division of
labor and Frederick Taylor’s scientific management,
a process dependent on relentless analysis
and rationalization of the work to be done.
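To see what “explicit instructions in a logical succession of small steps” means in practice, consider a deliberately simple sketch in Python. Everything here, the field names, the thresholds and the rules themselves, is invented for illustration; the point is only that each business judgment must be spelled out as a branch.

# Hypothetical order-approval step: every business rule is an explicit branch.
# Field names and thresholds are invented for illustration.

def approve_order(order: dict) -> str:
    """Route a single order by walking through explicit business rules."""
    if order["quantity"] <= 0:
        return "reject: invalid quantity"
    if not order["customer_id"]:
        return "reject: unknown customer"
    if order["credit_hold"]:
        return "hold: customer on credit hold"
    if order["total"] > 50_000:
        return "route: manager approval required"
    return "approve: send to fulfillment"

print(approve_order({"quantity": 10, "customer_id": "C-001",
                     "credit_hold": False, "total": 1_200}))
# -> approve: send to fulfillment

Hand this routine an order with a field the rules never anticipated and it does not improvise; it raises an error. That is the sense in which the machine calculates rather than thinks.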
General software programming used in enterprise
systems may contain intricate branching and
handle a huge number of conditions, all of which
allows it to control a certain amount of complexity.
It does not, however, tolerate ambiguity,
inconsistencies or illogical conclusions. To be sure,
there are fuzzy logic programs, dynamic simulations,
genetic algorithms and neural nets with
subtler powers, but a vast amount of software
working in today’s large organizations is not of
these more advanced types. In fact, enterprise software
systems are more likely to succeed at relatively
straightforward tasks such as procurement and
order processing. As the problems get more complex,
so does the software that solves them. It is
estimated that for every 25% increase in complexity
in the tasks to be automated, the complexity of
the software solution itself rises by 100%.5
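To get a feel for what that rule of thumb implies, here is a small illustrative calculation. It assumes the 25-percent-to-100-percent relationship compounds with each increment, which is an interpretation of the figure rather than something stated as a formula in the source.

# Illustrative only: compound the "25% more task complexity ->
# 100% more software complexity" rule of thumb over several increments.
task_complexity = 1.0
software_complexity = 1.0
for step in range(1, 5):
    task_complexity *= 1.25      # the task gets 25% harder
    software_complexity *= 2.0   # the software roughly doubles
    print(f"step {step}: task x{task_complexity:.2f}, "
          f"software x{software_complexity:.0f}")

After four such increments the task is roughly 2.4 times harder while the software is 16 times more complex. Whatever the exact multiplier, the direction is the same: software complexity outruns the business complexity it automates.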
Business users and management inevitably want
changes in their automated processes as their needs
and markets evolve. And they expect to be able to
customize their software to fit their own needs.
“Software is infinitely malleable,” says computer
historian Martin Campbell-Kelly.6 This is in theory
true; however, as enterprise software becomes increasingly
comprehensive and complex, the costs
and risks involved in changing it increase as well.
No single person within an organization could
possibly know how a change in one part of the software
will affect its functioning elsewhere.
Software’s supposed flexibility and unending
ability to manage complexity contributed to the
discrepancies between the great expectations and
mediocre reality that plagued the first round of
implementations of enterprise resource planning
systems. In the middle to late 1990s, U.S. corporations
rushed to purchase and install such systems.
These systems — Germany-based SAP Aktiengesellschaft’s
is the most common — promised to
eliminate the complexity of multiple operating systems
and applications by replacing them with a
single set of interconnected modules to run the financial,
manufacturing, human resources and
other major functions of a typical multinational
corporation. Theoretically, a single monolithic system
would seamlessly connect various distinct and
geographically separate locations through private
networks. Companies understood that they could customize these systems as needed to suit their unique business processes.

From The Wall Street Journal Business Technology Blog
http://blogs.wsj.com/biztech/
“Technology is supposed to simplify business. This has been true from the Industrial Revolution to the Internet age. But did the large software applications that were supposed to streamline large companies instead irrevocably slow them down?
There’s a compelling argument to be made that they have. The average company spends $15 million on Enterprise Resource Planning software, the monolithic systems of record from vendors like SAP and Oracle, and many large companies have spent tens and even hundreds of times that, according to [Ms. Rettig’s article].
Some of this resonates. Certainly, companies that have tried to customize these systems to reflect their own customized processes have spent a lot of time and money to do so. And ERP systems do introduce a certain amount of rigidity. On the flip side, having a system of record is a benefit in and of itself that shouldn’t be discounted.” — Ben Worthen
That was the hope. But these massive programs,
with millions of lines of code, thousands of installation
options and countless interrelated pieces,
introduced new levels of complexity, often without
eliminating the older systems (known as
“legacy” systems) they were designed to replace. In
addition, concurrent technological and business
changes made closed ERP systems organized
around products less than a perfect solution: Just
as companies were undertaking multiyear ERP
implementations, the Internet was evolving into a
major new force, changing the way companies
transacted business with their customers, suppliers
and partners. At the same time, businesses were
realizing that organizing their information around
customers and services — and using newly available
customer relationship management systems
— was critical to their success.
The concept of a single monolithic system failed
for many companies. Different divisions or facilities
often made independent purchases, and other
systems were inherited through mergers and acquisitions.
Thus, many companies ended up having
several instances of the same ERP systems or a variety
of different ERP systems altogether, further
complicating their IT landscape. In the end, ERP
systems became just another subset of the legacy
systems they were supposed to replace.
The Costs of Implementation
ERP systems were expensive, too, costing companies
more than they had ever paid for software
when costs had been based on per-workstation
usage. But that price tag was dwarfed by the installation
charges, because companies had to
hire brigades of outside consultants, often for a
number of years, to actually get the software up
and running. While the average installation cost
$15 million, large organizations ended up spending
hundreds of millions of dollars. Even such
large expenditures did not guarantee success,
however. In fact, 75% of ERP implementations
were considered failures.7
Try as they might to measure the productivity
gains of ERP implementations or IT in general, researchers
have yet to arrive at any coherent or
consistent conclusions. One problem is that there
is little statistical evidence, especially about
whether the benefits of ERP implementations outweigh
the costs and risks. Researchers have even
suggested that ERP implementations are so difficult
that those companies that actually complete
them with relative success gain a competitive advantage
in the marketplace.8 It seems that ERPs,
which had looked like the true path to revolutionary
business process reengineering, introduced so
many complex, difficult technical and business issues
that just making it to the finish line with one’s
shirt on was considered a win.
All that complexity and all those options created
another conundrum. As Nicholas Carr famously
pointed out in his book, Does IT Matter? Information
Technology and the Corrosion of Competitive
Advantage,9 simply implementing the plain-vanilla
business processes that your competitors have does
not provide any competitive advantage. On the
other hand, as many companies learned the hard
way, customizing the already complex ERP software
created yet more complexity and even larger risks.
From Rough Type: Nicholas Carr’s Blog
http://www.roughtype.com/
“Over the last two decades, companies have plowed many billions of
dollars into enterprise resource planning (ERP) systems and the hardware
required to run them. But what, in the long run, will be the legacy
of ERP? Will it be viewed as it has been promoted by its marketers: as a
milestone in business automation that allowed companies to integrate
their previously fragmented information systems and simplify their data
flows? Or will it be viewed as a stopgap that largely backfired by tangling
companies in even more systems complexity and even higher IT costs?
In ‘The Trouble with Enterprise Software,’ Cynthia Rettig deftly lays out
the case for the latter view. Enterprise systems, argues Rettig, not only
failed to deliver on their grand promise, but often simply aggravated the
problems they were supposed to solve. Different divisions or facilities
often made independent purchases, and other systems were inherited
through mergers and acquisitions. In the end, ERP systems became just
another subset of the legacy systems they were supposed to replace.
So what’s the solution? Rettig doesn’t offer one, beyond suggesting
that top executives do more to educate themselves about the
problem and to work more closely with their CIOs. That may be good
advice, but it hardly addresses the underlying technical challenge. But
Rettig nevertheless has provided a valuable service with her article.
While some will argue that her indictment is at times overstated, she
makes a compelling case that the traditional approach to corporate
computing has become a dead end. We need to set a new course.”
— Nicholas Carr
Without intimate knowledge of how the integrated
pieces of these modular software packages actually
worked, customizing could lead to in-house bugs
and glitches that were hard to foresee and expensive
to fix. Perhaps even worse, customization made
changing the software later — or upgrading to a
newer version — far more difficult, and in some
cases prohibitively expensive. Christopher Koch,
executive editor of CIO, tells the story of one head
of a corporate SAP installation group who bragged
that he had his installation time down to a mere
three months for various facilities around the world:
“It didn’t matter that he was honing his skills on a
10-year-old version of the software because the
costs of upgrading are so huge — tens, even hundreds
of millions of dollars, or as much as it cost to
install the stuff in the first place — that he keeps installing
old versions of the software so that it will
line up with the old software they already have.”10
Unexpected bugs present another type of difficulty
that increases with complexity. Robert
Pool, technology journalist and author of Beyond
Engineering, explains it this way: “It’s possible to
go through a program line by line and make sure
that each individual instruction makes sense but
it is not possible to guarantee that the program as
a whole has no flaws.”11 The average professional
coder makes 100 to 150 errors for every 1,000 lines
of code, according to a Carnegie Mellon study
conducted by Watts Humphrey.12 At that rate, a million lines of code would contain on the order of 100,000 to 150,000 mistakes. Software developers do extensive testing
on the paths users seem likely to take and correct
many of these errors. Nevertheless, they cannot
test or even anticipate every possible usage path,
so released software inevitably contains unknown
defects. “Civilization depends on software. So although
much software code is poorly written, you
can’t just stop the world to fix it,” says Bjarne
Stroustrup, the Danish-born computer scientist
who designed the popular C++ programming
language. On the other hand, Stroustrup does
concede that “muddling along is expensive, dangerous,
and depressing.”13
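The arithmetic behind these defect figures is easy to sketch. The 95% removal rate below is an illustrative assumption about how much testing and review typically catch, not a number drawn from Humphrey's study.

# Rough defect arithmetic: injected defects per thousand lines (KLOC)
# times code size, minus whatever testing and review remove.
defects_per_kloc = (100, 150)   # Humphrey's range for professional coders
lines_of_code = 1_000_000
removal_rate = 0.95             # assumed effectiveness of testing and review

for rate in defects_per_kloc:
    injected = rate * lines_of_code // 1_000
    remaining = int(injected * (1 - removal_rate))
    print(f"{rate}/KLOC: ~{injected:,} injected, ~{remaining:,} shipped")
# 100/KLOC: ~100,000 injected, ~5,000 shipped
# 150/KLOC: ~150,000 injected, ~7,500 shipped

Even under that optimistic assumption, a million-line system ships with thousands of latent defects.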
The Vagaries of Data
The data that software processes and generates is
another constant and growing problem. Estimates
of errors are astoundingly high. Single systems can
have error rates of 50% or more from myriad
sources — everything from mistyped data to stale
information to data placed in the wrong fields
within the database structure. But the really nasty,
intractable data problems erupt when companies
integrate multiple data sources, as was necessary
for ERP implementations, so that they could have
all their product, inventory and production records
stored in one place. Because of differing
formats, conventions, abbreviations and so on,
such integrations can result in 100 or more records
that actually point to a single product or
customer. In the case of enterprise system implementations,
data problems alone precipitate many
of the failures perceived by business users and
much of the added expense as well. Overwhelmed
by the sheer difficulty and complexity of the new
software itself, companies literally “forgot about
the data,” as executive John Nicoli of Trillium Software
in Billerica, Massachusetts, describes it, until
the tail end of the project, thereby necessitating
enormous reworking to properly clean up and integrate
the data.14 And with corporate data stores doubling every three years, such data issues are only compounding.

From Andrew McAfee’s Blog
http://blog.hbs.edu/faculty/amcafee/
“It is certainly true that enterprise systems have failed in many companies, and it’s also true that, as [Ms. Rettig] points out, many others have not been able to shut off legacy systems to the extent they expected after ERP went live. But it is simply not the case that researchers have been unable to draw any coherent conclusions about these technologies.
Rettig’s argument falls into a long line of pessimistic writing about the value of corporate IT. Much of this writing takes the implicit, and at times explicit, view that the executives who make technology decisions are dupes, perennially falling for a ‘triumphant vision’ of software. The only way I can see for the IT pessimists to be right is if the delusion about IT’s benefits is both persistent and virtually universal. And I don’t buy that.
‘ERP doesn’t help’ is a testable hypothesis, and some colleagues of mine have tested it. NYU’s Sinan Aral, Georgia Tech’s D.J. Wu, and my friend and coauthor Erik Brynjolfsson at MIT recently published a wonderful paper, “Which Came First, IT or Productivity? Virtuous Cycle of Investment and Use in Enterprise Systems” (http://papers.ssrn.com/sol3/papers.cfm?abstract_id=942291). [This paper] contains a vital insight: If IT were not delivering value, rational decision makers would not keep investing in it.
I agree that it’s important not to naively accept anyone’s triumphant vision of corporate IT. But it’s also important not to make claims in the other direction that are too sweeping. Perhaps most fundamentally, it’s critical at some point to stop floating hypotheses about IT’s impact (or lack thereof), and to start testing them.” — Andrew McAfee
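Returning to the integration problem: a small sketch helps show why differing formats, conventions and abbreviations multiply records. The customer names and the normalization rules below are invented for illustration, not drawn from any particular system.

# Hypothetical: three source systems hold the "same" customer differently.
# Naive integration keeps three records; a normalization pass still fails
# to collapse them all.
records = [
    {"source": "ERP", "name": "Acme Manufacturing, Inc."},
    {"source": "CRM", "name": "ACME MFG INC"},
    {"source": "Web", "name": "Acme Mfg."},
]

ABBREVIATIONS = {"mfg": "manufacturing", "inc": "incorporated"}

def normalize(name: str) -> str:
    """Lowercase, strip commas and periods, expand a few known abbreviations."""
    words = name.lower().replace(",", " ").replace(".", " ").split()
    return " ".join(ABBREVIATIONS.get(w, w) for w in words)

keys = {normalize(r["name"]) for r in records}
print(sorted(keys))
# ['acme manufacturing', 'acme manufacturing incorporated']
# Even after normalization, one customer still looks like two.

Multiply that across dozens of source systems and millions of rows, and the figure of 100 records pointing to a single customer stops sounding far-fetched.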
Is enterprise software just too complex to deliver
on its promises? After all, enterprise systems
were supposed to streamline and simplify business
processes. Instead, they have brought high
risks, uncertainty and a deeply worrying level of
complexity. Rather than agility they have produced
rigidity and unexpected barriers to change,
a veritable glut of information containing myriad
hidden errors, and a cloud of questions regarding
their overall benefits. Leaders in computer science
are clearly worried. “Complexity is death,” says
Chuck Thacker, one of 16 technical fellows at Microsoft
Corp. “We are hanging on with our
fingertips right now.”15
Business executives, however, simply want to
continue to believe that technology will lower
costs, improve processes and reduce the size of the
workforce. They don’t want to understand IT issues.
In part, this is because technology requires
special skills and intellectual talents that are quite
distinct from those needed to understand and
manage business organizations, markets and
strategy. But it is also because executives do not
like to hear about the downside of technology.
Observes Jim Shepherd, senior vice-president of
Boston-based AMR Research Inc., “Senior managers
often don’t particularly want to be told that
there’s a high risk and that there’s a great deal of
expenditure involved in minimizing it.”16 Yet the
only sure thing about new technologies and the
changes they introduce is their uncertainty. In
summarizing decades of research into technological
change, MIT Sloan School of Management’s
Wanda Orlikowski and the National Science
Foundation’s Suzanne Iacono conclude that
changes involving technology are both “profoundly
complex and uncertain.”17
For their part, CIOs and their managers rate aligning
IT with business strategy as their number one
priority. They struggle year after year to prove the
value of IT to the business side of the organization.
Yet the cost overruns, delays and outright failures of
enterprise systems have if anything widened the digital
divide between IT and the executive suite.
The Next New Thing
The proposed fix for these problems — the next
new thing — is service-oriented architecture. Basically,
SOA proposes to overcome the problems
involved with updating and changing legacy systems
by building modular cross-system business
processes. These processes would connect the relevant
pieces of functionality from various IT
systems, thereby making it easier to change processes
to adapt to new business goals. But technical
realists point out that many difficult technical
problems must be solved before SOA can become
the backbone for a new strategic architecture, including
robust protocols for accessing the
applications, high quality integrated data stores
and a sound methodology for managing the overall
process. Researchers Ross, Weill and Robertson
admit that most companies are in the early stages
of a four-part transformation to SOA that may
take many years — even decades — to realize.18
The estimates of how long this will take reflect a
growing acknowledgment of just how deep and
radical are the organizational changes these technological innovations mandate. It is a process of adoption and adaptation that by definition cannot occur overnight. Nor, conclude the researchers, can companies skip a step. Given that only 6% of companies have made it into the later stages, this model would suggest that companies are in for a long haul if they are to escape the tangle of technological complexity inherent in large organizations today, and it will be a journey fraught with cultural as well as technical problems.

From the ZDNet.com Blog
http://blogs.zdnet.com/service-oriented/?p=938
“There’s really nothing new in [Ms. Rettig’s] analysis. But Rettig goes a step further and says there’s no hope for the future. In fact, while she doesn’t offer any remedies for her gloomy prognosis, she does quash one — service-oriented architecture (SOA).
Rettig doesn’t offer any encouraging words about SOA as an ERP workaround. SOA may take years to come to full fruition, not in enough time to help beleaguered companies, she says. And SOA may simply be too slow to keep up with the dynamic business environments of today. Not to mention technical challenges. Rettig says that SOA increases complexity, as it becomes “additional layers of code superimposed on the existing layers,” and she doesn’t buy the Lego-block concept that underpins much of the thinking about SOA.
Let’s put it this way: aside from SOA, what is the alternative? No one is willing, or can afford, to stay with the rigid, stovepiped systems in their current form. One solution is to just throw the entire mess out and buy a huge, well-integrated, modular application. But no one has the time or budgets. The only workable approach, then, is gradual integration between systems, and gradual, greater agility — if not through SOA, then how? SOA, pure and simple, is the first step to software industrialization — creating massive, adaptable systems in an automated and modular fashion through greater economies of scale. ERP was a step in this direction, since it modularized, and brought many vital pieces of the business together into a single standardized system. SOA takes it to the next level, beyond the domain offered by a single vendor. That’s the core value proposition of SOA.” — Joe McKendrick
The timeline itself for this kind of transformation
may just be too long to be realistically
sustainable and successful. The dynamic business
environments of today, where whole industries and
markets can undergo radical changes in a matter of
a few years and the horizon for corporate strategies
has shrunk from 10 years to three to five, make it
questionable whether companies can actually
maintain a focused strategy long enough to align
their core business processes with IT.
Technical problems raise additional questions
about the feasibility of such an undertaking. The
hallmark of service-oriented architecture — one
might reasonably argue its entire raison d’être — is
the fundamental modularity of its software business
processes. A self-contained business process adopts
parts of the functionality from multiple enterprise
applications to automatically complete a set of tasks.
For example, a single business process might begin
with an order from a customer on the Internet in a
web services system and send it to manufacturing in
an ERP system. The same business process would set
up delivery in a logistics system and then send all the
relevant information to billing in an accounting system
as well as a customer relationship management
system. Companies would build (or purchase) business
modules for their core processes. They would
then be able to easily change these processes, snapping
out and in functional pieces of code from
enterprise systems in Lego-like fashion.
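A minimal sketch of that modular ideal, with ordinary Python functions standing in for the web storefront, ERP, logistics, accounting and CRM systems; the service names and data are invented, and real SOA implementations sit behind web-service protocols and middleware rather than direct function calls.

# Hypothetical order-to-cash process composed from separate "services".
# Each function stands in for a call into a different back-end system.

def take_web_order(customer, item, qty):       # web storefront
    return {"customer": customer, "item": item, "qty": qty}

def schedule_production(order):                # ERP / manufacturing module
    order["work_order"] = f"WO-{order['item']}-{order['qty']}"
    return order

def arrange_delivery(order):                   # logistics system
    order["shipment"] = "ground-3day"
    return order

def bill_and_record(order):                    # accounting and CRM systems
    order["invoice_total"] = 199.00 * order["qty"]
    return order

# The "business process" is just the composition of steps; swapping a step
# means swapping one function, which is the Lego-like promise of SOA.
steps = [schedule_production, arrange_delivery, bill_and_record]

order = take_web_order("C-001", "widget", 3)
for step in steps:
    order = step(order)
print(order)

The catch, as the discussion below argues, is that each of those tidy-looking functions is in reality a thin wrapper over millions of lines of existing enterprise code.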
The Lego dream has been a persistent favorite
among a generation or more of programmers who
grew up with those construction toys. Unfortunately,
however, software does not work as Legos do.
For one thing, a unit of software code is not similar
to other software code in terms of scale or functionality,
as Legos are.19 On the contrary, code is widely
various and heterogeneous. It contains different
numbers and types of connections to other code,
more like fractals, as Victoria University of Wellington
researchers James Noble and Robert Biddle
describe it, than Legos, with their uniform connections.
Software engineering expert Robert Glass sees
another problem with the Legos idea: The notion of
reusable software works on a small scale. Programmers
have successfully built and reused subroutines
of standard functions. But as software grows more
complex, reusability becomes a difficult or impossible
task. “It is simply a problem too hard to be
solved, a problem rooted in software’s diversity.”20
“Complexity is a deadly software killer,” says Yale
computer scientist David Gelernter, and he argues that
managing complexity is more of an art than a science,
and a difficult one at that, especially given the monumental
real-world systems today’s software attempts
to automate.21 And to the extent that these service-oriented
architectures use subsets of code from within
ERP and other enterprise systems, they do not escape
the mire of complexity built over the past 15 years or
so. Rather, they carry it along with them, incorporating
code from existing applications into a fancy new
remix. SOAs become additional layers of code superimposed
on the existing layers. That means it is possible
that a process will fail at some point due to some fault
in the layers below, and in order to understand and fix
that problem, software engineers will need to deal with
the layers of enterprise applications below the modular
business processes.
From the Deal Architect Blog
http://www.dealarchitect.typepad.com/
“The good news is [Ms. Rettig’s] article will get executive attention. Not
that they do not know. I recently met an executive at a client about to
start an ERP implementation. He sounded like a man headed to the gallows.
Nervous, not excited about the project. (That afternoon, I felt
really embarrassed for our industry that after 100K+ ERP projects, we still
cannot make it a no-brainer.)
But it is way past talking about messes. Companies are in various stages
of ERP hangover management, not always looking at software as a service
(SaaS), as those vendors would have you believe — it’s not that easy to rip
and replace a backbone ERP solution — but software as a customized service
(SaCS). [Those companies are in] aggressive re-negotiation of ERP
maintenance contracts or moving to third party maintenance.
The only ones who do not seem to realize the party is over are the
vendors, who are using service-oriented architecture (SOA), compliance
and more low payback justifiers to extend the run.”
— Vinnie Mirchandani
Culturally, this long-term plan calls for closer and
closer communication and collaboration between
the IT and business sides of the organization. While highly desirable, this has proved difficult in the
past, and with increasing complexity in software systems,
it is unlikely to improve by itself in the future.
Differing backgrounds and perspectives, goals, even
vocabularies — all hamper efforts to improve communication
across this internal digital divide. Biases
intrude: A recent study by Forrester Research Inc. of
Cambridge, Massachusetts, found that only 28% of
CEOs thought their CIOs were proactive or creative
in terms of business process improvement.22 Forrester’s
advice to CIOs is to get more deeply involved
in the business issues and educate executives on what
IT is and what it actually does.
Sound advice, no doubt, but it may be time for
business executives themselves to become more
proactive. Executives could educate themselves
more about technology. They could send promising
younger executives to executive programs
designed to teach business people how to better
understand, communicate with and capitalize on
their IT. And business schools, too, could do better
at teaching the interdependence of business and IT.
At present, however, corporations see in software’s
seductive invisibility and seemingly open-ended
flexibility a never-ending frontier of promise,
where hope triumphs over reality and the search
for the next new thing trumps addressing difficult
existing problems. And hope, unfortunately, has
never been a very effective strategy.
Cynthia Rettig was director of knowledge management
for B2B consulting company Canopy International of
Newton, Massachusetts. She has consulted to software
companies for over 20 years. She can be reached at
[email protected].
REFERENCES
1. T. Friedman, “The World Is Flat: A Brief History of the
Twenty-First Century” (New York: Farrar, Straus and Giroux,
2005), 128.
2. J. Dedrick, V. Gurbaxani, and K.L. Kraemer, “Information
Technology and Economic Performance: A Critical
Review of the Empirical Evidence,” ACM Computing
Surveys 35, no. 1 (March 2003): 18.
3. J.W. Ross, P. Weill, and D.C. Robertson, “Enterprise
Architecture As Strategy: Creating a Foundation for
Business Execution” (Boston: Harvard Business School
Press, 2006), 11.
4. J.W. Cortada, “Progenitors of the Information Age:
The Development of Chips and Computers,” in “A Nation
Transformed By Information,” eds. A.D. Chandler and
J.W. Cortada (New York: Oxford University Press, 2000),
206-208.
5. R.L. Glass, “Facts and Fallacies of Software Engineering”
(Boston: Pearson Education, 2003), 58.
6. M. Campbell-Kelly, “From Airline Reservations to
Sonic the Hedgehog: A History of the Software Industry”
(Cambridge: MIT Press, 2004), 198.
7. K-K Hong, Y-G Kim, “The Critical Success Factors for
ERP Implementation: An Organizational Fit Perspective,”
Information & Management 40, no. 1 (October
2002): 25.
8. L.M. Hitt, D.J. Wu and X. Zhou, “Investment in Enterprise
Resource Planning: Business Impact and
Productivity Measures,” Journal of Management Information
Systems 19, no. 1 (summer 2002): 71-98.
9. N. Carr, “Does IT Matter? Information Technology and
the Corrosion of Competitive Advantage,” (Boston: Harvard
Business School Press, 2004).
10. C. Koch, “The Monopoly That Matters More Than
Microsoft,” blog in CIO Magazine, Nov. 13, 2006, http://
advice.cio.com/the-monopoly-that-matters-more-than-microsoft.
11. R. Pool, “Beyond Engineering: How Society Shapes
Technology” (New York: Oxford University Press, 1997),
137.
12. C. Mann, “Why Software Is So Bad,” Technology
Review (July/August 2002).
13. J. Pontin, “Bjarne Stroustrup: The Problem With Programming,”
Technology Review (January-February
2007): 22.
14. J. Nicoli, interview with author at Trillium Software
(Billerica, Massachusetts), August 8, 2005.
15. S. Rosenberg, “Anything You Can Do, I Can Do
Meta,” Technology Review (January-February 2007): 45.
16. M. Wheatley, “ERP Training Stinks,” CIO Magazine
(June 1, 2000).
17. W. Orlikowski and C.S. Iacono, “The Truth Is Not Out
There: An Enacted View of the ‘Digital Economy,’” in
“Understanding the Digital Economy: Data, Tools and
Research,” eds. E. Brynjolfsson and B. Kahin, (Cambridge:
MIT Press, 2000), 355.
18. Ross, “Enterprise Architecture As Strategy.”
19. S. Rosenberg, “Dreaming in Code: Two Dozen Programmers,
Three Years, 4,732 Bugs, and One Quest for
Transcendent Software” (New York: Crown Publishers,
2007), 94-95.
20. Glass, “Facts and Fallacies of Software Engineering.”
21. D. Gelernter, “Mirror Worlds: Or the Day Software
Puts the Universe in a Shoebox … How It Will Happen
and What It Will Mean” (New York: Oxford University
Press, 1992), 51.
22. S. Shay, “CEOs Rate IT: Steady But Uncreative,”
CIO Magazine (April 1, 2007): 20.
Reprint 49101.
Copyright © Massachusetts Institute of Technology, 2007.
All rights reserved.
Technology has alwa
y s b e en a b out
hope. Since the beginning
of the industrial
revolution, businesses
have embraced new
technologies enthusiastically,
and their
optimism has been
rewarded with improved
proce s s e s ,
lower costs and reduced
workforces. As the pace of technological
innovation has intensified over the past two decades,
businesses have come to expect that the next
new thing will inevitably bring them larger market
opportunities and bigger profits. Software, a technology
so invisible and obscure to most of us that it
appears to work like magic, especially lends itself to
this kind of open-ended hope.
Software promises evolutions, revolutions and
even transformations in how companies do business.
The triumphant vision many buy into is that
enterprise software in large organizations is fully
integrated and intelligently controls infinitely complex
business processes while remaining flexible
enough to adapt to changing business needs. This
vision of software lies at the core of what Thomas
Friedman in The World Is Flat calls “the Wal-Mart
Symphony in multiple movements — with no finale.
It just plays over and over 24/7/365.”1 Whole
systems march in lock step, providing synchronized,
fully coordinated supply chains, production
lines and services, just like a world class orchestra.
From online web orders through fulfillment, delivery,
billing and customer service — the entire
enterprise, organized end to end — that has been
the promise. The age of smart machines would
seem to be upon us.
Or is it? While a few companies like Wal-Mart
Stores Inc. have achieved something close to that
ideal, the way most large organizations actually
process information belies that glorious vision and
reveals a looking-glass world, where everything is
in fact the opposite of what one might expect. Back
office systems — including both software applications
and the data they process — are a variegated
patchwork of systems, containing 50 or more databases
and hundreds of separate software programs
installed over decades and interconnected by idiosyncratic,
Byzantine and poorly documented
customized processes. To manage this growing
complexity, IT departments have grown substantially:
As a percentage of total investment, IT rose
from 2.6% to 3.5% between 1970 and 1980.2 By
1990 IT consumed 9%, and by 1999 a whopping
22% of total investment went to IT. Growth in IT
spending has fallen off, but it is nonetheless surprising
to hear that today’s IT departments spend
70% to 80% of their budgets just trying to keep existing
systems running.
According to a multiyear study of over 400 companies
by MIT researchers Jeanne Ross, Peter Weill
and David Robertson,3 IT departments tend not to
be innovative leaders within organizations, but
rather conservative forces, viewed by business executives
as cost sinks and liabilities. In many
companies, it takes the IT department one to two
years to implement a new strategic initiative --
hardly the agility companies are striving for. The
research shows the typical IT structure is so dense
and extensive that it’s often a miracle that it works
at all. The researchers observe: “Legacy systems
cobbled together to respond to each new business
initiative create rigidity and excessive costs. Every
change becomes a risky, expensive venture.”
The Proliferation of Complexity
How did this happen? James Cordata, who has
written extensively about the information economy,
points out that as work became more complex
and specialized over the 20th century, the use of
data — numbers and facts — as fodder for more
and more analysis and fact-based decision making
intensified. And digital technology “was perfect for
this kind of world.”4 Of course, digital technology
The Trouble With Enterprise Software
Has enterprise
software become
too complex to
be effective?
Cynth ia Rett ig
Illustration: TK/theispot.com FALL 2007 MIT SLOAN MANAGEMENT REVIEW 21
This article, originally
published on the
MIT Sloan Management
Review website,
stirred up a good deal
of discussion in the
blogosphere, a sampling
of which is included in
the following pages.
22 MIT SLOAN MANAGEMENT REVIEW FALL 2007 sloanreview .mit .edu
not only supported that complexity but also played
a large part in actually creating it, weaving a continuous
web of unending data. “More computers
are better than fewer” remains a key belief of American
business, Cordata says. “There are no limits to
how much is good.” Management became accustomed
to the idea that buying more computers and
more software would continue to cut costs and improve
operations.
But there are limits, some of which are inherent
in the nature of software itself. Software is code,
lines and lines of code that runs sequentially. Building
software programs entails accumulating more
and more code. Much of the seemingly boundless
complexity of enterprise software is founded on
conditional branching (if-then statements) and a
hierarchy of interacting objects, all of which manipulate
information in a logical succession of
small steps. Each step contains explicit instructions.
To build software, programmers routinely
break down processes into discrete steps, effectively
systematizing and standardizing how work is done.
An entire sequence of such instructions works
more like a calculator than a “thinking machine.”
Thus the so-called intelligence of digital technology
arises not through magic, nor, in more
contemporary terms, through some emergent or
self-organizing principle, as some would believe.
The result is not greater than the sum of the parts.
Rather, it’s more akin to Adam Smith’s division of
labor and Frederick Taylor’s scientific management,
a process dependent on relentless analysis
and rationalization of the work to be done.
General software programming used in enterprise
systems may contain intricate branching and
handle a huge number of conditions, all of which
allows it to control a certain amount of complexity.
It does not, however, tolerate ambiguity,
inconsistencies or illogical conclusions. To be sure,
there are fuzzy logic programs, dynamic simulations,
genetic algorithms and neural nets with
subtler powers, but a vast amount of software
working in today’s large organizations is not of
these more advanced types. In fact, enterprise software
systems are more likely to succeed at relatively
straightforward tasks such as procurement and
order processing. As the problems get more complex,
so does the software that solves them. It is
estimated that for every 25% increase in complexity
in the tasks to be automated, the complexity of
the software solution itself rises by 100%.5
Business users and management inevitably want
changes in their automated processes as their needs
and markets evolve. And they expect to be able to
customize their software to fit their own needs.
“Software is infinitely malleable,” says computer
historian Martin Campbell-Kelly.6 This is in theory
true; however, as enterprise software becomes increasingly
comprehensive and complex, the costs
and risks involved in changing it increase as well.
No single person within an organization could
possibly know how a change in one part of the software
will affect its functioning elsewhere.
Software’s supposed flexibility and unending
ability to manage complexity contributed to the
discrepancies between the great expectations and
mediocre reality that plagued the first round of
implementations of enterprise resource planning
systems. In the middle to late 1990s, U.S. corporations
rushed to purchase and install such systems.
These systems — Germany-based SAP Aktiengesellschaft’s
is the most common — promised to
eliminate the complexity of multiple operating systems
and applications by replacing them with a
single set of interconnected modules to run the financial,
manufacturing, human resources and
other major functions of a typical multinational
corporation. Theoretically, a single monolithic system
would seamlessly connect various distinct and
geographically separate locations through private
CoNT R A R I A
From The Wall Street Journal
Business Technology Blog
http://blogs.wsj.com/biztech/
“Technology is supposed to simplify business. This has been true from
the Industrial Revolution to the Internet age. But did the large software
applications that were supposed to streamline large companies instead
irrevocably slow them down?
There’s a compelling argument to be made that they have. The average
company spends $15 million on Enterprise Resource Planning
software, the monolithic systems of record from vendors like SAP and
Oracle, and many large companies have spent tens and even hundreds
of times that, according to [Ms. Rettig’s article].
Some of this resonates. Certainly, companies that have tried to customize
these systems to reflect their own customized processes have
spent a lot of time and money to do so. And ERP systems do introduce a
certain amount of rigidity. On the flip side, having a system of record is a
benefit in and of itself that shouldn’t be discounted.” — Ben Worthen
sloanreview .mit .edu FALL 2007 MIT SLOAN MANAGEMENT REVIEW 23
networks. Companies understood that they could
customize these systems as needed to suit their
unique business processes.
That was the hope. But these massive programs,
with millions of lines of code, thousands of installation
options and countless interrelated pieces,
introduced new levels of complexity, often without
eliminating the older systems (known as
“legacy” systems) they were designed to replace. In
addition, concurrent technological and business
changes made closed ERP systems organized
around products less than a perfect solution: Just
as companies were undertaking multiyear ERP
implementations, the Internet was evolving into a
major new force, changing the way companies
transacted business with their customers, suppliers
and partners. At the same time, businesses were
realizing that organizing their information around
customers and services — and using newly available
customer relationship management systems
— was critical to their success.
The concept of a single monolithic system failed
for many companies. Different divisions or facilities
often made independent purchases, and other
systems were inherited through mergers and acquisitions.
Thus, many companies ended up having
several instances of the same ERP systems or a variety
of different ERP systems altogether, further
complicating their IT landscape. In the end, ERP
systems became just another subset of the legacy
systems they were supposed to replace.
The Costs of Implementation
ERP systems were expensive, too, costing companies
more than they had ever paid for software
when costs had been based on per-workstation
usage. But that price tag was dwarfed by the installation
charges, because companies had to
hire brigades of outside consultants, often for a
number of years, to actually get the software up
and running. While the average installation cost
$15 million, large organizations ended up spending
hundreds of millions of dollars. Even such
large expenditures did not guarantee success,
however. In fact, 75% of ERP implementations
were considered failures.7
Try as they might to measure the productivity
gains of ERP implementations or IT in general, researchers
have yet to arrive at any coherent or
consistent conclusions. One problem is that there
is little statistical evidence, especially about
whether the benefits of ERP implementations outweigh
the costs and risks. Researchers have even
suggested that ERP implementations are so difficult
that those companies that actually complete
them with relative success gain a competitive advantage
in the marketplace.8 It seems that ERPs,
which had looked like the true path to revolutionary
business process reengineering, introduced so
many complex, difficult technical and business issues
that just making it to the finish line with one’s
shirt on was considered a win.
All that complexity and all those options created
another conundrum. As Nicholas Carr famously
pointed out in his book, Does IT Matter? Information
Technology and the Corrosion of Competitive
Advantage,9 simply implementing the plain-vanilla
business processes that your competitors have does
not provide any competitive advantage. On the
other hand, as many companies learned the hard
way, customizing the already complex ERP software
created yet more complexity and even larger risks.
From Rough Type: Nicholas Carr’s Blog
http://www.roughtype.com/
“Over the last two decades, companies have plowed many billions of
dollars into enterprise resource planning (ERP) systems and the hardware
required to run them. But what, in the long run, will be the legacy
of ERP? Will it be viewed as it has been promoted by its marketers: as a
milestone in business automation that allowed companies to integrate
their previously fragmented information systems and simplify their data
flows? Or will it be viewed as a stopgap that largely backfired by tangling
companies in even more systems complexity and even higher IT costs?
In ‘The Trouble with Enterprise Software,’ Cynthia Rettig deftly lays out
the case for the latter view. Enterprise systems, argues Rettig, not only
failed to deliver on their grand promise, but often simply aggravated the
problems they were supposed to solve. Different divisions or facilities
often made independent purchases, and other systems were inherited
through mergers and acquisitions. In the end, ERP systems became just
another subset of the legacy systems they were supposed to replace.
So what’s the solution? Rettig doesn’t offer one, beyond suggesting
that top executives do more to educate themselves about the
problem and to work more closely with their CIOs. That may be good
advice, but it hardly addresses the underlying technical challenge. But
Rettig nevertheless has provided a valuable service with her article.
While some will argue that her indictment is at times overstated, she
makes a compelling case that the traditional approach to corporate
computing has become a dead end. We need to set a new course.”
— Nicholas Carr
24 MIT SLOAN MANAGEMENT REVIEW FALL 2007
CoNT R A R I A
sloanreview .mit .edu
Without intimate knowledge of how the integrated
pieces of these modular software packages actually
worked, customizing could lead to in-house bugs
and glitches that were hard to foresee and expensive
to fix. Perhaps even worse, customization made
changing the software later — or upgrading to a
newer version — far more difficult, and in some
cases prohibitively expensive. Christopher Koch,
executive editor of CIO, tells the story of one head
of a corporate SAP installation group who bragged
that he had his installation time down to a mere
three months for various facilities around the world:
“It didn’t matter that he was honing his skills on a
10-year-old version of the software because the
costs of upgrading are so huge — tens, even hundreds
of millions of dollars, or as much as it cost to
install the stuff in the first place — that he keeps installing
old versions of the software so that it will
line up with the old software they already have.”10
Unexpected bugs present another type of difficulty
that increases with complexity. Robert
Pool, technology journalist and author of Beyond
Engineering, explains it this way: “It’s possible to
go through a program line by line and make sure
that each individual instruction makes sense but
it is not possible to guarantee that the program as
a whole has no flaws.”11 The average professional
coder makes 100 to 150 errors for every 1,000 lines
of code, according to a Carnegie Mellon study
conducted by Watts Humphrey.12 That means for
every million lines of code there would be 500,000
mistakes. Software developers do extensive testing
on the paths users seem likely to take and correct
many of these errors. Nevertheless, they cannot
test or even anticipate every possible usage path,
so released software inevitably contains unknown
defects. “Civilization depends on software. So although
much software code is poorly written, you
can’t just stop the world to fix it,” says Bjarne
Stroustrup, the Danish-born computer scientist
who designed the popular C++ programming
language. On the other hand, Stroustrup does
concede that “muddling along is expensive, dangerous,
and depressing.”13
The Vagaries of Data
The data that software processes and generates is
another constant and growing problem. Estimates
of errors are astoundingly high. Single systems can
have error rates of 50% or more from myriad
sources — everything from mistyped data to stale
information to data placed in the wrong fields
within the database structure. But the really nasty,
intractable data problems erupt when companies
integrate multiple data sources, as was necessary
for ERP implementations, so that they could have
all their product, inventory and production records
stored in one place. Because of differing
formats, conventions, abbreviations and so on,
such integrations can result in a 100 or more records
that actually point to a single product or
customer. In the case of enterprise system implementations,
data problems alone precipitate many
of the failures perceived by business users and
much of the added expense as well. Overwhelmed
by the sheer difficulty and complexity of the new
software itself, companies literally “forgot about
the data,” as executive John Nicoli of Trillium Software
in Billerica, Massachusetts, describes it, until
the tail end of the project, thereby necessitating
enormous reworking to properly clean up and integrate
the data.14 And with corporate data stores
From Andrew McAfee’s Blog
http://blog.hbs.edu/faculty/amcafee/
“It is certainly true that enterprise systems have failed in many companies,
and it’s also true that, as [Ms. Rettig] points out, many others have not
been able to shut off legacy systems to the extent they expected after ERP
went live. But it is simply not the case that researchers have been unable
to draw any coherent conclusions about these technologies.
Rettig’s argument falls into a long line of pessimistic writing about the
value of corporate IT. Much of this writing takes the implicit, and at times
explicit, view that the executives who make technology decisions are
dupes, perennially falling for a ‘triumphant vision’ of software. The only
way I can see for the IT pessimists to be right is if the delusion about IT’s
benefits is both persistent and virtually universal. And I don’t buy that.
‘ERP doesn’t help’ is a testable hypothesis, and some colleagues
of mine have tested it. NYU’s Sinan Aral, Georgia Tech’s D.J. Wu, and
my friend and coauthor Erik Brynjolfsson at MIT recently published a
wonderful paper, “Which Came First, IT or Productivity? Virtuous Cycle
of Investment and Use in Enterprise Systems” (http://papers.ssrn.com/
sol3/papers.cfm?abstract_id=942291). [This paper] contains a vital insight:
If IT were not delivering value, rational decision makers would not
keep investing in it.
I agree that it’s important not to naively accept anyone’s triumphant
vision of corporate IT. But it’s also important not to make claims in the
other direction that are too sweeping. Perhaps most fundamentally, it’s
critical at some point to stop floating hypotheses about IT’s impact (or
lack thereof), and to start testing them.” — Andrew McAfee
sloanreview .mit .edu FALL 2007 MIT SLOAN MANAGEMENT REVIEW 25
doubling every three years, such data issues are
only compounding.
Is enterprise software just too complex to deliver
on its promises? After all, enterprise systems
were supposed to streamline and simplify business
processes. Instead, they have brought high
risks, uncertainty and a deeply worrying level of
complexity. Rather than agility they have produced
rigidity and unexpected barriers to change,
a veritable glut of information containing myriad
hidden errors, and a cloud of questions regarding
their overall benefits. Leaders in computer science
are clearly worried. “Complexity is death,” says
Chuck Thacker, one of 16 technical fellows at Microsoft
Corp. “We are hanging on with our
fingertips right now.”15
Business executives, however, simply want to
continue to believe that technology will lower
costs, improve processes and reduce the size of the
workforce. They don’t want to understand IT issues.
In part, this is because technology requires
special skills and intellectual talents that are quite
distinct from those needed to understand and
manage business organizations, markets and
strategy. But it is also because executives do not
like to hear about the downside of technology.
Observes Jim Shepherd, senior vice-president of
Boston-based AMR Research Inc., “Senior managers
often don’t particularly want to be told that
there’s a high risk and that there’s a great deal of
expenditure involved in minimizing it.”16 Yet the
only sure thing about new technologies and the
changes they introduce is their uncertainty. In
summarizing decades of research into technological
change, MIT Sloan School of Management’s
Wanda Orlikowski and the National Science
Foundation’s Suzanne Iacono conclude that
changes involving technology are both “profoundly
complex and uncertain.”17
For their part, CIOs and their managers rate aligning
IT with business strategy as their number one
priority. They struggle year after year to prove the
value of IT to the business side of the organization.
Yet the cost overruns, delays and outright failures of
enterprise systems have if anything widened the digital
divide between IT and the executive suite.
The Next New Thing
The proposed fix for these problems — the next
new thing — is service-oriented architecture. Basically,
SOA proposes to overcome the problems
involved with updating and changing legacy systems
by building modular cross-system business
processes. These processes would connect the relevant
pieces of functionality from various IT
systems, thereby making it easier to change processes
to adapt to new business goals. But technical
realists point out that many difficult technical
problems must be solved before SOA can become
the backbone for a new strategic architecture, including
robust protocols for accessing the
applications, high quality integrated data stores
and a sound methodology for managing the overall
process. Researchers Ross, Weill and Robertson
admit that most companies are in the early stages
of a four-part transformation to SOA that may
take many years — even decades — to realize.18
The estimates of how long this will take reflect a growing acknowledgment of just how deep and radical are the organizational changes these technological innovations mandate. It is a process of adoption and adaptation that by definition cannot occur overnight. Nor, conclude the researchers, can companies skip a step. Given that only 6% of companies have made it into the later stages, this model would suggest that companies are in for a long haul if they are to escape the tangle of technological complexity inherent in large organizations today, and it will be a journey fraught with cultural as well as technical problems.

From the ZDNet.com Blog
http://blogs.zdnet.com/service-oriented/?p=938
This article, originally published on the MIT Sloan Management Review website, stirred up a good deal of discussion in the blogosphere. A sampling of that crosstalk is included here.
“There’s really nothing new in [Ms. Rettig’s] analysis. But Rettig goes a step further and says there’s no hope for the future. In fact, while she doesn’t offer any remedies for her gloomy prognosis, she does quash one — service-oriented architecture (SOA).
Rettig doesn’t offer any encouraging words about SOA as an ERP workaround. SOA may take years to come to full fruition, not in enough time to help beleaguered companies, she says. And SOA may simply be too slow to keep up with the dynamic business environments of today. Not to mention technical challenges. Rettig says that SOA increases complexity, as it becomes “additional layers of code superimposed on the existing layers,” and she doesn’t buy the Lego-block concept that underpins much of the thinking about SOA.
Let’s put it this way: aside from SOA, what is the alternative? No one is willing, or can afford, to stay with the rigid, stovepiped systems in their current form. One solution is just to throw the entire mess out and buy a huge, well-integrated, modular application. But no one has the time or budgets. The only workable approach, then, is gradual integration between systems, and gradual, greater agility — if not through SOA, then how? SOA, pure and simple, is the first step to software industrialization — creating massive, adaptable systems in an automated and modular fashion through greater economies of scale. ERP was a step in this direction, since it modularized, and brought many vital pieces of the business together into a single standardized system. SOA takes it to the next level, beyond the domain offered by a single vendor. That’s the core value proposition of SOA.” — Joe McKendrick
The timeline itself for this kind of transformation
may just be too long to be realistically
sustainable and successful. In today’s dynamic business environment, where whole industries and markets can undergo radical changes in a matter of a few years and the horizon for corporate strategies has shrunk from 10 years to three to five, it is
questionable whether companies can actually
maintain a focused strategy long enough to align
their core business processes with IT.
Technical problems raise additional questions
about the feasibility of such an undertaking. The
hallmark of service-oriented architecture — one
might reasonably argue its entire raison d’etre — is
the fundamental modularity of its software business
processes. A self-contained business process adopts
parts of the functionality from multiple enterprise
applications to automatically complete a set of tasks.
For example, a single business process might begin
with an order from a customer on the Internet in a
web services system and send it to manufacturing in
an ERP system. The same business process would set
up delivery in a logistics system and then send all the
relevant information to billing in an accounting system
as well as a customer relationship management
system. Companies would build (or purchase) business
modules for their core processes. They would
then be able to easily change these processes, snapping
out and in functional pieces of code from
enterprise systems in Lego-like fashion.
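To make the Lego vision concrete, here is a minimal sketch, in Java, of the kind of code it imagines. Every name in it is hypothetical and illustrative rather than drawn from any actual SOA product: each small interface stands for a piece of functionality exposed by a separate enterprise system, and the process class simply strings them together.

interface ManufacturingService { String scheduleProduction(String orderId); } // exposed by the ERP system
interface LogisticsService { String arrangeDelivery(String orderId); }        // exposed by the logistics system
interface BillingService { void invoice(String orderId); }                    // exposed by the accounting system
interface CrmService { void recordInteraction(String orderId, String note); } // exposed by the CRM system

// One modular business process composed from the services above.
class OrderToCashProcess {
    private final ManufacturingService manufacturing;
    private final LogisticsService logistics;
    private final BillingService billing;
    private final CrmService crm;

    OrderToCashProcess(ManufacturingService manufacturing, LogisticsService logistics,
                       BillingService billing, CrmService crm) {
        this.manufacturing = manufacturing;
        this.logistics = logistics;
        this.billing = billing;
        this.crm = crm;
    }

    // Handles a single web order end to end, one enterprise system per step.
    void handleWebOrder(String orderId) {
        String productionRef = manufacturing.scheduleProduction(orderId);
        String deliveryRef = logistics.arrangeDelivery(orderId);
        billing.invoice(orderId);
        crm.recordInteraction(orderId,
                "Production " + productionRef + ", delivery " + deliveryRef);
    }
}

In this picture, switching logistics providers would mean plugging in a different implementation of LogisticsService and leaving the rest of the process untouched. That, in essence, is the snap-in, snap-out promise.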
The Lego dream has been a persistent favorite
among a generation or more of programmers who
grew up with those construction toys. Unfortunately,
however, software does not work as Legos do.
For one thing, a unit of software code is not similar
to other software code in terms of scale or functionality,
as Legos are.19 On the contrary, code is widely varied and heterogeneous. It contains different
numbers and types of connections to other code,
more like fractals, as Victoria University of Wellington
researchers James Noble and Robert Biddle
describe it, than Legos, with their uniform connections.
Software engineering expert Robert Glass sees
another problem with the Legos idea: The notion of
reusable software works on a small scale. Programmers
have successfully built and reused subroutines
of standard functions. But as software grows more
complex, reusability becomes a difficult or impossible
task. “It is simply a problem too hard to be
solved, a problem rooted in software’s diversity.”20
“Complexity is a deadly software killer,” says Yale
computer scientist David Gelernter, and he argues that
managing complexity is more of an art than a science,
and a difficult one at that, especially given the monumental
real-world systems today’s software attempts
to automate.21 And to the extent that these service-oriented
architectures use subsets of code from within
ERP and other enterprise systems, they do not escape
the mire of complexity built over the past 15 years or
so. Rather, they carry it along with them, incorporating
code from existing applications into a fancy new
remix. SOAs become additional layers of code superimposed
on the existing layers. That means it is possible
that a process will fail at some point due to some fault
in the layers below, and in order to understand and fix
that problem, software engineers will need to deal with
the layers of enterprise applications below the modular
business processes.
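A minimal sketch of that failure mode, again in Java and again with purely hypothetical names: the “modular” service below merely wraps a call into a legacy ERP module, so when bad data deep inside that module trips an exception, the new layer can do little more than pass the problem upward.

// Stand-in for an error raised somewhere inside a decades-old ERP module.
class LegacyDataException extends Exception {
    LegacyDataException(String message) { super(message); }
}

// Imagined client for the legacy ERP; here it simulates a failure caused
// by a malformed record written years ago.
class LegacyErpClient {
    String callTransaction(String transactionCode, String orderId) throws LegacyDataException {
        throw new LegacyDataException("Malformed customer record behind order " + orderId);
    }
}

// The modular manufacturing service offered to new SOA-style processes.
class LegacyBackedManufacturingService {
    private final LegacyErpClient erp = new LegacyErpClient();

    String scheduleProduction(String orderId) {
        try {
            return erp.callTransaction("PROD_SCHED", orderId);
        } catch (LegacyDataException e) {
            // The new layer can report the failure, but understanding and fixing it
            // still means working through the legacy code underneath.
            throw new IllegalStateException(
                    "Order " + orderId + " failed in the legacy ERP layer", e);
        }
    }
}

The wrapper adds a layer without removing any of the complexity beneath it, which is exactly the concern raised above.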
From the Deal Architect Blog
http://www.dealarchitect.typepad.com/
“The good news is [Ms. Rettig’s] article will get executive attention. Not
that they do not know. I recently met an executive at a client about to
start an ERP implementation. He sounded like a man headed to the gallows.
Nervous, not excited about the project. (That afternoon, I felt
really embarrassed for our industry that after 100K+ ERP projects, we still
cannot make it a no-brainer.)
But it is way past talking about messes. Companies are in various stages
of ERP hangover management, not always looking at software as a service
(SaaS), as those vendors would have you believe — it’s not that easy to rip
and replace a backbone ERP solution — but software as a customized service
(SaCS). [Those companies are in] aggressive re-negotiation of ERP
maintenance contracts or moving to third party maintenance.
The only ones who do not seem to realize the party is over are the
vendors, who are using service-oriented architecture (SOA), compliance
and more low payback justifiers to extend the run.
— Vinnie Mirchandani
Culturally, this long-term plan calls for closer and
closer communication and collaboration between
the IT and business sides of the organization. While such collaboration is much to be desired, it has proved difficult in the
past, and with increasing complexity in software systems,
it is unlikely to improve by itself in the future.
Differing backgrounds and perspectives, goals, even
vocabularies — all hamper efforts to improve communication
across this internal digital divide. Biases
intrude: A recent study by Forrester Research Inc. of
Cambridge, Massachusetts, found that only 28% of
CEOs thought their CIOs were proactive or creative
in terms of business process improvement.22 Forrester’s
advice to CIOs is to get more deeply involved
in the business issues and educate executives on what
IT is and what it actually does.
Sound advice, no doubt, but it may be time for
business executives themselves to become more
proactive. Executives could educate themselves
more about technology. They could send promising
younger executives to executive programs
designed to teach business people how to better
understand, communicate with and capitalize on
their IT. And business schools, too, could do better
at teaching the interdependence of business and IT.
At present, however, corporations see in software’s
seductive invisibility and seemingly open-ended
flexibility a never-ending frontier of promise,
where hope triumphs over reality and the search
for the next new thing trumps addressing difficult
existing problems. And hope, unfortunately, has
never been a very effective strategy.
Cynthia Rettig was director of knowledge management
for B2B consulting company Canopy International of
Newton, Massachusetts. She has consulted to software
companies for over 20 years. She can be reached at
[email protected].
REFERENCES
1. T. Friedman, “The World Is Flat: A Brief History of the
Twenty-First Century” (New York: Farrar, Straus and Giroux,
2005), 128.
2. J. Dedrick, V. Gurbaxani, and K.L. Kraemer, “Information
Technology and Economic Performance: A Critical
Review of the Empirical Evidence,” ACM Computing
Surveys 35, no. 1 (March 2003): 18.
3. J.W. Ross, P. Weill, and D.C. Robertson, “Enterprise
Architecture As Strategy: Creating a Foundation for
Business Execution” (Boston: Harvard Business School
Press, 2006), 11.
4. J.W. Cordata, “Progenitors of the Information Age:
The Development of Chips and Computers,” in “A Nation
Transformed By Information,” eds. A.D. Chandler and
J.W. Cordata (New York: Oxford University Press, 2000),
206-208.
5. R.L. Glass, “Facts and Fallacies of Software Engineering”
(Boston: Pearson Education, 2003), 58.
6. M. Campbell-Kelly, “From Airline Reservations to
Sonic the Hedgehog: A History of the Software Industry”
(Cambridge: MIT Press, 2004), 198.
7. K.-K. Hong and Y.-G. Kim, “The Critical Success Factors for
ERP Implementation: An Organizational Fit Perspective,”
Information & Management 40, no. 1 (October
2002): 25.
8. L.M. Hitt, D.J. Wu and X. Zhou, “Investment in Enterprise
Resource Planning: Business Impact and
Productivity Measures,” Journal of Management Information
Systems 19, no. 1 (Summer 2002): 71-98.
9. N. Carr, “Does IT Matter? Information Technology and
the Corrosion of Competitive Advantage” (Boston: Harvard
Business School Press, 2004).
10. C. Koch, “The Monopoly That Matters More Than
Microsoft,” blog in CIO Magazine, Nov. 13, 2006, http://advice.cio.com/the-monopoly-that-matters-more-than-microsoft.
11. R. Pool, “Beyond Engineering: How Society Shapes
Technology” (New York: Oxford University Press, 1997),
137.
12. C. Mann, “Why Software Is So Bad,” Technology
Review (July/August 2002).
13. J. Pontin, “Bjarne Stroustrup: The Problem With Programming,”
Technology Review (January-February
2007): 22.
14. J. Nicoli, interview with author at Trillium Software
(Billerica, Massachusetts), August 8, 2005.
15. S. Rosenberg, “Anything You Can Do, I Can Do
Meta,” Technology Review (January-February 2007): 45.
16. M. Wheatley, “ERP Training Stinks,” CIO Magazine
(June 1, 2000).
17. W. Orlikowski and C.S. Iacono, “The Truth Is Not Out
There: An Enacted View of the ‘Digital Economy,’” in
“Understanding the Digital Economy: Data, Tools and
Research,” eds. E. Brynjolfsson and B. Kahin (Cambridge:
MIT Press, 2000), 355.
18. Ross, “Enterprise Architecture As Strategy.”
19. S. Rosenberg, “Dreaming in Code: Two Dozen Programmers,
Three Years, 4,732 Bugs, and One Quest for
Transcendent Software” (New York: Crown Publishers,
2007), 94-95.
20. Glass, “Facts and Fallacies of Software Engineering.”
21. D. Gelernter, “Mirror Worlds: Or the Day Software
Puts the Universe in a Shoebox … How It Will Happen
and What It Will Mean” (New York: Oxford University
Press, 1992), 51.
22. S. Shay, “CEOs Rate IT: Steady But Uncreative,”
CIO Magazine (April 1, 2007): 20.
Reprint 49101. For ordering information, see page 1.
Copyright © Massachusetts Institute of Technology, 2007.
All rights reserved.