

Sophisticated Tools or Afterburners on a Mule?

Contributed by Intellos Systems, Inc.
Author: John Dohm

 

 

Abstract 

The software industry has produced a large number of tools to provide corporate performance management (e.g., balanced scorecards), organizational alignment (commonly referred to as work intelligence), portfolio management, and collaboration/brainstorming.  These tools are seldom integrated with one another, are often overly complex, and many simply go unused.  This paper investigates how far the software industry has gotten ahead of the mainstream need for software solutions.  While the particular focus is organizational effectiveness - primarily as it relates to projects - most of the commentary is applicable more broadly.

Introduction

Tools are meant to support easier and more efficient ways of completing tasks.  New technologies allow more sophisticated tools to be developed, often eliminating elements of work that were previously error prone, difficult, or banal.  However, as tools with integrated software evolve, information about processes can be, and is, embedded in the tools.  This embedding introduces a core risk that must be considered.  Specifically, the risk lies in balancing the reality that the knowledge embedded in the tool (again, typically as software) is not understood by the user against the increased value that embedded knowledge brings to improvements in the trade or profession.

For example, at one time, scientists were required to derive and calculate formulas and perform complex mathematical calculations by hand.  Over time, shortcuts in the form of tables and slide-rules were provided that allowed repetitive tasks to be completed more quickly.  The advantages were substantial - consistency of calculation and significant time savings - and outweighed any perceived value in performing the calculations.

As technology progressed, various other support devices were invented to augment the scientist.  However, each of these devices pre-supposed that the user understood the basis and process by which the tool supported the activity.  Basically, scientists could always revert to a backup (memory or a reference book) or use an earlier set of tools.  Computers and robotics have changed this ability to revert to a “backup.”  Dramatic improvements in hardware technology over the last twenty years have made it possible to create sophisticated software that can embed just about any process.  Unlike before, however, the processes that are embedded are frequently undocumented, or at least not written in publicly available texts.  Furthermore, the knowledge within the software is not necessarily “common wisdom” that is stored and practiced in the minds of practitioners.  Instead, the knowledge is expressed in lines of code written by software developers around the world.  Worse yet, the code is often unintelligible to the very people who actually understand the nature of the problem that the software is trying to address.

So, the key questions are:  Has software advanced to a point where the embedded knowledge is ultimately a threat to effective practice?  Or, is the reality that a sophisticated producer of software enables the rest of society to ignore the foundation of a craft and leap to new heights?  Moreover, are there areas where the “right” processes are well understood, but no one has bothered to codify them in software for some reason?  These questions will be explored in this paper.

KNOWLEDGE EMBEDDED IN SOFTWARE VERSUS BOOKS

One might argue that books are just a form of embedded software, and there are enough books on any given topic that the underlying knowledge cannot be lost.  Such an argument appears correct on its face, but there are a few core differences that make software solutions unique.

First of all, books simply describe a process.  The reader must read, interpret, and apply the process to be successful.  Different books may describe the same underlying topic differently and may offer alternate ways of drawing conclusions.  The reader can look at any number of books on any given topic and make a decision as to which books are most applicable to their situation.  Prior to books, this process would have involved visits and discussions, ultimately leading to slow, deliberate progress.

The interesting part is that books must stand on their own merits.  A book that leads the reader to a conclusion must be sufficiently convincing that a reasonable number of readers can leverage the content to improve their thinking or performance.  Writings where the arguments and postulations are difficult to follow, verify, and validate are often ignored.  Even where the concepts presented have merit but the underlying support is unsatisfactory, future authors have the opportunity to build on the writing and advance the base of knowledge.

Alternatively, in the case of software, the process or conclusions drawn are almost always hidden from the user.  The design is completed by one team, the development by another group of people, the documentation by a third group, with marketing and sales attempting to craft messages to the potential buyer based on their interpretation of needs in the market.  So, unlike books, customers are typically buying based on the representation of the quality of thinking that went into the software solution, independent of the manifestation of the solution.  More clearly put, the designers and developers do not provide access to their logic, yet the logic is manifest within the software developed.

The result of this situation is that the more feature rich the software, the less probable it is that the customer will be able to understand and test the embedded process.  Further, since most software is bought without a comprehensive understanding of the use of the processes that the software provides, the ability to anticipate which parts of the software will be needed (and, therefore, validated) is hampered greatly.  So, if the software has a great deal of embedded knowledge, and the user is not cognizant of which parts of the embedded knowledge will be leveraged, the user is constantly in a state of having to trust the software developer.

This trust, in the end, is the core difference between published software and published text.  One can debate and verify the content of text.  Authors are frequently questioned as to differing viewpoints or asked to support their conclusions.  In the case of software, the only way to verify that the solution is satisfactory is to use the solution in a particular situation and validate that the embedded process within the software consistently meets the needs of the customer.  If it does not, there is little opportunity for debate.  Instead, the choices are reporting a problem, simply altering the way of doing things to accommodate the software, or customization.  Let’s explore these choices.

Report a Problem

Say that you have found something in a software solution that seems to be in error.  Since you typically have no access to the process, the only way to test for the error is to supply inputs and inspect the outputs.  Assuming that the software does the same thing the same way every time, this process should yield a consistent set of results for error reporting.
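As a concrete illustration, a minimal black-box check might look like the sketch below.  The function opaque_tool is a hypothetical stand-in for the vendor's software (not a real product or API); the check simply fixes the inputs, runs the tool repeatedly, and confirms that the outputs are consistent before a defect is reported.

# A minimal sketch of black-box error verification, assuming no access to the
# vendor's source code.  "opaque_tool" is a hypothetical stand-in for the
# purchased software, not a real product or API.

def opaque_tool(inputs):
    # Placeholder for the vendor's embedded process; in practice this would be
    # a call into the purchased software.
    return sum(inputs) / len(inputs)

def verify_consistency(inputs, runs=5):
    # Run the tool repeatedly on fixed inputs and collect the outputs.
    # Identical-but-wrong outputs give a reproducible case to report;
    # varying outputs suggest environmental factors are in play.
    outputs = [opaque_tool(inputs) for _ in range(runs)]
    return all(out == outputs[0] for out in outputs), outputs

if __name__ == "__main__":
    consistent, outputs = verify_consistency([2, 4, 6, 8])
    expected = 5.0  # the result the user believes is correct
    if consistent and outputs[0] != expected:
        print("Reproducible discrepancy: got", outputs[0], "expected", expected)
    elif not consistent:
        print("Outputs vary across runs; suspect environmental factors")
    else:
        print("Tool output matches the expected result")

Even this trivial exercise only establishes reproducibility; it says nothing about why the embedded process behaves the way it does.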

Unfortunately, because many software solutions interact with other software solutions, this kind of error reporting is quite difficult.  A software developer is capable of tracing (i.e., watching their program execute), but the customer is typically not privy to the software code.  Now, one might ask why software code from developers is not readily available.  For decades, there has been one sort of “open source” (i.e., publicly available copies of software code) or another.  Without digressing on a topic outside the scope of this paper, suffice it to say that commercial software publishers have had more financial success, and probably more influence, than any “open” solution.

So the moral of the story is that an error, even if a user could reliably demonstrate the error, can always be attributed to a lack of understanding of the process embedded in the software (to which the user typically has no access or, if they have access, insufficient knowledge of how to contend with the error) or inconsistent environmental factors (e.g., different machines, interactions with other software.)  As such, error reporting for software is, in many cases, marginally effective.  The reaction is a renewed emphasis on an old approach of centralized software publishing and execution.  This used to be called centralized computing or mainframe computing, but in the new era this is commonly referred to as Software As a Service, or SaaS, often provided by an Application Service Provider, or ASP.

Change What You are Doing

Whether using purchased software or leveraging a SaaS/ASP model, the safest choice is to adapt your thinking to that of the software you use.  Software publishers frequently tell customers that their processes are well thought out and designed, and, therefore, should be adopted straight away.  For those customers who accept this story, a few key items might be of interest.

First, a well-thought-out process should be published.  The best way to ensure something is designed well is to open the design to public scrutiny.  Software publishers rarely publish their designs - much less their actual software - claiming that this would facilitate copying or replication.  This argument is nonsense.

The basic argument is that some entity, if they understood the process and had the software, could enter the market and deprive the software producer of profits.  This is unlikely for at least three reasons: 

1) Some entity would have to make an investment in creating, maintaining, marketing, and supporting the software.  This sort of endeavor has substantial barriers to entry and requires a great deal of faith;

2) A theoretical competitor would be bringing an undifferentiated product to market, and it would be necessary for the competitor to “out execute” the original firm even when that original firm has developed substantial intellectual property and know-how; and

3) Any firm replicating existing software or inappropriately leveraging copyrighted software would be subject to substantial legal penalties.  Furthermore, copying and/or reverse engineering software is already quite straightforward, so legal barriers, not secrecy, must be the primary enforcement mechanism.

Second, if your organization cannot differentiate itself with the particular set of processes embedded in the software, the goal should be to minimize cost.  In other words, put the cheapest, functional solution in place.  The idea of spending a great deal for undifferentiated process in software is counter to good business decision making.  In fact, this is close to the definition of a commodity.  If no one can differentiate, the price of the software should continue to go down.

Third, if your organization thinks that differentiation is important, then why are you buying a commercial software solution?  You are clearly in the realm of custom solutions.

Customization

If the processes that you need are not standardized, and/or there is, in fact, value to be added from innovation, you are clearly in the camp of customization.  The core concept in customization is that by embedding a unique set of knowledge in a software solution, you can obtain benefits not easily achieved by a competitor.  So your choice is to customize an existing solution or build a solution from scratch. 

Customizing an existing solution assumes that the software publisher has incredibly well engineered software.  Without clear and well documented processes, as well as support for the entry and exit points to these processes, the customization job is massively expensive and inflexible.  Absent a comprehensive architecture and tremendous trust in the vendor, the customer is subject to changes to both the design and the processes embedded in the solution.

Not that building a solution from scratch is necessarily much cheaper.  However, the incremental cost allows the customer a far greater degree of control and increases flexibility greatly.  In cases where the understanding of the processes is evolving or where the processes themselves are evolving, custom design and development is the best bet.

IT IS NOT A BOOK

While our discussion of process in software is incomplete, it should be clear that a great deal of software amounts to insufficiently understood embedded process.  Why should a project manager care about this?  Because many of the software tools supporting portfolio, program, and project management are process free or process incomplete.  The tools assume that a process exists and provide common services that the designers believe would apply to projects.  As such, many of the tools have an excessive number of features and capabilities that are seldom used because they have no context.

Okay, so you may be wondering: what is the point?  The point is that tools need to help doers organize their work, managers ensure that the work is on track, and executives maximize the business value from the work.  So the logical question is this:  does your organization have a process to ensure that action is linked to intent?

This is what tools are for - to provide an expedient way to complete work.  Tools facilitate action.  Tools improve the efficiency of a given task or set of tasks, they allow new things to be done that were not previously viable, and they exist within the context of a particular opportunity or problem.  Since tools are contextual and fit within a process, the core assumption is that the engineering behind the tool required a reasonable understanding of the process behind getting work done.

This is where the problem exists.  Many of the tools that support organizational or group effectiveness are absent of process, and most of the customers who buy these tools are process incomplete or process free as well.  The result is tools without process or Afterburners on a Mule.

AFTERBURNERS ON A MULE?

The concept is simple - an engineer wants the mule to go faster.  Mules are beasts of burden, so a mule that can steadily carry a large load and also move fast is appealing.  Therein lies the rub - the mule can sure-footedly carry a heavy load because it is slow.  Putting afterburners on the mule makes it unstable.  Unstable mules are useless, so fast mules are useless.

To see if we are buying afterburners for mules, look at the three parties trying to drive work in organizations and teams:  the financier (who wants a return on their investment with as little risk as possible); the project manager (who wants to get projects done); and team members (who want to do work with an appropriate amount of effort.)

The Financier

The financier - the project's sponsor - sets the direction for the work to be undertaken.  They are responsible for providing funding and selecting a good project manager.  They are accountable for delivering some sort of business result.  No project can begin without a sponsor, as there would be no reason to undertake the work.  As such, a process must exist to ensure that the sponsor knows what is necessary to be a sponsor.

The Project Manager

The project manager negotiates the project constraints (i.e., scope, schedule, budget, quality) to establish an expectation as to what will be delivered to the sponsor.  The project manager is responsible for hiring/firing staff and ensuring that the output meets the needs of the sponsor.  They are accountable for delivering against the expectation set with the sponsor.  Project managers must be familiar with the processes that support project performance and monitoring.

The Team Members

The team members negotiate their tasks within task constraints (i.e., scope, schedule, budget, and quality) to establish an expectation as to what will be delivered to the project manager (N.B. same concept applies if there are team leads.)  The team member is responsible for estimating their work and reporting performance so that the project manager can effectively carry the project forward.  They are accountable for delivering work elements within the expectation set with the project manager.  Team members need to know how to estimate and deliver within their estimates.
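To make the division of responsibility concrete, the sketch below models the three roles as a minimal set of Python data structures, assuming a simple chain of accountability: sponsor intent, project constraints, and task estimates.  All names and fields are illustrative assumptions drawn from the descriptions above, not a reference to any existing toolset or standard.

# A minimal sketch of the role structure described above.  Every name and
# field here is an illustrative assumption, not part of any existing tool.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Task:
    description: str
    estimate_days: float       # the team member's estimate
    actual_days: float = 0.0   # reported performance

@dataclass
class Project:
    scope: str                 # constraints negotiated with the sponsor
    schedule_weeks: int
    budget: float
    quality_target: str
    tasks: List[Task] = field(default_factory=list)

@dataclass
class Sponsorship:
    intent: str                # the business result the financier is accountable for
    funding: float
    project: Project

    def action_linked_to_intent(self) -> bool:
        # Crude check: work has been estimated and the negotiated budget
        # does not exceed the sponsor's funding.
        has_estimates = any(t.estimate_days > 0 for t in self.project.tasks)
        return has_estimates and self.project.budget <= self.funding

# Usage: a small departmental project with two estimated tasks.
proj = Project("Migrate invoicing", schedule_weeks=12, budget=90000.0,
               quality_target="no regressions",
               tasks=[Task("Data migration", 20), Task("Cutover rehearsal", 5)])
sponsor = Sponsorship(intent="Reduce billing errors", funding=100000.0, project=proj)
print(sponsor.action_linked_to_intent())   # True

Even this trivial structure makes the point that follows: the value lies less in any particular field than in every affected party agreeing to the same grammar.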

AFTERBURNERS ON A MULE!

Given these broad definitions of the roles, processes must exist at all levels and they must be standardized across the affected parties in an organization.  This is true because the affected parties vary dramatically based on the scope of the work.  Some projects are departmental, others affect many parts of the organization, and a few integrate with external entities.  The latter are becoming more common, accentuating the needs for standardization.

Unfortunately, the mechanisms by which negotiations take place, work is planned, prioritized, and estimated, and how resources are allocated, differ greatly.  This is why project management, and for that matter, many management tools, are ineffective.  Customized processes for common problems, often developed on an ad-hoc basis, are, in many cases, not communicated, documented, or followed.  And while an argument for on-the-fly customization can be made, any benefits reaped from an inconsistent process for project work are vastly outweighed by the cost.

So we have a choice - buy sophisticated tools and put afterburners on the mule or get the right processes in place, then leverage tools where they supply value and build whatever you cannot buy.  Since only the latter makes sense, the sole remaining question is whether an organization has the will and discipline to define, document, and follow a common set of processes.  Why would they?

LET’S LOOK AT THE WEB

Years ago, there were arguments over network topologies and protocols.  Topologies are mechanisms by which information is distributed to different parts of a network.  Protocols are the rules, similar to the grammar of a language, that govern the transmission of the information.  After a decade or so, Ethernet became the standard topology and Internet Protocol (IP) became the standard protocol.  Once this standardization took place, connecting computers of all types became easy.

This foundation was (and is) necessary to maximize the value of the connected systems.  Without some basic standardization at a fairly low level, vendors would have competed on their support for different solutions in the market.  The advantage of standardization is that no vendor could rationalize creating a competing solution, because the value of connecting to the common network was of more import than any differentiation at that level.

Most software faces a fundamental tension.  The producer of the software is ultimately trying to create a solution that is general enough to be widely adopted but specific enough that it solves a real need.  Like the example with Ethernet and IP, most software solutions are attempting to take a set of processes and standardize those processes as part of their value proposition.

The hard part is determining when the process in a particular solution is good enough to have a reasonable chance of becoming a standard solution.  In the case of networking, the company 3Com was created to promote Ethernet networking.  Ethernet became the standard topology and 3Com did very well even though there were competing solutions.

For project management, no one has defined the equivalent of Ethernet.  The market is fighting at every level of the project management stack, whether it be corporate performance management (e.g., balanced scorecards), organizational alignment (commonly referred to as work intelligence), portfolio management, or collaboration/brainstorming.  As such, little progress is being made.

While efforts have been made to drive towards a standard vocabulary, these efforts are generally proving futile (at least in the short run.)  What is needed is the grammar, a standard way to organize information with respect to projects.  This grammar must be broadly applicable, which means it must be simple, practical, published, open to public scrutiny, and, in my opinion, embedded in a software solution. 

Furthermore, since the roles associated with projects are fairly generic and generally applicable, there is great opportunity for standardized process that can be embedded in a toolset.  One could argue that there is an excellent opportunity to apply standard methods today.  The value from any given organization developing a process for managing a project portfolio is fairly limited since the elements and steps tend to be quite common.  Nonetheless, many organizations and departments continue to spend time creating processes under the assumption that there is some substantial benefit from process development.  Of course, tool vendors accommodate this practice by providing workflow components, forms development capability, and methodology customization. 

While the benefit of inventing project process is limited, the value from an organization (or industry) adopting a standard way of linking projects to intent, planning work, and monitoring progress is great.  This is a big statement, but there is a reasonable case to be made that re-inventing project process is not working.  Moreover, there is a fair amount of evidence that education, coaching, and support, while necessary, are insufficient to drive improved value for dollar invested in projects.

Ultimately, this paper is advocating the adoption of a standardized process for driving Business Intelligence through Project Intelligence.  While some will argue that methodologies are plentiful and cheap, few have progressed to the level where they meet the criteria mentioned (i.e., documented, publicly available, and embedded in a toolset.)  Moreover, none seem to be sufficiently comprehensive such that they can meaningfully link strategy to action.

In conclusion, until organizations focused on improving project results adopt a standard set of documented processes that are well understood and can easily be deployed, the use of tools in the project management space, and many other software “solutions,” will continue to be Mounting Afterburners on a Mule. 


Visit the author's web site:  http://www.intellosys.com/


 

