Ontonix Complexity Measurement: A New, Comprehensive Metric for Project Management

2 September 2015 (© Ontonix) | The dynamism and volatility of a project or a portfolio (frequently changing scope, objectives, priorities, resource allocation, etc.) imply that the project can no longer be managed and governed with static or predefined metrics, methodologies and best practices.

As the environment in which a project must be managed becomes uncertain and unpredictable, it becomes necessary, if not mandatory, to measure, manage and control the complexity of the project.

This is particularly true for mega-projects that involve multiple, often geographically dispersed stakeholders and that affect multitudes of beneficiaries. Such projects are too big to fail and may be termed too complex to fail (TCTF). Highly complex projects of this kind, which offer, introduce or enable innovative technologies and must respond to organizational changes or business needs, are inherently fragile, and their fragility is proportional to their level of complexity. A project with a fragile structure can suddenly, without warning, exhibit behaviors and reactions that lead to unexpected results out of line with the defined objectives. It is therefore preferable to design and maintain a less complex project that provides the same level of performance and results.

If properly controlled and managed, complexity becomes a critical success factor in the development and implementation of projects.

Therefore, it becomes important to adopt a comprehensive metric to objectively measure the complexity of large, complex and unpredictable projects or program environments.

Graci, Deshpande & Martin (2010) introduce how the complexity of any project can be measured. Through its measurement, complexity can provide significant added value to management: first, as an early warning indicator that can forecast and forestall possible crises in time-sensitive situations; second, from a business intelligence point of view, by identifying the main factors that generate or increase the level of complexity. The goal of managing TCTF projects is essentially to decrease the level of complexity to the "physiological" limits of the project, where complexity is properly balanced between benefits and risks.
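As a minimal sketch of the early-warning idea (the function, the readings and the 90% warning threshold below are illustrative assumptions, not the metric defined in the paper), one can track a project's measured complexity against its critical upper bound over time and raise an alert as the limit is approached:

```python
# Minimal sketch of a complexity-based early-warning indicator.
# Values and thresholds are illustrative, not Ontonix's actual metric.

def early_warning(history, critical_fraction=0.9):
    """Return indices of samples whose complexity approaches its bound.

    history: list of (complexity, critical_complexity) pairs over time.
    """
    return [i for i, (c, c_max) in enumerate(history)
            if c >= critical_fraction * c_max]

# Weekly (complexity, critical complexity) readings of a project:
readings = [(32.1, 50.0), (35.4, 50.0), (41.0, 50.0), (47.2, 50.0)]
print(early_warning(readings))  # -> [3]: the latest reading nears the limit
```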


Graci, G., Deshpande, B. & Martin, D. (2010). Ontonix Complexity Measurement: A New, Comprehensive Metric for Project Management. Ontonix.

Comments

  1. […] Complexity – Complexity refers to the number of factors that we need to take into account, their variety and the relationships between them. The more factors, the greater their variety and the more they are interconnected, the more complex an environment is. Under high complexity, it is impossible to fully analyze the environment and come to rational conclusions. The more complex the world is, the harder it is to analyze. […]


  2. The Enterprise Titanic SOS and the Technology “Ice Berg”
    The RMS Titanic sank after hitting an iceberg on April 15, 1912, a sinking deemed impossible by design. Hubris and greed caused one of the deadliest commercial maritime disasters in modern history, with over 1,500 victims. Corporations and governments now navigate "ice-cold waters", ever faster, in a situation similar to the famous RMS Titanic disaster.

    1.8 Billion USD Box Office Hit and 11 Oscars
    James Cameron, Jon Landau and Stereo D needed 60 weeks and an $18 million budget to produce the film's 3D conversion. Titanic was one of the biggest hits in the history of cinema: a total budget of $200 million, $1.8 billion in global revenues and 11 Oscars, including the Best Picture and Best Director awards.

    Today the rapid proliferation of technology "icebergs" makes up an increasingly uncontrollable command-and-control environment of unprecedented proportions.

    Governance, risk and compliance (GRC) mandates appear increasingly disconnected and costly to implement. Legislators and consultants try to keep the "Corporate Titanic" from sinking again while the music is still playing. GRC is now a hot topic for most MNCs (multinational corporations), and the big advisory consultancies are jockeying for a piece of the action.

    Governance and Risk Quantification, a long and costly process
    MNCs must prove that everything was undertaken in accordance with good governance and ethics, to reduce risk exposure and avoid breaking the law. This is especially hard because manual GRC best practice is too slow and too costly (always out of date), as the target is constantly changing and moving.

    Mr D. Stein, General Partner at Endeavour Capital Limited of Wellington, New Zealand, sums up today's GRC efforts: "Never the less it is my belief that some boards have assumed a position of almost arrogance and conferred self-importance upon themselves, conveniently forgetting that they are accountable to the shareholder/investors."

    However, for most "normal" (small to medium) and national companies, GRC is simply too complex, expensive and time-consuming a process.

    By not addressing GRC (where is the ROI?), corporations risk losing control and becoming regulated subjects. Yet it is better to regulate complexity and risk yourself than to be regulated by others, for the voyage is long and perilous.

    The “sinking” of our economic and social life?
    Both corporations and governments are about to produce their own "Titanic disaster" remake: not a new cinematographic success, but the "sinking" of our own earthly economic and social existence.

    The idyllic vision of a connected world in which Systems of Systems (SOS) depict fully integrated and streamlined business and living processes is fraught with issues. All technocrats say technology is the Godly answer to all our problems in business, banking, logistics, manufacturing or any other domain. Amen.

    People, networks, markets and processes are sucked into a system of critical interdependencies: a chain that breaks at its weakest link, with unprecedented consequences and damage "impossible by design". Too often "the human factor" is offered as the excuse when the culprit is simply the excessive and uncontrolled growth of complexity.

    “Iceberg SOS” First Class Passengers and Captain. Abandon Ship!

    Second and Third Class Passengers: Everything is Under “Control”
    Enterprises' economic, human, social and material situations increasingly resemble the famous RMS Titanic disaster.

    No Systemic Thinking: We maintain “Watertight Compartments”.
    The human mind is good at breaking issues down into smaller, manageable "chunks", and academia fosters linear thinking. Academics are "caught" in a mental "straitjacket" when what everyone actually needs are lifeboats and life vests.

    As velocity, complexity and the number of interactions keep growing, managers increasingly lose oversight, foresight and insight. Systemic decision-making ability diminishes as the number of interconnected, dependent parameters and the uncertainty increase.

    “Iceberg” Crisis Anticipation
    Today's "Titanic iceberg" is easily replaced by computer networks, traffic control systems and complex high-tech processes such as nuclear reactors, energy networks, water and waste management or big ERP systems, to mention just a few.

    When Governance is Out Of Control
    When governance is out of control, systems can suddenly exhibit irrational collapse or malfunction without any apparent logic, and this can generate unpredictable consequences. Some well-known examples are Chernobyl, Fukushima, TBTF banks, telecom networks, ATC and even governments' economic strategies and social systems.

    Systemic Risk Exposure
    Individuals must learn to control irrational emotions; enterprises and governments must learn to measure their own complexity, taking into account their systemic risk exposure. Every piece of the puzzle interacts, and by reducing complexity we effectively limit risk exposure. However, the unforeseen can never be calculated.

    “Acts of God” or “the Human Factor”
    However, enterprises are also dependent on markets, competition, suppliers and customers. Unforeseen failures are often referred to as “Acts of God” or “the Human Factor”.

    Needless to say, this is a somewhat vague excuse too often used by TBTF institutions.

    Growing complexity affects efficiency and profitability. More technology also adds fragility, accentuating the problem.

    OCAP – Out of Control Action Plan
    When a process is "out of control", the process owner looks for assignable causes by following the out-of-control action plan (OCAP) associated with the control chart. An out-of-control signal refutes the assumption that the current "control data" comes from the same source as the data used to design the initial control chart limits and process control model.
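    As a minimal sketch of the control-chart logic behind an OCAP (the mean ± 3 sigma limits are the standard Shewhart convention; the data here is invented for illustration):

    ```python
    # Shewhart-style check: flag points outside mean +/- 3 sigma limits
    # computed from an in-control baseline sample.
    import statistics

    def control_limits(baseline):
        mu = statistics.mean(baseline)
        sigma = statistics.stdev(baseline)
        return mu - 3 * sigma, mu + 3 * sigma

    def out_of_control(baseline, new_data):
        """Return points in new_data breaching the baseline limits, i.e.
        refuting the assumption that the process is unchanged."""
        lo, hi = control_limits(baseline)
        return [x for x in new_data if x < lo or x > hi]

    baseline = [10.1, 9.8, 10.0, 10.3, 9.9, 10.2, 10.0, 9.7]
    print(out_of_control(baseline, [10.1, 9.9, 12.5]))  # -> [12.5] triggers the OCAP
    ```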

    In-depth reading: http://www.ontonix.com/Crisis-Anticipation.htm

    Complexity resides everywhere, including in management information systems (MIS). MIS executives struggle to steer their "ship" while factoring in all constraints, and sometimes simply ignore them, which is like playing with "fire".

    Core and Non-Core Business Activities.
    Enterprises increasingly separate core from non-core processes.

    Competitive cost pressure and ROI mandates also affect IT/MIS departments, as well as product design, development, manufacturing, delivery and services (e.g. the cost of SAP ERP operations).

    New digital services incessantly automate non-core and inefficient manual systems.

    Error-prone humans are a high cost to enterprises, which always seek a competitive cost advantage. We therefore see a proliferation of new digitized online services, such as e-learning, procurement, recruitment and mobile applications, accelerating business process re-engineering (BPR) efforts.

    How to Quantify Moving Technology “Icebergs”.
    Quantifying the digitized corporate ecosystem is mandatory in order to quickly understand (monitor) and pinpoint bad complexity, the cost that must be cut first. This demands both vision and willpower to prevent the "ship" from forging ahead into ice-cold waters. Complexity can be measured (quantified) and managed in three ways:

    (i) Internet: "top down, from the outside in". Quickly diagnose big hidden "icebergs" by exploiting reporting data from existing KPI business processes. A DIY offering for SMEs/SMBs that cannot afford "classical" BI/DW solutions and tedious model-building (mostly in spreadsheets such as Microsoft Excel).
    More in-depth reading: http://www.ontonix.com/CloudSolutions.htm

    (ii) In-House: complexity quantification (asynchronous or real-time) for mission-critical systems or highly confidential data. This can be integrated with any software platform (ERP system, database, data warehouse, business intelligence system or workflow management system). More on Enterprise QCM solutions: http://www.ontonix.com/EnterpriseSolutions.htm

    (iii) Supercomputer: deep complexity data mining of up to 1 million parameters with the assistance of supercomputers, to understand very large ecosystems, improve governance and better understand risk from a strategic perspective.
    More on strategic insights: http://www.ontonix.com/file/ontonix_extreme_2014_public.pdf

    Vertical Expert Solutions
    Via our strategic partners (Asset Dyne, RAS and multiple OEMs), we simplify and improve the support and resolution of complex decision-making. Examples are stock portfolios, market risk by industry, macro-economic risk, as well as lifesaving holistic medical technologies.

    Ontonix technology targets visionary clients seeking an efficient command-and-control center for digitized data processes, to embrace and improve rapid, high-quality decision-making while reducing costs and cutting risk exposure.

    360° Management Insights
    Today, KPI consultants aggregate data bottom-up using business intelligence (BI) tools and data warehouses/marts (DW/DM). Management wants 360° insights to govern better, and information is fed from ERP systems such as SAP, Oracle Financials or Microsoft Dynamics. These "classical" data sources claim to hold the "truth" needed for better business decisions. SAP currently has over 46,000 enterprise customers worldwide and is acclaimed as the most highly functional ERP solution for MNCs, albeit complex and expensive to implement and operate.

    To address this "paradox", a certified expert system from our business partner West Trax completely measures SAP's inner workings (KPI analysis) for efficiency, and benchmarks good and bad usage against similar peer groups by size and industry.

    A surprisingly low number of SAP installations are well implemented; this represents a complexity risk and a major financial drain on precious MIS budget resources.

    We will discuss this in more detail in the forthcoming Part 3 of The Enterprise Titanic SOS and the Technology "Ice Berg", covering how to help organizations measure their own SAP "iceberg". You will learn how to cut SAP MIS costs and reduce risk exposure, with the objective of improving corporate ROI by exploiting SAP-centric KPI dashboards and benchmark analytics.

    And the Titanic voyage continues…

    ERP is expanding by 10% each year among SMEs

    The ERP market is growing by 10% year on year; market leaders like SAP, Oracle and Microsoft all have their advantages and drawbacks. Now that MNCs have (mostly) implemented SAP (at high cost), the biggest growth is found in the SME market segment. SME ERP implementation budgets are more constrained, hence optimum value for money is a "hot issue" for CEOs and CFOs, putting extreme stress on CIOs, ERP users and the chosen implementation partner.


    Running late and over budget

    Despite the improved enterprise control gained from ERP systems, most ERP implementations remain complex (and costly), with projects often running late (and over budget). Retooling and centralizing your enterprise operations "command and control center" is also done to cut operational costs. Implementing ERP involves business process design, human retraining and massive communication efforts, hence measuring ROI and added value is a key parameter for management. When ERP projects go awry (25% in the EU, reportedly up to 80% in developing countries), top management gets grey hair and the "tough boys" get going; when your project goes off track, the dark clouds (and costs) pile up.

    Taking your firm to "ERP Nirvana" is no easy task; ERP systems are often out of date upon implementation. No silver bullet exists, as the "target" keeps moving.

    Photo credit Amazon: Victorinox® Swiss Army Knife

    237 Billion USD wasted on Bad Complexity

    Even SAP's own CEO, Bill McDermott, publicly admits that complexity costs the world's top 200 MNC firms on the order of US$237 billion (i.e. 10% of profits lost under a mountain of waste, inefficiency and missed opportunity); we call this bad complexity!

    SAP Run Easy is not so “Easy”

    McDermott stresses this while advocating the "Run Easy SAP" campaign towards cloud-based SAP S/4HANA. Migrating to the HANA cloud is not so "easy", even for SAP and its estimated 47,000 user sites. Migrating also requires being on the latest SAP release, which is rarely the case.

    Mr. Giuseppe Graci, Managing VP of Consulting Services at Ontonix, states: "Even before contemplating any ERP system (upgrade/transformation), organizations must analyze and understand their resilience and relative complexity, even more so for non-integrated best-of-breed systems moving to ERP.

    "Measuring enterprise resilience and complexity helps management quickly identify where the biggest productivity prospects can be found (effectively setting the priority agenda for your ERP transformation). By comparing the before-ERP and after-ERP QCM benchmarks, we obtain a reliable benchmark of ROI improvements. This can of course also be performed when an existing ERP system is upgraded or enhanced," concludes Mr. Graci.

    Our ERP Implementation Partner Could not Get it Right

    When clients of ERP systems report "implementation issues", the blame is almost invariably linked back to the client's procurement attitude and relative ERP incompetence; the ERP software itself is not the issue. Yet implementation partners also need to be held accountable for their failures, which in turn sour the stomachs of customers.

    One ERP client reported: "We have been implementing SAP for almost two years now. We are only a simple call center and our implementation partner could not get it right." This happens when buyers nickel-and-dime too much, and when they do not understand the organizational transformation tasks to be accomplished (mainly HR and process education). Here, innovative training tools come to the rescue: a new technology player such as KST-Corp, with its innovative i-Learning platform, facilitates both staff training and certification through a less costly, less boring and highly dynamic multimedia experience. Its "pay by the drink" cloud formula has proven both popular and highly cost-effective.

    Auditing the Enterprise Information, KPI, Benchmark and Complexity Picture

    Ontonix, jointly with West Trax, offers a unique and exclusive solution to the inefficiency and wasted opportunity so many top managers struggle with. We supply quantitative hard facts, so that decisions rest on solid evidence rather than opinionated sentiment.

    60% of SAP Complexity found in Customized Code

    By simply measuring and analyzing both the relevant enterprise data and your full SAP ERP system, we accurately pinpoint most issues before, during and after implementation. Implementing and upgrading SAP is often a never-ending project, and too much "customization" code is installed that is sometimes neither productive nor useful. West Trax states that 60% of SAP complexity can be found in customization developments (ABAP).

    Divestitures, Mergers & Acquisitions, Restructuring

    "With the permanent crisis, our economy has become unstable; make no mistake about complexity and run as lean and mean as possible," says Ms. Diana Bohr, CTO of West Trax. Due to eroding margins, businesses consolidate into bigger groups to cap the fixed-to-variable cost ratio. This results in organizational sclerosis, as complexity grows with the paradox of added fragility (risk).

    Furthermore, divestitures, mergers and acquisitions create havoc for ERP operations, as ERP systems must be reconfigured and restructured to conform to new legal structures and business model mandates. Multi-site SAP users typically also have major data stewardship issues in standardizing master data and leveraging the maximum cost benefits of a single SAP system structure within a "one for all, all for one" common infrastructure.

    With increasingly interdependent systems, organizations must measure and monitor systemic fragility (SOS risks) and each component's relative contribution to complexity.

    By simulating test data, managers can interpret warning signals (i.e. "fire drills") and correct governance before the window for fixing the problem closes (OCAP).

    Human Limitation: 7 Parameters versus 1 Million

    Trained experts can handle up to 7 different domains, with only one parameter per domain; governing a complex, high-velocity enterprise quickly reaches this human limitation. The risk of systemic loss of control demands efficient SOS (systems of systems) monitoring. In addition, organizations are being flattened into stressful matrix organizations or dynamic, rudderless peer-to-peer (P2P) networks.

    Ontonix can handle up to 1 million parameters; a few hundred or a few thousand typically does the job for most clients.

    Continuous Agile Restructuring – the new ERP norm?

    This new context mandates continuous, agile restructuring of all enterprise processes (HR, R&D, SCM, CRM…): a never-ending transformation in which the HR and subcontractor functions typically "explode". Enterprises must continue to rely on humans despite the pressure of added agility to always perform better, faster and cheaper. We help humans perform better and faster, at less cost.

    Looking for “ERP and Enterprise Rescue”?

    Ontonix and West Trax are both your Swiss Army Knife and your Saint Bernard enterprise rescue dogs: your partners for a faster, more surgically precise and less risky enterprise ROI transformation.

    Written by:

    Alexander Kopriwa, VP, Ontonix (http://www.ontonix.com)

    Emil Bohr, CEO, West Trax (http://www.westtrax.com)


  3. A comparison between the old theory of risk and rating and quantitative complexity analysis
    The concepts of risk and uncertainty were defined in the economic sphere by Knight, who distinguished between measurable uncertainty, or risk proper, and unmeasurable uncertainty; the word uncertainty is reserved for something non-quantifiable. Mathematical risk is objective and measured through some statistical variable; in the economic world, the probability of default is typically used. Cohen added the concept of subjective risk: a risk that is measured, but whose probabilities are chosen subjectively. Objective risk should be obtained from physical parameters or from a set of historical data, whereas subjective risk is established without an objective basis for assigning probabilities.

    For instance, if we roll a die, the probability of losing when betting on a fixed value can be obtained from the physical properties of the die. A die has six faces; if it was manufactured homogeneously, the probability of success is 1/6 and the probability of failure is 5/6. Sometimes we do not have such an easy physical description of the problem, but we can build a model from a set of historical data. If we roll the die a large number of times and record the results, we can estimate the probability of failure as the number of failures over the number of attempts. For a sufficiently large sample, the result will be very close to the theoretical value of 5/6.
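    A quick simulation makes the convergence claim concrete (a minimal sketch; the chosen face and the number of rolls are arbitrary):

    ```python
    # Estimate the probability of "failure" (not rolling the face we bet on)
    # from simulated historical data; it converges to the theoretical 5/6.
    import random

    def estimate_failure_probability(bet_face=3, rolls=100_000):
        failures = sum(1 for _ in range(rolls) if random.randint(1, 6) != bet_face)
        return failures / rolls

    print(estimate_failure_probability())  # ~0.833, close to 5/6
    ```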

    Uncertainty can be thought of as something to which a probability cannot be assigned, even though we may know all the possible results. If we know that the die was not manufactured homogeneously and we have no historical data, we cannot assign probabilities to each possible result, although we know there are six of them.

    When analyzing the risk of an investment, we usually handle two kinds of variables: those that can be characterized and included in a model, and those that are not characterized within that model. The former fall under the concept of risk, the latter under the concept of uncertainty. The risk is objective if we can properly define the probability of every possible result, and subjective if we cannot, and instead assign those probabilities subjectively.

    This kind of risk analysis rests on two assumptions: the risk is modelled from the past, assuming the future will be equivalent, and the uncertainty is considered negligible.

    Regarding the first hypothesis, we cannot assume that an investment's past probability of yielding a profit will hold in the future, because businesses and markets evolve through time. Risk is never objective; it is always subjective, because we are subjectively assuming that the future will resemble the past, and this cannot be guaranteed, especially if we enter an unstable global scenario.

    The second hypothesis is not valid when complexity increases. In this case, uncertainty can easily move from one point of the system to another, producing the typical unexpected behavior that invalidates our model, because uncertainty is, by its very nature, unpredictable.

    If we cannot assume the first hypothesis, we cannot trust our model, and model-free techniques offer an advantage over the classical way of analyzing risk; if we cannot assume the second, we need additional techniques that allow uncertainty to be considered within the analysis.

    It is as we move towards a global scenario, and into turbulent markets, that the concepts of complexity and fragility begin to show themselves to be more appropriate ways of analyzing risk than classical techniques.

    If we want to analyze the risk of a share on the stock exchange, the concept of systematic risk relates to the fluctuations of the market, while specific risk is linked to the inherent value of the share. Following Sharpe's market model, the two components together give the total risk of the asset. Modern complexity analysis, by contrast, defines fragility as the complexity of the system multiplied by the uncertainty of the environment. I will try to show you the differences.
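    In compact form (the notation below is an assumed summary of the two views, not taken verbatim from either theory's literature):

    ```latex
    % Sharpe's single-index market model: the total risk of asset i splits
    % into a systematic (market) component and a specific one.
    \sigma_i^2 = \beta_i^2 \sigma_m^2 + \sigma_{\varepsilon_i}^2

    % Quantitative complexity analysis: fragility as the product of the
    % system's complexity and the uncertainty of its environment.
    F = C \cdot U
    ```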

    Sharpe considers market fluctuation a risk; that is, something that follows an objective or subjective probability distribution (a model) and therefore cannot include the uncertainty of the environment.

    Quantitative complexity analysis techniques consider complexity a function of the structure and the internal uncertainty (due to both characterized, known internal variables and uncharacterized ones). To make an analysis similar to the classical one, you should treat the market as part of the system. In other words, just as classical risk analysis uses a market index to analyze systematic risk, you can include the market index as an exogenous variable of the system under analysis. The analysis is model-free, but it provides information about how the effect of market fluctuations is transmitted to the other internal variables of the analyzed business.
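    A minimal, model-free sketch of this idea (the variable names are invented, and plain correlation stands in here for Ontonix's proprietary QCM coupling measure):

    ```python
    # Treat an exogenous market index as one more observed variable and
    # check how strongly it couples to internal business variables.
    import numpy as np

    rng = np.random.default_rng(0)
    market = rng.normal(size=250)                            # market index
    sales = 0.7 * market + rng.normal(scale=0.5, size=250)   # market-coupled
    costs = rng.normal(size=250)                             # largely independent

    for name, series in {"sales": sales, "costs": costs}.items():
        r = np.corrcoef(market, series)[0, 1]
        print(f"market -> {name}: correlation {r:+.2f}")
    # A high |r| flags a channel through which market fluctuations are
    # transmitted to that internal variable.
    ```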

    It is important to realize that in quantitative complexity analysis, uncertainty is uncharacterized, as in the classical theory, but, unlike in the classical theory, it is measurable. Uncertainty is due to unknown or unmodelled variables, yet it is analyzed quantitatively through entropy. This is a huge step forward for analyzing strategies and investment decisions when uncertainty cannot be considered negligible.
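    For instance, the Shannon entropy of a histogrammed variable gives one concrete, quantitative handle on such residual uncertainty (a minimal sketch; the bin count is arbitrary, and this is a generic estimator rather than Ontonix's specific formulation):

    ```python
    # Shannon entropy of a binned variable: a data-driven measure of how
    # much uncertainty the variable carries.
    import numpy as np

    def shannon_entropy(samples, bins=10):
        counts, _ = np.histogram(samples, bins=bins)
        p = counts / counts.sum()
        p = p[p > 0]                            # drop empty bins
        return float(-(p * np.log2(p)).sum())   # entropy in bits

    rng = np.random.default_rng(1)
    steady = rng.normal(0.0, 0.1, size=1000)     # well-behaved variable
    erratic = rng.uniform(-3.0, 3.0, size=1000)  # erratic variable
    print(shannon_entropy(steady), shannon_entropy(erratic))
    # The erratic series yields the higher entropy: more uncertainty.
    ```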

    Business Complexity

