Charan/ March 5, 2024/ Agile/ 0 comments

The ability to accurately forecast and meet product release dates and project delivery deadlines remains a vexing problem in many organizations. Traditional management approaches often rely on deterministic models, which fail to account for the complexity and uncertainty inherent in modern product development. These methods can lead to overly optimistic timelines, false assumptions, underestimated risks, and inflexibility, resulting in delayed launches, budget overruns, and diminished stakeholder confidence. Agile practices, on the other hand, are more often than not shoe-horned in. They may also run afoul of established governance practices, leading to the adoption of the same bad behaviours that plague traditional approaches. Consistently creating good forecasts and meeting them is a key operational competitive advantage; in the current macroeconomic environment, it can make the difference between a fantastic year and a poor one.

The Pitfalls of Forecasting

At its core, forecasting represents a sophisticated optimization challenge: minimizing schedule subject to budgetary constraints. This equation depends on a myriad of variables, including understanding and prioritization of work, order of execution, allocation of skilled personnel, and the ever-present spectre of risk. Despite the historical use of optimization tools such as MS Project, CAPPM, and the like, forecasting accuracy remains abysmal.

Agile, while attempting to be nimble, often side-steps the intricate dance of optimization in favour of adaptability and speed. Practiced well, this speed and adaptability delivers strategic benefits, as modern technology organizations demonstrate. Enterprise IT organizations, however, in attempting to reinvent themselves, frequently fail to become good at either.

The quandary of slipping deadlines often prompts introspection on estimation accuracy, and a common reflex is to double down on improving it. The pursuit of more “accurate estimations” often manifests as added buffers, inadvertently entangling us in the human psychology of work (Parkinson’s Law and the Student Syndrome), where deadlines stretch to fill the time allotted and only urgency begets action, paradoxically inviting delays.

Deeper reflection, however, unveils a more promising path forward: an approach firmly rooted in the time-tested and rigorous disciplines of mathematics and statistics, aimed squarely at improving forecasting and predictive accuracy.

Statistical Approach: A Strategic Overview

At the heart of a statistical approach to forecasting lies the capability to generate a wide array of potential outcomes, each reflecting the complex web of uncertainties that characterizes any project execution or product development life cycle. This method elevates forecasting from an intuitive art to a precise science, offering a broad spectrum of possible delivery dates that account for the critical variables, from supply chain disruptions to unforeseen technical challenges. It achieves this through iterative, large-scale modelling of historical completion rates against future project scope and tasks. If Monte Carlo Simulation springs to mind as a prime example of this approach, your intuition is spot-on.
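
To make the idea concrete, here is a minimal sketch of a Monte Carlo delivery forecast. It repeatedly samples a week's throughput from historical observations (with replacement) and counts how many weeks each simulated future takes to clear the backlog. The throughput history and backlog size below are illustrative, not real project data:

```python
import random

def simulate_weeks_to_finish(history, backlog, trials=10_000, seed=42):
    """Return simulated week counts needed to clear `backlog` work items,
    sampling each week's throughput from the historical observations."""
    rng = random.Random(seed)
    results = []
    for _ in range(trials):
        remaining, weeks = backlog, 0
        while remaining > 0:
            remaining -= rng.choice(history)  # one simulated week of completions
            weeks += 1
        results.append(weeks)
    return results

# Hypothetical history: items completed in each of the last 8 weeks.
weekly_throughput = [3, 5, 4, 2, 6, 4, 3, 5]
outcomes = simulate_weeks_to_finish(weekly_throughput, backlog=40)
```

Each entry in `outcomes` is one plausible future; the spread of those entries, rather than a single point estimate, is what the forecast is built on.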

Operationalizing The Strategy

To translate this strategic forecasting approach into actionable outcomes, several pivotal steps must be undertaken, each aimed not only at refining forecast accuracy but also at making the innovation stick:

  1. Executive Acknowledgment: The foundational step involves recognizing, at the executive and management levels, that the current state of forecasting accuracy is below par and must improve to meet strategic objectives. Without acknowledging this need, the impetus for change is absent and the status quo remains unchallenged.
  2. Team Selection: Identify one or two teams with a keen appetite for innovation and improvement. These teams will be crucial in pioneering the adoption of the new approach.
  3. Data Integrity: Prioritize the collection of reliable data. Implementing a new tool with built-in Monte Carlo analysis may not be feasible; instead, focus on leveraging existing tools to accurately timestamp work status changes and completion dates, and ensure change logs are accessible. Garbage in, garbage out.
  4. Automation: Although I have personally leveraged both Excel and Python to develop the simulations, Python is my preferred option. However, any automation capable of interfacing with your data through APIs and processing it will suffice. Automation is advisable over manual methods for efficiency and scalability; care must be taken to spend the least amount of time on non-value-add (but needed) activities.
  5. Simulation Preparation: Begin by amassing historical data on cycle times, alongside calculating throughput rates (completed work items over a chosen time period). Factor in the number of work items still in the pipeline, adjusting for anticipated scope growth, especially in agile environments. A dataset covering 6-8 weeks, with around 30-60 completed items, sets a robust stage for initial modelling; however, modelling can start with as little as 4 weeks of completed items.
  6. The Simulation: This is where strategic foresight is cultivated. The simulation systematically explores the myriad possible outcomes across the entire probability spectrum, allowing timelines to be delineated with varying degrees of aggressiveness across the confidence spectrum. Opting for a lower confidence level paves the way for more aggressive, albeit riskier, timelines; conversely, a higher confidence level helps pinpoint the most probable timelines, offering a balanced view between ambition and realism.
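
The confidence spectrum described in step 6 can be sketched as a percentile lookup over simulated outcomes: sort the trial results, then read off the week count at the chosen confidence level and convert it to a date. The trial results and start date below are hypothetical placeholders for a real simulation run:

```python
from datetime import date, timedelta

def forecast_date(outcomes, start, confidence):
    """Date by which `confidence` fraction of simulated trials have finished."""
    weeks = sorted(outcomes)
    idx = min(len(weeks) - 1, int(confidence * len(weeks)))
    return start + timedelta(weeks=weeks[idx])

# Hypothetical simulated weeks-to-complete from a Monte Carlo run.
outcomes = [8, 9, 9, 10, 10, 10, 11, 11, 12, 14]
start = date(2024, 7, 1)

p50 = forecast_date(outcomes, start, 0.50)  # aggressive: even odds
p85 = forecast_date(outcomes, start, 0.85)  # conservative: higher confidence
```

The gap between `p50` and `p85` makes the ambition-versus-realism trade-off explicit: leadership chooses a confidence level rather than debating a single date.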

This statistical approach not only harnesses the power of historical data but also introduces a methodical way to navigate the inherent uncertainty in delivery timelines. Through the strategic application of Monte Carlo Simulation, executives can derive actionable insights, enabling the establishment of timelines that are both ambitious and anchored in statistical probability.

For example, consider the clarity that a data-driven approach can bring to progress updates:

“Other things being equal, and based on a rolling 6-week average completion rate of 4 work items/week, we have an 85% chance of achieving release/delivery between 8 Sept 2024 and 1 Oct 2024. Two key factors are impacting our weekly throughput: ambiguity in scope and acceptance criteria, and reduced allocation of key team members’ time to the product. Here are the actionable items that need executive attention.”

This level of precision not only enhances your ability to manage expectations internally and externally, but also demonstrates a commitment to operational excellence and strategic foresight. It invites further data-driven, logical discussion of the factors impacting model inputs and of ways to reduce variation.
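
The rolling completion rate quoted in such an update falls straight out of the timestamped completion data collected in the data-integrity step. A sketch, assuming completion dates have already been extracted from the tool's change log (the sample dates are fabricated for illustration):

```python
from datetime import date, timedelta

def rolling_weekly_throughput(completed_dates, as_of, window_weeks=6):
    """Average items completed per week over the trailing window."""
    cutoff = as_of - timedelta(weeks=window_weeks)
    recent = [d for d in completed_dates if cutoff < d <= as_of]
    return len(recent) / window_weeks

# Illustrative data: one item completed every other day from 3 June 2024.
done = [date(2024, 6, 3) + timedelta(days=2 * i) for i in range(30)]
rate = rolling_weekly_throughput(done, as_of=date(2024, 7, 15))
```

Recomputing this rate each week, and feeding it back into the simulation, is what keeps the forecast honest as conditions change.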

By adopting advanced statistical methods for forecasting, such as the Monte Carlo Simulation, you empower your team with the insights needed to navigate uncertainties with confidence and precision.
