Agile Measurements: Burndown and Burnup Charts


This is a continuation of a discussion based on Chapter 5 (Implementing Agile) from the Agile Practice Guide, a publication jointly produced by the Project Management Institute and the Agile Alliance.

In the previous posts, I compared measurement of a project’s progress in a traditional vs. an agile environment.    A traditional measurement system such as Earned Value Management measures work completed; an agile measurement system measures work completed and delivered to a customer.    So it also covers what, in a traditional project management scheme, would be handled by the following processes:

  • Process 8.3–Control Quality (internally verifying that the work conforms to the quality standards for the project)
  • Process 5.5–Validate Scope (externally validating with the customer that the work conforms to the requirements for the project)

In other words, you measure whether the work is complete in a traditional measurement system; you measure whether the work is correct in an agile measurement system.

Two of the common systems for measuring progress on a project in an agile environment are a burndown chart and a burnup chart.   The raw data for both types of charts is the same: the number of story points.   The number of story points is an estimate of the effort required to complete a particular user story (feature) from the feature backlog.   The difference between the two types of chart is simple:

  • a burndown chart starts with the total amount of story points expected to be completed in the course of a project, and as each user story is completed, the chart shows the remaining story points
  • a burnup chart starts at zero, and as each user story is completed, the chart shows the completed story points

At the end of the project, a burndown chart will show zero remaining story points, and the burnup chart will show the number of story points completed for the whole project.  For examples of these graphs, see Figure 5.1 on p. 62 and Figure 5.2 on p. 63 of the Agile Practice Guide.
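As a minimal sketch of how the same raw data yields both charts (the story-point numbers below are invented for illustration, not taken from the Guide):

```python
# Minimal sketch: derive burndown and burnup series from the same raw data.
# The story-point values here are illustrative, not from the Agile Practice Guide.

def burn_charts(total_points, completed_per_iteration):
    """Return (burndown, burnup) series, one value per iteration end."""
    burndown, burnup = [], []
    done = 0
    for completed in completed_per_iteration:
        done += completed
        burnup.append(done)                   # burnup: cumulative points completed
        burndown.append(total_points - done)  # burndown: points remaining
    return burndown, burnup

down, up = burn_charts(200, [30, 40, 50, 40, 40])
print(down)  # [170, 130, 80, 40, 0]
print(up)    # [30, 70, 120, 160, 200]
```

Note that the burndown series ends at zero and the burnup series ends at the project total, exactly as described above.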

In a traditional project environment, the baseline may change if there are any changes to the scope during the course of the project.   Similarly, in an agile environment, if the scope changes during the iteration, the burnup or burndown charts will also change to accommodate those changes.

In agile, the velocity is the sum of the story-point sizes for the features actually completed in the current iteration.    It is useful because, if the project continues at the current velocity of work production, you can take the number of remaining story points and divide by the velocity (the number of story points done per iteration) to get the number of iterations it will take to complete the project.
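The velocity calculation described above can be sketched as follows (the story sizes and remaining total are hypothetical):

```python
# Sketch of the velocity calculation described above; the story data is hypothetical.

def velocity(completed_story_points):
    """Velocity = sum of the story-point sizes actually completed this iteration."""
    return sum(completed_story_points)

def iterations_remaining(remaining_points, current_velocity):
    """How many more iterations at the current velocity, rounded up."""
    return -(-remaining_points // current_velocity)  # ceiling division

v = velocity([5, 8, 3, 13])          # story-point sizes of features finished this iteration
print(v)                             # 29
print(iterations_remaining(145, v))  # 5 iterations left at this velocity
```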

It may take four to eight iterations to achieve a stable velocity.   After that, if the velocity changes, it will have a direct effect on the length of time it may take to complete the project, so it is important to measure velocity and compare it to the velocity of previous iterations so you can see whether there are any changes (up or down).

Now, the description above applies to iteration-based agile measurement.   Flow-based agile measurement does not use the regular cadence of an iteration; it is based instead on the completion of a particular user story or feature.   The next post will go into further detail on the flow-based agile measurements of cycle time and lead time.

 

Measurements in Agile vs. Predictive Projects (3)–Agile Measurement and Estimation


This post will conclude my comparison of measurement of progress on projects in a traditional environment (using earned value analysis–contained in the first post) with measurement on projects in an agile environment (covered in the second post and this one).

Summing up the last two posts, as opposed to a predictive measurement system like earned value analysis that focuses on the completeness of the work, an agile measurement system will have the following characteristics:

  • It will focus on customer value added.
  • It will focus on quality (the correctness of the work), so that a feature is considered finished not when the team has done the work, but after the team has tested it and the customer approves.

Here are some additional features of agile measurement of progress on a project.

  • The chunks of work being measured are made smaller, so that the team is more likely to deliver them.
  • Product development involves a learning curve as well as delivery of value to a customer.   Keeping the work increments small allows for more feedback from the customer, which loops back to the team and helps them improve the next work increment.
  • Rather than trying for a heroic pace to get done as quickly as possible, a steady pace is preferred that allows enough time to get the work done correctly.

The “steady pace” referred to in the last point above is important for the purpose of estimation.   A sponsor who wants to know when a project will be completed will be best served by a steady pace of work, because this will allow a simple calculation:

the number of remaining story points divided by the number of story points done per iteration.   With 500 story points remaining and 50 done on average per iteration, you can tell the sponsor with confidence that you will be able to get the project done in 10 iterations.   If each iteration is two weeks, let’s say, then you can see it will be done in 20 weeks.
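That sponsor calculation, as a small sketch using the numbers from the example:

```python
# The sponsor calculation above, sketched with the numbers from the example.

def weeks_to_finish(remaining_points, points_per_iteration, weeks_per_iteration):
    iterations = remaining_points / points_per_iteration  # 500 / 50 = 10 iterations
    return iterations * weeks_per_iteration               # 10 * 2 = 20 weeks

print(weeks_to_finish(500, 50, 2))  # 20.0
```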

This post reviewed the characteristics of agile measurement and estimation.   The next posts will go into the details of this type of measurement based on the material on pages 62-70 of the Agile Practice Guide.

Measurements in Agile vs. Predictive Projects (2)–Agile Measurement


I’m going through the Agile Practice Guide in order to understand its contents by adding context to the material in the guide.

Section 5.4 of chapter 5 deals with measurements in agile projects.   To give context to the discussion on p. 60 of the Guide, I did the previous post comparing agile measurement to measurement in a traditional project management setting.

There the measurements are typically done with a tool called earned value analysis.   This takes the following three building blocks of measurement …

  • Planned value (PV)–this is the authorized budget assigned to the work that is scheduled to be done.   (The key word there is “scheduled”.)
  • Earned value (EV)–this is the measure of work actually performed.   (The key words there are “actually performed”–it is a measure of how much scope was accomplished.)  This is expressed in terms of the budget authorized for that work.
  • Actual cost (AC)–this is the measure of the actual cost for work actually performed.  (The key word is obviously “cost”.)

… and combines them into two kinds of formulas.   Some give a snapshot of how the project is actually doing compared to the plan (both schedule and budget):   these are status measurements.   Other formulas predict whether the project will end up being on schedule and within the budget if the current trends continue.    These are predictive measurements.

The problem with earned value analysis is that it is precise, but can be inaccurate if the assumptions underlying the data turn out not to be true.   For example, let’s say you have a work package that was supposed to be completed by a certain date, and it actually is completed by that time and for the budgeted amount of money.    The schedule performance index and cost performance index for that work package should turn out to be 1.0.

But this focuses on the completeness of the work, which is the domain of scope; what about the correctness of the work, which is the domain of quality?   If internal testing is done on the completed unit of work, followed by integration testing of how the system works with the newly-installed unit, it might turn out that it doesn’t work as expected according to the definition of “done” (the acceptance criteria).   Then the work has to be re-done, and the testing as well.   That’s one problem.

Another problem is where you hand the finished module or unit over to the customer for their inspection.   They say “that’s not what we ordered.”   The acceptance criteria are vague enough that there is a difference between what the customer intended and what the team thought it was doing to meet that requirement.

So, as opposed to a predictive measurement system like earned value analysis that focuses on the completeness of the work, an agile measurement system will have the following characteristics:

  • It will focus on customer value added.
  • It will focus on quality (the correctness of the work), so that a feature is considered finished not when the team has done the work, but after the team has tested it and the customer approves.

There are additional features of agile measurement discussed on p. 61; these will be the subject of the next post.

Measurements in Agile vs. Predictive Projects (1)–Earned Value Analysis


I am going through the Agile Practice Guide chapter by chapter, section by section, page by page, in order to understand its contents.   I am blogging about what I’ve read in order to add value for those who may be coming from a traditional project management background.

So, for example, I am starting section 5.4 on p. 60 which deals with measurements in agile projects.   How can you tell whether your project is on track for success or not?

Before I discuss the introduction to this section on p. 60, however, I want in this post to discuss how measurement is done on a traditional project, which usually has a predictive life cycle.   In this way, the contrast to how measurement is done on an agile project can be better appreciated in context.

Let’s review what the characteristics of a predictive life cycle are as compared to the other three types of project life cycle (iterative, incremental, and agile):

  • Requirements–these are fixed at the beginning of the project and change is managed as if it were a necessary evil (the other three life cycles have dynamic requirements and change is managed as if it were a positive good)
  • Activities–these are performed once for the entire project (in incremental, they are performed once for a given increment, and in iterative and agile they are repeated until correct)
  • Delivery–there is a single delivery of the final product (this is true in iterative as well, but in incremental and agile there are frequent smaller deliveries during the course of the project)
  • Main goal–to manage cost (in iterative the main goal is the correctness of the solution, in incremental it is speed, and in agile, it is customer value)

The main characteristic to focus on in our discussion of measurement is the first one, that of fixed requirements.   This allows you to break down the work with a tool called the Work Breakdown Structure.   This is then used to create both the schedule and the budget.   These in turn are used as inputs to a data analysis technique called Earned Value Analysis or EVA.   Remember, the goal of measurement is to

  1. compare the actual work done vs. what was supposed to be done and see if there is a variance (I call this the “snapshot” function of measurement)
  2. predict what resources will be required by the end of the project to complete it given the trends of those variances discovered (I call this the “crystal ball” function of measurement.)

Okay, let’s go back to how EVA works.   The three building blocks of formulas dealing with variance are the following, which are based on the triple constraints of schedule, scope and cost, respectively.

  • Planned value (PV)–this is the authorized budget assigned to the work that is scheduled to be done.   (The key word there is “scheduled”.)
  • Earned value (EV)–this is the measure of work actually performed.   (The key words there are “actually performed”–it is a measure of how much scope was accomplished.)  This is expressed in terms of the budget authorized for that work.
  • Actual cost (AC)–this is the measure of the actual cost for work actually performed.  (The key word is obviously “cost”.)

Let’s take a simple example to show how these are used.   Let’s say your project is to have a room painted, assuming that all of the preparation work has already been completed.  Each of the four walls of the room takes one day to paint, at a cost of $1,000 per wall.

Scenario one:   At the end of day 2, you only have one wall painted.   Your gut feeling is that the project is behind.   What does EVA tell you?

The schedule performance index (SPI) is EV/PV.   What is EV?   It is the end of day 2, and you actually did only one wall, which costs $1,000 to do.  So your EV is $1,000.   What is the planned value?   Although you did one wall, you were supposed to (according to the schedule) do two walls, which would cost you $2,000 to do.  So your PV is $2,000.   Plugging these values in the formula, you get 0.5 as the result.   If your result was 1.0, you would be on schedule.   If your result is greater than 1.0, you are ahead of schedule, and if your result is less than 1.0, you are behind schedule.

The cost performance index (CPI) is EV/AC.   Let’s say it is the end of day 2, and you were able to complete two walls, but on day 2 you realized you needed to add an extra painter (assuming you planned for two painters costing $500 each for labor and materials, but had to add a third painter in order to get the work done on time).   The SPI is going to be 1.0, because you are on schedule.   But what about the CPI?   The EV is $2,000, because two walls were done, and the budget authorized for doing two walls is $2,000.   The AC, however, doesn’t care about the budget authorized for the work; it cares about what the actual costs of the work turned out to be.   This is $1,000 for day 1, but $1,500 for day 2 because of the extra painter, so the cumulative AC is $2,500.   Now using the CPI formula you get $2,000/$2,500 or 0.80.   Anything less than 1.0 with either the SPI or CPI is not good.   In this case, it means you are over budget.
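Both scenarios can be checked with a short sketch of the SPI and CPI formulas, using the dollar figures from the painting example:

```python
# Both scenarios from the painting example, using the EVA formulas SPI = EV/PV
# and CPI = EV/AC.  The dollar figures come from the example in the text.

def spi(ev, pv):
    return ev / pv

def cpi(ev, ac):
    return ev / ac

WALL_BUDGET = 1_000  # authorized budget per wall, per the example

# Scenario one: end of day 2, one wall done but two were scheduled.
print(spi(ev=1 * WALL_BUDGET, pv=2 * WALL_BUDGET))  # 0.5 -> behind schedule

# Scenario two: two walls done on schedule, but day 2 needed a third painter.
actual_cost = 1_000 + 1_500  # day 1 as planned, day 2 with the extra painter
print(spi(ev=2 * WALL_BUDGET, pv=2 * WALL_BUDGET))  # 1.0 -> on schedule
print(cpi(ev=2 * WALL_BUDGET, ac=actual_cost))      # 0.8 -> over budget
```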

So you can see how this works.   Because the schedule and budget are, at any one time, fixed, you can use EVA to create a measurement which is precise.   But is it accurate?  Let’s review what these two words imply.   A measurement is precise if repeated measurements are close together.   A measurement is accurate if the measurements are close to the actual value that is being measured.

If I am at a pub playing darts, my precision will decrease as the amount of beer I have increases.   The fine motor control needed to throw the dart exactly where I want is affected by the alcohol, and I end up throwing more wildly as time goes on.   Accuracy, on the other hand, increases the more times I play in the pub, because my brain learns and I hit the bulls-eye more often.

The fact that EVA measurements are precise can fool you into thinking they are accurate.  If someone says they are 90% completed with their work, you can use this data to say that the SPI is 0.9.   But what if their own self-assessment is wrong?  Or what if they don’t want to admit that they aren’t as far along as they should be?   If someone tells me that the completion of their work is “just over the horizon”, that might make me feel better–until I look up the word “horizon” in the dictionary and see that one of the definitions is “an imaginary line that gets farther away from you the closer you approach it.”   Well, that’s not comforting at all, is it?

And it gets worse.   Let’s say that the person IS actually telling the truth and that the work is 90% done.   They turn in the work and then the testing is done, first of the module itself and then of the entire system with the module added (an integration test).   It doesn’t work as it is supposed to, and it has to be reworked.

Okay now it is re-tested and it works fine.   It is delivered to the customer and the customer says “that’s not what I ordered.”   The user story was not clear or objective enough, and what the customer thought was being ordered isn’t what the team thought.  This is why you don’t use subjective acceptance criteria like “I want the product to look nice” because there’s a LOT of room for disagreement about what “looks nice” means.

So even if, from your standpoint, you are within the budget and/or on schedule, in the end it doesn’t matter if the work that was done does not add value from the customer’s standpoint.

So, as opposed to earned value analysis, which is a more precise measurement tool but may have problems with accuracy, measurement in agile projects uses tools that may not seem precise at first glance from the standpoint of traditional project management, because they deal with units (user story points) that seem more subjective than dollars and cents.   But they are more accurate, in that they are focused on actual customer value.

With that background of measurement in traditional project management in mind, let’s turn in the next post to characteristics of measurements in agile projects.

Troubleshooting Agile Project Challenges (5)–Testing Implementation


On pages 58 and 59 of the Agile Practice Guide, there are twenty-one challenges or “pain points” described together with the suggested solution(s) to the problem.   However, they are listed in a random, laundry-list fashion without much rhyme or reason to the order.  So what I have done is reviewed all the suggested solutions and grouped those challenges that require the same type of solution.   These five types of solution are:

  1. Production of agile charter
  2. Product backlog/user story definition
  3. Kanban boards
  4. Focus on team roles/responsibilities
  5. Testing implementation

I have already covered the first three solutions, production of an agile charter, production and maintenance of a product backlog, and the use of kanban boards, in the past three posts.

The production of the agile charter was essential for challenges dealing with the context of the project (what is the product vision?, what is the organization’s mission for doing the project?, etc.).

The production, refinement, and maintenance of the product backlog helped with challenges dealing with the requirements of the product and how these are reflected in the user stories that make up the product backlog.   It also showed how to adjust the user stories that comprise that backlog in order that the team not bite off more than it can chew during any one iteration.

And the challenges dealing with the process of the project, i.e., doing the work and then reviewing it during retrospectives, were met by the use of kanban boards to visualize those processes, which then in turn makes it easier to pinpoint bottlenecks or barriers so they can be focused on and removed.

The fourth series of challenges dealt with problems that could be solved by clarifying the roles of individual team members (having them work on cross-functional vs. siloed teams, for example), as well as the critical roles of product owner and servant leader.

Finally I’m covering the last two challenges (out of the total 21 challenges presented on pages 58 and 59).   These have the common solution of paying attention to the implementation of testing after the design of a particular feature is complete.    Previous solutions have focused on the completeness of the work, which in traditional project management terms would be the Scope Management domain; this solution set is focused on the correctness of the work, which in traditional project management terms would be the Quality Management domain.   Specifically, it deals with quality control, the correctness of the product itself (does it meet the expectations of the customer?).   The other aspect of quality, quality assurance, deals with the correctness of the process, and this is generally taken care of during the retrospectives at the end of each iteration (the use of kanban boards to visualize this process is helpful in this regard).

Now, here are the two remaining challenges or “pain points” that the Agile Practice Guide discusses in its chart.

  1. Defects–focus on technical processes using techniques such as:
    • Working in pairs or groups
    • Collective product ownership
    • Pervasive testing (including test-driven and automated testing approaches)
    • A robust definition of “done”, i.e., acceptance criteria
  2. Technical debt (degraded code quality)–like the response to the challenge listed above, focus on technical processes using techniques such as:
    • Code refactoring–the process of clarifying and simplifying the design of existing code, without changing its behavior.   This is needed because agile teams often maintain and extend their code from iteration to iteration, and without continuous refactoring, this is difficult to do without adding unnecessary complexity to the code (which in itself can increase risk of defects).
    • Agile modeling–a methodology for modeling and documenting software systems based on best practice.
    • Pervasive testing–involves the cross-functional participation of all quality and testing stakeholders, both developers and testers
    • Automated code quality analysis–checks source code for compliance with a predefined set of rules or best practices.
    • Definition of done–acceptance criteria that a software product must satisfy in order to be accepted by the user or customer.   This prevents features that don’t meet the definition from being delivered to the customer or user.
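To illustrate how a definition of “done” can be made objective, here is a hypothetical sketch in which acceptance criteria for an invented login feature are expressed as automated checks (the feature, the names, and the criteria are all made up for illustration; nothing here comes from the Guide):

```python
# Hypothetical illustration: a definition of "done" for an invented login feature,
# expressed as automated acceptance checks rather than prose.

def login(username, password):
    # Stand-in implementation so the checks below can run.
    valid = {"alice": "s3cret"}
    return valid.get(username) == password

# Definition of done: these must all pass before the story is "done".
assert login("alice", "s3cret") is True     # valid credentials accepted
assert login("alice", "wrong") is False     # wrong password rejected
assert login("mallory", "s3cret") is False  # unknown user rejected
print("definition of done satisfied")
```

Checks like these leave far less room for disagreement than subjective criteria such as “the login should work nicely.”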

This concludes this series of posts on the various challenges that may be encountered on an agile project and the various solutions that can be referred to as troubleshooting possibilities for those challenges.

The next section of this chapter, section 5.4, deals with measurements in agile projects.  This replaces the earned value analysis used on traditional project management projects, and will be the subject of the next series of posts …


Troubleshooting Agile Project Challenges (4)–Clarifying Team Roles and Responsibilities


On pages 58 and 59 of the Agile Practice Guide, there are twenty-one challenges or “pain points” described together with the suggested solution(s) to the problem.   However, they are listed in a random, laundry-list fashion without much rhyme or reason to the order.  So what I have done is reviewed all the suggested solutions and grouped those challenges that require the same type of solution.   These five types of solution are:

  1. Production of agile charter
  2. Product backlog/user story definition
  3. Kanban boards
  4. Focus on team roles/responsibilities
  5. Testing implementation

I have already covered the first three solutions, production of an agile charter, production and maintenance of a product backlog, and the use of kanban boards, in the past three posts.

The production of the agile charter was essential for challenges dealing with the context of the project (what is the product vision?, what is the organization’s mission for doing the project?, etc.).

The production, refinement, and maintenance of the product backlog helped with challenges dealing with the requirements of the product and how these are reflected in the user stories that make up the product backlog.   It also showed how to adjust the user stories that comprise that backlog in order that the team not bite off more than it can chew during any one iteration.

And the challenges dealing with the process of the project, i.e., doing the work and then reviewing it during retrospectives, were met by the use of kanban boards to visualize those processes, which then in turn makes it easier to pinpoint bottlenecks or barriers so they can be focused on and removed.

Today I’m covering the fourth series of challenges, which can be met by using a common solution, namely, focusing on and clarifying the team’s roles and responsibilities, especially those of the product owner and the servant leader (whatever title that person goes by, such as scrum master, project manager, etc.).

Here are the five challenges that can be met by focusing on the team’s roles and responsibilities.

  1. Team struggles with obstacles–The servant leader should be the one focusing on and clearing those obstacles.   The servant leader should create options for the team to choose among.   If the servant leader is unable for whatever reason (such as lack of experience) to remove the obstacles, consider escalating the problem by consulting with an agile coach.
  2. False starts, wasted efforts–This is usually caused by the team’s insufficient understanding of exactly what the project mission is (what exactly is it we are trying to produce).   The product owner needs to be an integral part of these team discussions so that he or she can communicate with the customer and clarify exactly what the requirements are.
  3. “Hurry up and wait”, i.e., an uneven flow of work–This is where the clarification of the roles and responsibilities of the individual team members is important.  Plan to the team’s work-in-progress (WIP) capacity and no more; even consider reducing that WIP capacity if necessary.   Team members should stop multitasking (i.e., working on other projects) and be dedicated to one team.   Have the team members consider working in pairs or even in groups to even out the capabilities across the entire team, and to increase communication between team members.
  4. Impossible stakeholder demands–The servant leader needs to work together with the product owner and stakeholders to clarify the obstacles in meeting the current demands.
  5. Siloed teams, instead of cross-functional teams–If you are getting team members from managers in a specific department who are comfortable working with each other but not necessarily with those from other departments, then the servant leader needs to educate the managers on why cross-functional teams are essential to the success of an agile project.    Ask the team members on the project to work together in pairs or even in groups with team members from other departments in order to create cross-functional teams.

You can see that this set of solutions involves both the strengthening of the roles and responsibilities of the leadership on an agile team (the servant leader and the product owner), and the clarification of the roles and responsibilities of team members (they need to stop multitasking and working alone and move towards inclusion of other team members from other departments on their team).

This will help the team create the work product, which addresses the scope of the project, i.e., the completeness of the work.   What about the correctness of that completed work, which is where quality control comes in?   For issues regarding this area, see the next and final post in this series on clarification of testing.

Troubleshooting Agile Project Challenges (3)–Using a Kanban Board


I am going over the Agile Practice Guide, a publication put out by the Agile Alliance in conjunction with the Project Management Institute.    I am currently reviewing chapter 5 on the implementation of agile projects, and am now on section 5.3, Troubleshooting Agile Project Challenges.

On pages 58 and 59 of the Agile Practice Guide, there are twenty-one challenges or “pain points” described together with the suggested solution(s) to the problem.   However, they are listed in a random, laundry-list fashion without much rhyme or reason to the order.  So what I have done is reviewed all the suggested solutions and grouped those challenges that require the same type of solution.   These five types of solution are:

  1. Production of agile charter
  2. Product backlog/user story definition
  3. Kanban boards
  4. Focus on team roles/responsibilities
  5. Testing implementation

I have covered the first two solutions, production of an agile charter and production and maintenance of a product backlog, in the past two posts.   The post about the product backlog dealt with challenges in creating the product; in this post about kanban boards, we will discuss those challenges that exist within the process itself.   It turns out that kanban boards are especially useful in dealing with this type of process challenge.

Here are the three (out of a total 21) challenges that can be met by using a kanban board.

  1. Unclear work assignments or work progress–Use kanban boards to visualize the flow of work.   Consider using the kanban board during daily stand-ups, where team members walk the board and see where the work they are doing stands.   This will clarify both their assignment and the next step they need to take in order to move the item they are working on to the next column.
  2. Unexpected or unforeseen delays–Ask the team to check the kanban boards more often.   Have them see the flow of work and WIP (work in progress) limits to understand how these impact the demands on the team and on the product itself.  Add a track to the kanban board for listing impediments and monitor impediment removal on a regular basis.
  3. Slow or no improvement in the teamwork process–Capture no more than three items to be improved at each retrospective.   Have the servant leader use the kanban board to track these three items, and then have the servant leader make sure that the improvements are integrated into the overall process.

The kanban board takes the dynamic processes of the project and captures a snapshot of where they all stand, so that the team members can clarify the nature of the work assignments, the impediments that prevent these work assignments from going forward, and any improvements in the process of getting those work assignments completed.
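A minimal sketch of a kanban board that enforces WIP limits may make this concrete (the column names and limits are illustrative; the Guide does not prescribe any implementation):

```python
# Sketch of a kanban board with WIP limits; columns and limits are illustrative.

class KanbanBoard:
    def __init__(self, wip_limits):
        self.wip_limits = wip_limits                 # column name -> max cards allowed
        self.columns = {col: [] for col in wip_limits}

    def move(self, card, src, dst):
        # Refuse the move if the destination column is at its WIP limit,
        # making the bottleneck visible instead of silently piling up work.
        if len(self.columns[dst]) >= self.wip_limits[dst]:
            raise RuntimeError(f"WIP limit reached in {dst!r}; clear the bottleneck first")
        self.columns[src].remove(card)
        self.columns[dst].append(card)

board = KanbanBoard({"to do": 10, "in progress": 2, "done": 100})
board.columns["to do"] = ["story A", "story B", "story C"]
board.move("story A", "to do", "in progress")
board.move("story B", "to do", "in progress")
# board.move("story C", "to do", "in progress")  # would raise: WIP limit reached
```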

The process of setting up a kanban board will be discussed in a later post.

Now let’s go on to the next post covering the last five challenges where clarification of team roles and responsibilities, including those of the product owner and servant leader, can help them get resolved.

Troubleshooting Agile Project Challenges (2)–Production of a Product Backlog


I am going over the Agile Practice Guide, a publication put out by the Agile Alliance in conjunction with the Project Management Institute.    I am currently reviewing chapter 5 on the implementation of agile projects, and am now on section 5.3, Troubleshooting Agile Project Challenges.

On pages 58 and 59 of the Agile Practice Guide, there are twenty-one challenges or “pain points” described together with the suggested solution(s) to the problem.   However, they are listed in a random, laundry-list fashion without much rhyme or reason to the order.  So what I have done is reviewed all the suggested solutions and grouped those challenges that require the same type of solution.   These five types of solution are:

  1. Production of agile charter
  2. Product backlog/user story definition
  3. Kanban boards
  4. Focus on team roles/responsibilities
  5. Testing implementation

This second group of solutions, which centers on the creation of a product roadmap culminating in a product backlog with user stories, covers one third of all the challenges or “pain points” listed by the Agile Practice Guide.

Let’s pull back for a minute and discuss what is involved with a product roadmap.   In an agile project, you start with the product vision, which is how the customer envisions how the product will look and perform.   This vision is of the finished product.   That is like saying you are going on a journey and stating what is at the final destination.   You then have to figure out how you are going to get from here, the starting point, to that final destination.   That is the product roadmap, where you start creating requirements or features that the final product should have.   For each requirement, the team and the product owner should discuss the expectations and value that the requirement represents for the customer.

The result should be a series of user stories which describe how the feature or requirement will fit into the overall final product.   The stories should be written as objectively as possible, so that the acceptance criteria for when a user story is “done” can be easily agreed upon by the customer and the team.

Once the user stories are clarified, and put into the product backlog (the equivalent of the “scope statement” in a traditional project), you can then move on to estimation of how long each story will take.   This takes the laundry list of user stories and ranks or prioritizes them according to their size (how long the completion of the user story will take) and their impact on the overall final product.   Once these factors are weighed, you can then figure out which user stories to work on first–hence the term product roadmap.
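The ranking step described above can be sketched in code.  This is only an illustrative sketch, not anything from the Agile Practice Guide: the story names, sizes, and values below are hypothetical, and the value-to-size ratio is just one common heuristic for ordering a backlog (teams may also weigh risk, dependencies, and other factors).

```python
from dataclasses import dataclass

@dataclass
class UserStory:
    name: str   # hypothetical story name
    size: int   # estimated effort in story points
    value: int  # relative value to the customer (e.g., 1-10)

def prioritize(backlog):
    """Rank stories so that high-value, low-effort work comes first.

    Sorting by the value/size ratio is one simple heuristic for
    turning a flat list of stories into a product roadmap.
    """
    return sorted(backlog, key=lambda s: s.value / s.size, reverse=True)

backlog = [
    UserStory("login page", size=3, value=8),     # ratio ~2.67
    UserStory("report export", size=8, value=5),  # ratio ~0.63
    UserStory("dark mode", size=5, value=2),      # ratio 0.40
]

for story in prioritize(backlog):
    print(story.name)  # login page, report export, dark mode
```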

Okay, so that’s how the process is supposed to go.   If any of the steps listed above is not done carefully, problems can arise.   The list below shows each challenge or pain point an agile team can experience, together with the recommended remedy involving the product backlog and the user stories it comprises.

  1. Poor user experience–the user needs to be involved in the process of creating the user stories that make up the product backlog
  2. Inaccurate estimation–reduce the story size by splitting up the user stories.  Use relative estimation involving the whole team in order to estimate the size of user stories.  Consider spiking (doing a focused study) to understand any story that is unclear.
  3. Work delays/cost overruns due to insufficiently refined product backlog items–the product owner and the team should work on user stories together.  Create a definition of “ready” (as in “ready to start work on”) for the user stories.  Consider splitting user stories up so that they are smaller.
  4. Work is not complete–Work on the definition of done (i.e., acceptance criteria) for user stories and for the project as a whole.
  5. Too much complexity–For each user story, encourage the team to focus on the question, “What is the simplest thing that would work?”   This applies the agile principle of simplicity, described as “the art of maximizing the amount of work NOT done.”
  6. Too much rework–measure the work in progress (WIP) at the beginning of the project.  Consider team spikes (doing a focused study) to learn and focus on adding value rather than perfecting the design.    Create a robust definition of done for user stories and shorten iterations if necessary.
  7. Inefficiently ordered product backlog items–Rank the user stories not just by their duration, but also by the value that they will contribute to the final product.

You can see that a common theme of these solutions is to make sure that the entire team, including the product owner, works on the creation of user stories, to reduce the complexity and/or size of the user stories where possible (especially at the beginning of the project), and to create acceptance criteria (i.e., the definition of “done”) for these stories.  If you incorporate these practices into the creation of the product backlog and the management of the user stories it comprises, many of the issues or challenges listed above can be avoided or at least minimized.
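The “definition of done” that recurs in these remedies can be made concrete: a story counts as complete only when every one of its acceptance criteria is met, with no partial credit.  A minimal sketch, in which the story and its criteria are hypothetical examples:

```python
# A user story is "done" only when ALL of its acceptance criteria pass.
# The story name and criteria below are hypothetical.
story = {
    "name": "password reset",
    "criteria": {
        "reset email sent within 60s": True,
        "link expires after 24h": True,
        "old password rejected": False,
    },
}

def is_done(story):
    """Binary check: a story with any unmet criterion is not done."""
    return all(story["criteria"].values())

print(is_done(story))  # False: one criterion is still unmet
```

Writing criteria this way forces them to be objective pass/fail statements, which is exactly what allows the customer and the team to agree easily on whether a story is done.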

The next set of solutions deals with the clarification not of the product (which is what the product backlog is for), but of the process, and Kanban boards are an excellent tool for dealing with challenges in this area.   These challenges addressed by Kanban boards will be the subject of the next post.

Troubleshooting Agile Project Challenges (1)–Production of an Agile Charter


I am going over the Agile Practice Guide, a publication put out by the Agile Alliance in conjunction with the Project Management Institute.    I am currently reviewing chapter 5 on the implementation of agile projects, and am now on section 5.3, Troubleshooting Agile Project Challenges.

On pages 58 and 59 of the Agile Practice Guide, twenty-one challenges or “pain points” are described together with the suggested solution(s) to each problem.   However, they are listed in a laundry-list fashion without much rhyme or reason to the order.  So what I have done is review all the suggested solutions and group together the challenges that call for the same type of solution.   The five types of solution are:

  1. Production of agile charter
  2. Product backlog/user story definition
  3. Kanban boards
  4. Focus on team roles/responsibilities
  5. Testing implementation

This is the first post, so I will cover the four challenges (out of the total 21) that require the production of an agile charter.

Here are the pain points and their suggested solutions.

  1. Unclear purpose or mission for the team–create an agile charter which includes the vision (why the product of the project is being produced and what need it fulfills) and the mission (what benefits your organization stands to gain by doing the project)
  2. Unclear working agreements for the team–create an agile charter that includes the values of the team, the principles, and the working agreements (how will meetings be conducted, how will people communicate with each other, etc.)
  3. Unclear team context–create an agile charter that explains the boundaries or constraints (e.g., is there a hard deadline for the project) and the committed assets (both physical resources and human resources).
  4. Unclear requirements–create an agile charter that crafts a product vision (as mentioned above, why the product of the project is being produced).   The team and the product owner (who looks out for the interests of the customers and stakeholders) should clarify the expectations for the product, and decompose them into a list of smaller, concrete requirements that can be tracked.

As you can see, going through the agile chartering process solves a number of problems down the line.    The biggest mistake people make when doing a project is rushing to the planning (design) phase before spending sufficient time in the initiating phase, which includes the production of a project charter.

In the next post, the next type of solution will be discussed:   paying attention to the product backlog and the definition of the user stories it contains.   This process solves the largest group of challenges or pain points of any of the five listed above:  seven to be exact.   These seven challenges will be discussed in the next post.

Using Iterations and Increments


I’m going through the Agile Practice Guide on a systematic basis to review all of its contents.   Chapter 5 covers the implementation of a project in an agile environment.

Section 5.2 of this chapter covers the eight most common agile project practices, which are:

5.2.1 Retrospectives

5.2.2 Backlog preparation

5.2.3 Backlog refinement

5.2.4 Daily Standups

5.2.5 Demonstrations/Reviews

5.2.6 Planning for Iteration-Based Agile

5.2.7 Execution Practices that Help Teams Deliver Value

5.2.8 Iterations and Increments Help Deliver Working Product

The first seven practices were covered in earlier posts.   The eighth practice is using iterations and increments to help deliver the working product.

First of all, to understand the context for this practice, you should refer to Chapter 3, which describes the four types of project life cycles:  predictive, iterative, incremental, and agile.

  1. Predictive is simply the traditional project management life cycle.   The requirements are fixed at the beginning, the activities are ideally performed once for the entire project, and the final product is delivered once at the very end.
  2. Iterative is a project life cycle where the requirements are dynamic (i.e., can change during the course of the project), the activities are repeated to allow feedback on partially completed or unfinished work so that the work can be improved and modified until a correct solution is found.    Like the predictive model, the delivery of the final product is done at the very end.
  3. Incremental is a project life cycle where the requirements are dynamic, and the work is broken down into smaller deliverables.   The activities that are done are repeated for each deliverable.   The big difference here is that the delivery of the project is done incrementally in deliverables that the customer may be able to use immediately.
  4. Agile is a project life cycle which leverages aspects of both the iterative and incremental project life cycles.   On the one hand, an agile team iterates over the product to create finished deliverables.    Feedback is obtained early on from the customer, which gives the customer a voice in how the final product will look, feel, and perform.

This is why the practice listed here is how to use both iterations and increments to help design a better product.

For iterations, it is best to refer to the first practice on the use of retrospective meetings which serve to punctuate the flow of work in order to create a cadence for delivery and a time for reflection as a group to give feedback on how the process is going.

For increments, it is best to have a process in place where

a) the testing is done by the team to ensure that the deliverable satisfies the user story it was based on.   This is the equivalent of scope verification in a traditional project management framework

b) the customer receives the deliverable and is given a demonstration.  The customer gives feedback on whether, from the customer’s standpoint, the deliverable achieves the user story as promised.   This step in the process can be problematic if the customer does not know beforehand what the user story is, or has an understanding of it that differs from that of the agile team.    This is why the user story has to be written in an objective manner, so that subjective interpretations of what it means can be eliminated as much as possible.    This is the equivalent of scope validation in a traditional project management perspective.

c) The team then takes the feedback from the customer on how the product looks, feels and performs, and then in the next retrospective they will either continue on to the next deliverable, or if there is negative feedback, they will discuss how to adapt the product in order to have it conform to the customer’s expectations.

So the feedback created at the end of increments is used as the raw data which drives the analysis done during the retrospectives.   This is why agile is considered a project life cycle in which iterations and increments work hand in glove.
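As a minimal sketch of how increment-level data feeds the project’s measurements, the completed story points recorded at the end of each iteration can be accumulated into the burnup and burndown views discussed at the top of this series.  The scope and per-iteration point totals below are hypothetical.

```python
# Hypothetical project scoped at 100 story points, with the completed
# points recorded at the end of each iteration (increment).
total_points = 100
completed_per_iteration = [12, 15, 10, 18]

burnup = []    # cumulative completed points (burnup chart data)
burndown = []  # remaining points (burndown chart data)
done = 0
for points in completed_per_iteration:
    done += points
    burnup.append(done)
    burndown.append(total_points - done)

print(burnup)    # [12, 27, 37, 55] -- climbs toward the total scope
print(burndown)  # [88, 73, 63, 45] -- falls toward zero
```

The two series are mirror images of the same raw data, which is why the choice between a burndown and a burnup chart is largely one of presentation.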

This concludes this series of posts on how to execute an agile project.

The next section, section 5.3, discusses troubleshooting agile project challenges.   In a manner typical for PMI, these 21 challenges or pain points are listed in a laundry-list fashion, with no clear thematic grouping (at least that I can see).   So I am going to analyze these 21 challenges and see which can be grouped together based on the nature of the troubleshooting possibility (the creation of an agile charter, the development of a product backlog, the role of the servant leader or product owner, etc.).   The first post will cover those agile challenges that can be remedied by creating an agile project charter.