The Bug Trends (or Defect Trends, Bug Dynamics) report in software testing calculates the total number of bugs that the QA team has opened and closed over time. These reports help management understand the overall rate at which bugs are processed. They can also show whether bug discovery and resolution rates are declining toward the end of an iteration, as expected.
Totals can be calculated per time interval (daily, weekly, monthly), per QA Build, or per time-based work scope (Release, Iteration, Sprint).
Visual Reports for Bug Trends and Dynamics in Targetprocess help you create visual representations of Bug Trends.
Moreover, the list of bugs used as the data source can be filtered (per Project, Feature, Owner or Assignment, Tag, etc.) and split into categories. As a result, multiple separate trends, one per category, can be presented in the same report or dashboard.
Sparkline charts are one more tool for tracking Bug Dynamics at a glance. They can be added to Project and Team cards on your views and provide a bird's-eye overview of the number of new and closed bugs within a Project or Team during the last 16 weeks.
To create any report described below, start with the + Create > Report action in the left menu.
Create Report from Template
There are predefined Templates for Visual Reports available. In the QA section, two of them, New Bugs Trend and Closed Bugs Trend, help you create Bug Trends reports in just a few clicks.
Here is how the predefined Closed Bugs Trend report is configured:
Configure Report Manually
On the Setup tab it is possible to configure a similar report manually:
|Setting|Value|
|---|---|
|Name|Closed Bugs Trend|
|Projects and teams|Mark with checkboxes the projects and teams for which Bug Trends should be displayed in the report|
|Filter|Apply an Advanced Filter to Bugs. For a Closed Bugs Trend, only closed bugs should be selected. The default template includes only bugs closed within the last 182 days: ?EndDate >= Today - 182(days)|
|X axis|End Date. In the default template we group closed Bugs per week.|
|Y axis|Count of Records, Project. In the default template we display the count of closed Bugs per Project.|
|Chart type|Line chart|
Press the Finish setup button in the header area. The report appears in the left menu.
Configure Advanced Reports
The reports can be easily customized. Press the Set up Report button for this purpose.
We recommend adding a Trendline to your report.
Settings for Open Bug Trends and Closed Bug Trends
Settings for Open Bug Trends and Closed Bug Trends are not the same. Here are the differences between them:
|Setting|Open Bug Trends|Closed Bug Trends|
|---|---|---|
|Filter|?CreateDate >= Today - 182(days)|?EndDate >= Today - 182(days)|
|X axis|Create Date|End Date|
Bug Dynamics reports are based on the Create Date and End Date fields. The Create Date field is filled with the current timestamp whenever a new Bug is created. The End Date field is filled with the current timestamp when a Bug is moved to its final workflow state.
On the X axis it is possible to group by Day, Week, Month, Quarter, or Year.
Alternatively, the X axis can calculate totals per QA Build or per time-based work scope (Release, Iteration, Sprint) instead of dates.
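The aggregation these reports perform can be sketched in plain Python. This is only an illustration of the idea, not how Targetprocess computes it internally; the bug records here are hypothetical dicts with an `end_date` field standing in for the real End Date.

```python
from datetime import date, timedelta
from collections import Counter

def closed_bugs_per_week(bugs, days=182):
    """Count closed bugs per ISO week, mirroring a Closed Bugs Trend report.

    `bugs` is a list of dicts with an `end_date` field (None while a bug
    is still open) -- a hypothetical stand-in for the real data source.
    """
    cutoff = date.today() - timedelta(days=days)      # ?EndDate >= Today - 182(days)
    totals = Counter()
    for bug in bugs:
        end = bug.get("end_date")
        if end is not None and end >= cutoff:         # skip open and too-old bugs
            year, week, _ = end.isocalendar()         # group per week (X axis)
            totals[(year, week)] += 1                 # Count of Records (Y axis)
    return dict(totals)

bugs = [
    {"end_date": date.today() - timedelta(days=3)},
    {"end_date": date.today() - timedelta(days=3)},
    {"end_date": None},                               # still open: filtered out
    {"end_date": date.today() - timedelta(days=400)}, # too old: filtered out
]
```

Swapping `end_date` for `create_date` gives the Open Bugs Trend; grouping by a build or release identifier instead of the week gives the per-scope variants.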
Open Bugs Trend per Build
|Setting|Value|
|---|---|
|Filter|?Build is not None and Build.BuildDate >= Today - 182(days)|
|X axis Sort|Ascending, by Create Date|
Closed Bugs Trend per Release
|Setting|Value|
|---|---|
|Filter|?EndDate is not None and Release.StartDate >= Today - 182(days)|
|X axis Sort|Ascending, by Release Start Date|
Closed Bug Trends distribution
Data in the report can be distributed by categories. For example, let's create a multi-color stacked chart for Bug Trends by Severity. For this purpose, put the Severity field into the Color selector:
As a result, closed bugs with different severities are shown in different colors in the report. A legend with an explanation is also displayed:
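The split by Severity amounts to counting records per (week, severity) pair, so each severity becomes one colored series. A minimal sketch, again assuming hypothetical dict records with `end_date` and `severity` fields:

```python
from datetime import date, timedelta
from collections import Counter

def closed_bugs_by_severity(bugs, days=182):
    """Count closed bugs per (ISO week, severity), as in the stacked chart.

    `bugs` is a list of dicts with hypothetical `end_date` and `severity`
    fields; each distinct severity value becomes one colored series.
    """
    cutoff = date.today() - timedelta(days=days)
    totals = Counter()
    for bug in bugs:
        end = bug.get("end_date")
        if end is not None and end >= cutoff:
            week = end.isocalendar()[:2]              # (year, week) bucket
            totals[(week, bug["severity"])] += 1      # one series per severity
    return dict(totals)

today = date.today()
bugs = [
    {"end_date": today, "severity": "Blocking"},
    {"end_date": today, "severity": "Average"},
    {"end_date": today, "severity": "Average"},
]
```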
Open and Closed Bug Trends on the same chart
It is possible. Here is an example of how the report can be configured:
|Setting|Value|
|---|---|
|Filter|?CreateDate >= Today - 182(days) or EndDate >= Today - 182(days)|
|X axis|We could use the default Create Date and End Date fields here, but that would produce quite a long trend for bugs created in the past. Since only the last 182 days interest us, we trim the tail and count all bugs created more than 182 days ago together: a custom formula, TrimmedCreateDate, goes on the axis together with the End Date field, grouped by week.|
|Y axis|Count of Records|

Here is how the custom formula TrimmedCreateDate is configured:
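The trimming idea behind TrimmedCreateDate can be sketched as a simple clamp. The actual formula is configured in the report editor, so the behavior below (and the helper name `trimmed_create_date`) is an illustrative assumption:

```python
from datetime import date, timedelta

def trimmed_create_date(create_date, days=182):
    """Sketch of the TrimmedCreateDate idea: any bug created more than
    `days` days ago is counted in the oldest bucket instead of
    stretching the X axis far into the past."""
    cutoff = date.today() - timedelta(days=days)
    return max(create_date, cutoff)   # clamp old dates to the cutoff
```

All bugs older than the cutoff thus land in the first week of the chart, keeping the trend focused on the recent window.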
Here is how the report looks in Line chart mode:
The same chart can also be displayed in Stacked Bar mode:
Data in the report can be distributed by categories. For example, let's create a line chart for Bug Trends by Severity. For this purpose, put the Severity field on the Y axis together with the Count of Records field:
In the non-project management world, sparkline graphs work as quick time-span mini-reports featuring the dynamics of certain states or activities. The sparkline graphs usually look like tiny ragged lines.
Sparkline charts can be added to Project and Team cards via the Customize Cards feature.
Here's the logic behind the sparklines for user stories and bugs:
Any sparkline covers a time span of 16 weeks. Why 16 weeks? It's convenient in terms of iterations and releases: in agile project management, iterations usually take 2 weeks, so this stands for roughly the 8 most recent iterations. If you count each tiny sub-bar, they total 16 (as in the sparklines for the Designers and Philadelphia Flyers teams; the Support team most likely has less than 16 weeks of history). The 16-week time span seems to work well for iterationless development, too, as in Kanban.
The zero line is the faint blue line in the middle (or the red one, for bugs).
The numbers shown at the top and at the bottom represent the maximum number of user stories done or added, respectively, in any single week out of those 16. Seeing where that maximum falls (say, shortly after the 4th week) gives you an idea of how many user stories were done in the preceding and following weeks. There's no need to show the numbers for each week, as that would make this tiny graph too cluttered. The same logic holds true for bugs.
The numbers at the very right of the sparkline relate to the current week.
The "total" numbers show how many user stories were done in those 16 weeks (at the top) and how many were added (at the bottom). Same for bugs: the total fixed is at the top, and the total added at the bottom.
If you're a stakeholder, a Scrum Master, a VP of Development, or anyone who wants to keep an eye on how several teams are doing, this report is indispensable. With minimal time spent, you get the maximum info. And, yes, there's no recency bias: you can embrace the whole 16-week span in that little space.
Finally, this logic is condensed to a mouse-over text tooltip.
Interpreting the Reports
The team might find bugs especially quickly in poorly written code, in newly integrated code, with improved testing, or during an exceptional event. On the other hand, bugs are harder to find in a high-quality product or with ineffective testing.
When the team resolves bugs faster than it finds them, the number of active bugs will start to decrease. When the team starts to find fewer bugs, the product is stabilizing.
The reports and sparklines are not only about raw counts of bugs and user stories. For example, if your sparkline for user stories stays low, close to zero, while your bug numbers are pulsating with activity, this most probably means the team is currently working to make the release stable.