An impact-driven approach to predict user stories instability

Yarden Levy, Roni Stern, Arnon Sturm, Argaman Mordoch, Yuval Bitan

Research output: Contribution to journal › Article › peer-review

Abstract

A common way to describe requirements in Agile software development is through user stories, which are short descriptions of desired functionality. However, there are no widely accepted quantitative metrics for evaluating user stories. We propose a novel metric for user stories called instability, which measures the number of changes made to a user story after it has been assigned to a developer for implementation in the near future. A high instability score suggests that the user story was not detailed and coherent enough to be implemented as written. The instability of a user story can be extracted automatically from industry-standard issue-tracking systems such as Jira by retrospectively analyzing user stories that were fully implemented. We propose a method for building prediction models that identify user stories likely to have high instability even before they are assigned to a developer. Our method applies a machine learning algorithm to implemented user stories, using only features that are available before a user story is assigned to a developer. We evaluate our prediction models on several open-source projects and one commercial project and show that they outperform baseline prediction models.
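The abstract describes two computable steps: counting post-assignment changes to obtain a story's instability score, and training a classifier on pre-assignment features of already-implemented stories. The sketch below illustrates both steps in Python; the change-event fields, the example features, and the choice of a random forest are illustrative assumptions for this sketch, not details taken from the paper.

    # Minimal sketch of instability scoring and instability prediction,
    # under assumed data shapes. Field names and features are placeholders
    # for whatever a real Jira export would provide, not the paper's own.

    from dataclasses import dataclass
    from datetime import datetime
    from typing import List

    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split


    @dataclass
    class ChangeEvent:
        """One recorded edit to a user story in an issue tracker's changelog."""
        timestamp: datetime
        field: str  # e.g. "description" or "summary" (assumed field names)


    def instability(changes: List[ChangeEvent], assigned_at: datetime) -> int:
        """Count changes made to a story after it was assigned to a developer."""
        return sum(1 for c in changes if c.timestamp > assigned_at)


    # Hypothetical pre-assignment features (e.g. description length, number
    # of acceptance criteria, reporter's historical instability rate) and
    # labels obtained by thresholding instability scores of implemented stories.
    X = [[120, 3, 0.1], [15, 0, 0.6], [200, 5, 0.2], [30, 1, 0.5]]
    y = [0, 1, 0, 1]  # 1 = high instability

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.5, random_state=0
    )
    model = RandomForestClassifier(random_state=0).fit(X_train, y_train)
    print(model.predict(X_test))

In practice the labeled training set would come from the retrospective analysis the abstract describes: stories that were fully implemented, scored by counting their post-assignment changes, with only pre-assignment features fed to the learner.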

Original language: English
Pages (from-to): 231-248
Number of pages: 18
Journal: Requirements Engineering
Volume: 27
Issue number: 2
DOIs
State: Published - 1 Jun 2022

Keywords

  • Agile software development
  • Machine learning
  • Requirements
  • User story

ASJC Scopus subject areas

  • Software
  • Information Systems
