Towards Model-based Acceptance Testing for Scrum

Renate Löffler, Baris Güldali, Silke Geisen


Software Quality Lab (s-lab)
Universität Paderborn
Warburger Str. 100, Paderborn, Deutschland
[rloeffler|bguldali|sgeisen]@s-lab.upb.de

Abstract

In agile processes like Scrum, strong customer involvement demands techniques that facilitate requirements analysis and acceptance testing. Additionally, test automation is crucial, as incremental development and continuous integration require high testing effort. To cope with these challenges, we propose a model-based technique for documenting customer requirements in the form of test models. These can be used by the developers as a requirements specification and by the testers for acceptance testing. The modeling languages we use are light-weight and easy to learn. From the test models, we generate test scripts for FitNesse or Selenium, which are well-established test automation tools in the agile community.

Keywords

Agility, Scrum, Acceptance Testing, Test Automation, Model-based Testing

1 Motivation

Over the last years, agile methods have become more popular. Several methods like Scrum, Feature Driven Development (FDD) [3], and Extreme Programming (XP) have been developed with the aim of delivering software faster and ensuring that the software meets the customer's changing needs [2]. They all value the agile manifesto [1] and share some common principles: improved customer satisfaction, adapting to changing requirements, frequently delivering working software, and close collaboration of business people and developers [2].

At the moment, Scrum [10] is the most widely used method; about 50 percent of companies use the method itself or a hybrid of Scrum and XP [14]. The main focus in Scrum is on implementation rather than on detailed analysis or thorough documentation. The communication of requirements and the definition of new desired features, which are added to the existing product backlog, occur during the iterative cycles called sprints [10]. This leads to poor requirements specifications, i.e. requirements are incomplete and inconsistent, which causes problems if the software has to be contractually accepted by the customer. Thus, it is a challenge to establish better requirements specification techniques that overcome these difficulties.

Another challenge in Scrum is the continuous integration of new functionalities implemented during the sprints. Integration testing has to be conducted in order to check the correct interaction of the new functions with the old ones. Additionally, regression testing has to be conducted to check whether the integration of new functions has corrupted the old ones. Depending on the number of sprints and the functions to be integrated, the testing effort can explode. Thus, there is an urgent need for automating testing activities, e.g. test case generation, test execution, and evaluation of test results.

In this paper, we propose a model-based technique for coping with the challenges addressed above. First, we want to improve the communication and documentation of customer requirements by using simple UML [8] models. These models are created together with the customer and describe the system functions on an abstract level. The modeling languages we use are so intelligible that we believe the customer will be able to learn and edit them easily. Afterwards, these initial models are refined during sprint planning with technical details and test data. Then, they are used by the developers as a specification of the required functions.

Secondly, we want to automate the testing activities by generating test scripts from the models. The refined models contain enough information to generate test scripts for FitNesse [7] or Selenium [11], which are well-known and widely used acceptance test tools in the agile context.

With our testing approach we follow these ideas of Utting and Legeard [6, 13]: (1) model-based testing must be embedded into agile processes and existing tool chains; (2) there is a need for dedicated domain-specific modeling languages to make behavioral modeling easier and more user-friendly; (3) model-based testing must be linked to requirements analysis.
The idea of using models in agile development is not new. Scott Ambler introduced Agile Modeling, a collection of best practices and principles. He states that agile modeling needs simple but multiple models, each of which fulfills a concrete purpose.

Our paper is structured as follows: the next section gives a brief overview of the activities and artifacts in Scrum. Section 3 explains our model-based approach for acceptance testing in detail and gives an example of how test scripts can be generated automatically from models. After addressing some related work in section 4, we give an outlook on further research in this area in section 5.

2 Scrum

Scrum was invented by Jeff Sutherland, Ken Schwaber and Mike Beedle and is an empirical agile project management framework [10]. Scrum comprises roles, artifacts and activities, which are mainly meetings. Figure 1 shows the overall Scrum process based on [4]. The elements in boxes (IOD, etc.) will be introduced in section 3.

The main artifact in Scrum is the product backlog, a prioritized feature list. This list is specified by the product owner, who creates, controls and manages the backlog. Originally, Scrum defines no special requirements engineering techniques for the creation of the product backlog, but it is suggested to use so-called user stories [4].

[Figure 1: The Scrum process extended with model artifacts — the flow Product Backlog, Sprint Planning 1, Selected Product Backlog, Sprint Planning 2, Sprint Backlog, Sprint, New Functionalities, Review, annotated with the IOD, SD and Fixture artifacts]

After creating the product backlog, the main Scrum flow starts. The product owner and the Scrum team meet for the planning of the first iteration, called a sprint. The first features are chosen from the product backlog, and the team splits them into smaller tasks, which results in the sprint backlog. At the end of each sprint, the new functionalities are delivered and presented to the product owner in the review meeting.

During the sprint, many testing activities have to be conducted: (1) new functionalities have to be tested for correctness (unit testing); (2) the correctness of old functionalities has to be re-tested if changes are made (regression testing); (3) the integrated system of new and old functionalities has to be tested (integration testing). If these activities are conducted on the end-user interface (e.g. a graphical user interface) and test the software against the user requirements, we speak of acceptance testing. Before the sprint is finished, testers have to ensure that the new functionalities integrated into the software work correctly.

In order to conduct the testing activities listed above efficiently, test automation is needed. The next section explains how we want to automate test case generation and test execution using abstract models.

3 Model-based Acceptance Testing

We extend the Scrum process with a model-based testing approach in which models are used for capturing customer requirements and for acceptance testing. Thereby we want to achieve a two-step improvement for Scrum: first, we want to systematize test design by capturing customer requirements in the form of models, which are subsequently used for test case generation. Secondly, we want to automate test execution by using a suitable test tool, e.g. FitNesse or Selenium.

For the first point, we add new model artifacts to the Scrum process as shown in Figure 1. As modeling notations, we use Interaction Overview Diagrams (IOD) and Sequence Diagrams (SD) as defined in UML 2 [8], in which the required functionalities are specified as user stories. These notations are light-weight and easy enough to learn for the customer or the product owner to create the user stories. In addition to IOD and SD, we define a Fixture [7], a domain-specific language for specifying the interface between the user and the software system.

Secondly, we refine the sprint activities using the new model artifacts as shown in Figure 2. Here, we have two roles in the Scrum team: developer and tester. The developer gets the IODs and SDs as a specification of the required functionalities. He extends the SDs with an Interface Specification (IS) and implements the required functionalities. The tester extends the SDs with Test Data (TD). All model artifacts are used for automatic test case generation, resulting in test scripts which can be executed automatically on the system under test (SUT).

In the following, we give more details on the two steps of our approach and illustrate them with a small online store example.
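For illustration, the four artifact types named above (IOD, SD, IS, TD) could be captured roughly as the following data structures. This is our own minimal sketch, not the representation used by the paper's tooling, and all identifiers are hypothetical:

```python
from dataclasses import dataclass

# A message in an SD: its name is a Fixture keyword, its target a logical UI element.
@dataclass
class Message:
    keyword: str   # e.g. "select" or "clickAndWait"
    target: str    # e.g. "quantity"

# An SD refines one IOD activity into an ordered list of messages.
@dataclass
class SD:
    activity: str
    messages: list

# The IOD is essentially a directed graph of named activities (user-story steps).
iod_edges = {
    "OpenAmazon": ["SearchBook"],
    "SearchBook": ["NavigateToBook"],
    "NavigateToBook": ["BuyItem"],
    "OpenAmazonBookLink": ["BuyItem"],
    "BuyItem": ["Login"],
    "Login": ["Confirmation"],
    "Confirmation": [],
}

# The SD for BuyItem, plus the developer's IS and the tester's TD annotations.
buy_item = SD("BuyItem", [Message("select", "quantity"),
                          Message("clickAndWait", "addToCart")])
interface_spec = {"quantity": "quantity", "addToCart": "submit.add-to-cart"}  # IS
test_data = {"quantity": [1, 2, 5, 10]}                                       # TD
```

The separation mirrors the role split in the approach: the IOD and SDs come from sprint planning, while IS and TD are added independently by developer and tester during the sprint.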
[Figure 2: MBT activities embedded into the Scrum sprint — the developer and the tester extend the IOD and SD models with IS and TD annotations; test cases are generated from the models and executed on the SUT, which the developer implements]

3.1 Creating Models

First, the product owner creates user stories, which are specified with IODs (Figure 1). Figure 3.a shows the IOD for a user story describing how to buy a book in an online store. Every activity is given a name that describes the activity in the user story. The product owner thus describes the backlog items on a high level, while still specifying the order of activities and their dependencies.

In the first Sprint Planning, the product owner and the team choose which backlog items they want to realize in the first sprint. Here, the IODs give a good hint of the complexity of the user stories. If the IODs are too big to be realized in one sprint, the team can simply choose a subset of paths from the starting to the ending node, leaving out additional branches. In our example, they could choose the path OpenAmazonBookLink, BuyItem, Login, Confirmation for the first sprint and OpenAmazon, SearchBook, NavigateToBook for following sprints.

The second Sprint Planning is used to refine the selected Product Backlog Items. In our approach, this means the team refines the activities in the IOD by adding SDs to them. Figure 3.b shows the SD for BuyItem. It describes the interaction that happens when conducting this activity: the user selects the quantity and clicks the button to add the item to the shopping cart. The browser then shows the shopping cart.

Before starting to design the SDs, the team should first decide on a Fixture, which defines a unified domain-specific language for specifying the interactions. Since our example is about an online store, we use a Fixture for web-based interaction defined by Selenium [11], which defines keywords (e.g. select, clickAndWait) to express actions on a web page. These keywords should then be used consistently as message names in the SDs.

The result of the second Sprint Planning is the Sprint Backlog: the selected IODs, the designed SDs and the chosen Fixture.

3.2 Implementation and Testing

After the functional requirements have been documented as models, the sprint can start and the Scrum team implements the Sprint Backlog. Developers and testers can work in parallel. While the developers make technical decisions like choosing libraries or specifying the interface (IS), testers can focus on the selection of test data (TD). The IODs and SDs are used by the testers as test models for generating test scripts.

The IS comprises all information that is specific to the implementation, e.g. a specific field name or button name. In our example, shown in Figure 3.c, the solid boxes belong to the IS, which specifies the name of the select field as quantity and the name of the submit button as submit.add-to-cart.

The TD contains the test input and some verification steps, shown as dashed boxes in Figure 3.c. For selecting test inputs, well-known techniques can be used (e.g. equivalence classes, boundary analysis). In the example, the TD for the quantity contains the values 1, 2, 5, 10. The expected results are also specified using the Fixture language. The verification step in Figure 3.c checks whether the quantity given by the user appears correctly in the shopping cart.

[Figure 3: a. Interaction Overview Diagram describing a user story (activities OpenAmazon, SearchBook, NavigateToBook, OpenAmazonBookLink, BuyItem, Login, Confirmation). b. Every activity is later linked to a Sequence Diagram: u1:User sends select(quantity,value) and clickAndWait(addToCart) to b1:Browser, which responds with show(ShoppingCart). c. Sequence Diagram with annotations: IS (quantity, submit.add-to-cart) and TD (values 1, 2, 5, 10; verification steps verifyTextPresent | quantity: ${value} and verifyText | link=Sherlock Holmes | Sherlock Holmes).]

For automating test execution, we have developed a prototype test case generator as a plugin for Fujaba4Eclipse [12]. Test scripts are generated from the test models – also edited with Fujaba4Eclipse – which comprise IOD, SD, IS and TD. In our approach, we use a test table representation of test scripts, which can then be executed automatically by test tools like FitNesse or Selenium.

Figure 4 shows an exemplary test table for the scenario <OpenAmazon, SearchBook, BuyItem>. The generation works as follows: first, the algorithm finds paths in the IOD according to a coverage criterion (activity, edge, or basic path coverage). Second, for every SD along a path, the algorithm creates a test table. This is done according to the Fixture definition from Sprint Planning 2, which defines how the rows are built for every keyword.

For the Fixture keyword select, the corresponding message is translated to the following row: | select | quantity | 2 |. The first column is the message name conforming to the used Fixture, the second is the IS annotation, and the third is a value from the TD table. The methods on the system lifelines (show(ShoppingCart) on Browser) specify the expected behavior of the system, which is enhanced by the tester with some verification steps. Here, the algorithm generated two rows, one of which is | verifyTextPresent | quantity: 2 |. In our example, we check that the quantity given by the user is correctly displayed in the shopping cart. We use facilities of Selenium to parameterize test tables, thus iterating test execution over each entry in the TD.

[Figure 4: Exemplary test table for Selenium]

When the test tables are generated, the team can run the tests automatically. The test results show which test tables passed and which failed. Whenever the implementation is fixed, all tests can be repeated very efficiently because they are automated.

The result of the sprint is an implementation that fulfills the selected product backlog, ready for shipment. After this, the whole Scrum life cycle can start again.

4 Related Work

There is already some work on combining model-based testing with agility. Katara and Kervinen [5] use a use-case driven approach in which their methodology is built around a domain-specific modeling language with action words and keywords. For behavioral modeling they use labeled transition systems (LTS), which enable a comprehensive specification of the desired functionality. Using a coverage language, they control the test space. We believe that Scrum requires modeling notations that are more user-friendly than LTS, and light-weight models, in order to involve the customer in the process.

Rumpe [9] proposes using UML models and model transformation as central concepts for agile processes, especially for testing activities. However, the proposal is at a very early stage and introduces no concrete techniques for handling the challenges of Scrum. The authors say: "Neither are the tools ready for major practical use, nor are semantically useful transformations understood in all their details. Neither the pragmatic methodology, nor the underpinning theory are very well explored yet" [9]. In our paper, we propose very concrete techniques for model-based testing.
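The two generation steps described in section 3.2 — path finding in the IOD, then per-keyword row construction from the SD, IS and TD — can be sketched as follows. This is our own simplified illustration of the idea, not the actual code of the Fujaba4Eclipse plugin; the IOD fragment and all identifiers are assumptions, while the row shapes for select and verifyTextPresent follow the example in the text:

```python
def find_paths(edges, start, end, seen=()):
    """Step 1: enumerate simple start-to-end paths in the IOD (basic path coverage)."""
    if start == end:
        return [[start]]
    paths = []
    for nxt in edges.get(start, []):
        if nxt not in seen:  # keep paths simple (no revisits)
            for tail in find_paths(edges, nxt, end, seen + (start,)):
                paths.append([start] + tail)
    return paths

def row_for_message(keyword, target, interface_spec, value):
    """Step 2: build one test-table row per SD message, per the Fixture definition."""
    name = interface_spec[target]            # IS annotation, e.g. submit.add-to-cart
    if keyword == "select":                  # value-carrying keyword: insert a TD value
        return f"| select | {name} | {value} |"
    return f"| {keyword} | {name} |"         # e.g. clickAndWait takes no test value

# Hypothetical IOD fragment with one optional branch (NavigateToBook).
edges = {"OpenAmazon": ["SearchBook"],
         "SearchBook": ["NavigateToBook", "BuyItem"],
         "NavigateToBook": ["BuyItem"],
         "BuyItem": []}
paths = find_paths(edges, "OpenAmazon", "BuyItem")   # yields two paths

# One table iteration for the BuyItem SD with the TD value 2, plus the tester's check.
buy_item = [("select", "quantity"), ("clickAndWait", "addToCart")]
interface_spec = {"quantity": "quantity", "addToCart": "submit.add-to-cart"}
table = [row_for_message(kw, t, interface_spec, 2) for kw, t in buy_item]
table.append("| verifyTextPresent | quantity: 2 |")  # verification row from the TD
```

Repeating the table construction for every TD value corresponds to the Selenium parameterization mentioned above: the same rows are executed once per entry in the TD.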
Our prototype shows that these techniques are realistic and applicable.

5 Conclusion and Outlook

We have introduced a model-based approach for improving requirements specification and acceptance testing in Scrum. We use light-weight UML modeling notations for specifying user stories from the beginning of the Scrum process. During sprint planning, the user stories are enhanced with implementation details, thus serving as a specification for developers. Testers enhance them with test data and automatically generate test tables using a prototype test generator. The test tables can then be executed by Selenium, a well-established tool for acceptance testing.

With our approach, we combine the principles of the agile manifesto with the techniques of agile modeling and model-based testing. Our prototype shows that the introduced concepts can be supported by tools throughout the whole Scrum process.

The presented approach is at an early stage. Even if we believe that the modeling notations are easy to use for the product owner and the customer, this assumption must be validated by an industrial case study. Thus, the efficiency and effectiveness of the approach must be evaluated. The integration of the test generator into existing tool landscapes must also be evaluated.

References

[1] Various authors. Manifesto for Agile Software Development. http://agilemanifesto.org/ (last visited: 28.04.2010), 2001.

[2] Armin Eberlein, Frank Maurer, and Frauke Paetsch. Requirements Engineering and Agile Software Development. In 12th IEEE International Workshops on Enabling Technologies (WETICE 2003), pages 308–313, 2003.

[3] John M. Felsing and Stephen R. Palmer. A Practical Guide to the Feature-Driven Development. Prentice Hall International, 2002.

[4] Boris Gloger. Scrum. Hanser Verlag, 2008.

[5] Mika Katara and Antti Kervinen. Making Model-Based Testing More Agile: A Use Case Driven Approach. In Haifa Verification Conference, pages 219–234, 2006.

[6] Bruno Legeard. MBT for Large-Scale Enterprise Information Systems: Challenges and Quality Issues. Keynote at AMOST & QuoMBaT @ ICST 2010, April 6th, 2010.

[7] Robert C. Martin, Micah D. Martin, and Patrick Wilson-Welsh. FitNesse – Acceptance Testing Framework. http://fitnesse.org/ (last visited: 28.04.2010), 2008.

[8] OMG. UML 2.0 Superstructure Specification. Technical report, OMG, July 2005.

[9] Bernhard Rumpe. Agile test-based modeling. In Software Engineering Research and Practice, pages 10–15, 2006.

[10] Ken Schwaber and Mike Beedle. Agile Software Development with Scrum. Prentice Hall, Upper Saddle River, 2002.

[11] ThoughtWorks. SeleniumHQ – Web Application Testing System. http://seleniumhq.org/ (last visited: 28.04.2010), 2004.

[12] University of Paderborn, Software Engineering Group. Fujaba4Eclipse, September 2009. http://wwwcs.uni-paderborn.de/cs/fujaba/projects/eclipse/index.html.

[13] Mark Utting and Bruno Legeard. Practical Model-Based Testing: A Tools Approach. Morgan Kaufmann, 2007.

[14] VersionOne. State of Agile Development Survey 2009. Technical report, November 2009.