
An AUTOMATED TESTING INSTITUTE Publication - www.automatedtestinginstitute.com

Automated Software Testing MAGAZINE
DECEMBER 2009 $8.95

ATI Automation Honors Award Winners Spotlighted!

ATI's Special Coverage of the STPCon
Automated Software Testing
December 2009, Volume 1, Issue 4

Contents
STP Conference Coverage
STP Conference Revealed 8
This article provides a detailed review of the 2009 Software Test & Performance Conference.

Test Automation Map: An STPCon Sample Walkthrough 16


Experience or re-experience the STP Conference by walking through a potential track session path.

STPCon Numerology 18
Learn more about the conference by exploring key numerical values associated with it.

At The STPCon with Dan Downing 20


Learn about Data Patterns in performance testing in this interview with STPCon presenter, Dan Downing.

ATI Honors Coverage 23


Best Open Source Automated Test Tools 24
The finalists and winners of the open source unit, functional and performance tools category are highlighted.

Best Commercial Automated Test Tools 32


The finalists and winners of the commercial unit, functional and performance tools category are highlighted.

Best Automated Testing Book 38


The finalists and winners of the book category are highlighted.

Best Automated Testing Blog 40


The finalists and winners of the blog category are highlighted.

Best Automated Testing Forum 42
The finalists and winners of the forum category are highlighted.

The Automated Software Testing (AST) Magazine is an Automated Testing Institute (ATI) publication that serves as a companion to the ATI Online Reference. For more information regarding the magazine visit http://www.astmagazine.automatedtestinginstitute.com



Editorial

And The Winner Is... Us!


by Dion Johnson

Information is power. Good information is the ultimate power!

This is a quote that...well...I just made up. It doesn't make it any less true, though, and that is why the Automated Testing Institute (ATI) is devoted to ensuring the test automation community has good information at all costs.

For this reason, ATI has produced this special December edition of the Automated Software Testing (AST) Magazine. Happy Holidays!
This holiday gift that we refer to as 'Volume 1, Issue 4' is a special issue that departs from business as usual. The articles provided are not on automation concepts and techniques, but rather on various sources of such content. I know it's probably hard to believe that ATI is not the only source of test automation content, but there actually are others! And as much as we'd like to keep you -- the automation community -- to ourselves, to deny you information regarding other strong sources of test automation content would violate the very principles on which ATI is built and that make ATI appealing to so many. So, we first set our sights on providing special coverage of the 2009 Software Test & Performance Conference (STPCon), and then on providing information about the finalists and winners of the 1st Annual Automated Testing Institute Automation Honors awards.

The STPCon took place in October of this year, and I was able to attend as a roving reporter, ready to report to all of you how it went. The STPCon proved to be a resource that provided a significant amount of test automation content, so this AST issue provides a detailed assessment of that content that you won't want to miss. In addition, you will find key take-aways from track presentations as well as an interview with one of the conference presenters in which he provides engaging details about his track session.

The 1st Annual ATI Automation Honors winners were previously announced in a special video presentation found at http://www.atihonors.automatedtestinginstitute.com. If you're interested in seeing finalists announced, winner acceptance speeches delivered and some relatively nice visual effects, I'd definitely encourage you to check it out. If you're interested in a detailed description of the winners, then this issue is for you.

It's likely that the STPCon and the ATI Automation Honors winners will benefit from exposure offered by this issue, but the dissemination of good information about automation resources may provide benefit to all of us if it leads to improved automation practices.

Authors and Events

Who’s In This Issue?

STPCon - The Software Test & Performance Conference is a leading event for the software test and QA community. The conference brings together software developers, development managers, test/QA managers and senior test professionals to learn and discover the latest solutions to their most pressing challenges. Held from October 19 through October 23 in Cambridge, MA, the conference offered in-depth training sessions, numerous track sessions, and much more. Find detailed coverage of the conference in this special issue of AST; coverage that includes a conference assessment, track session walkthrough and an interview with one of the conference presenters.

Automated Software Testing
Managing Editor: Dion Johnson
Contributing Editor: Edward Torrie
Director of Marketing and Events: Christine Johnson
A PUBLICATION OF THE AUTOMATED TESTING INSTITUTE

CONTACT US
AST Magazine: astmagazine@automatedtestinginstitute.com
ATI Online Reference: contact@automatedtestinginstitute.com

1st Annual ATI Automation Honors - The ATI Automation Honors are awards dedicated to celebrating excellence in the discipline of software test automation. These awards honor open source unit, functional, and performance automated test tools; commercial unit, functional and performance automated test tools; automated testing books; automated testing blogs; and automated testing forum sites. The nominations have been made, the votes cast, and the winners announced in a special video presentation found at http://atihonors.automatedtestinginstitute.com. Now, you can learn more details about the winners of these awards in this issue.

ATI and Partner Events
ATI Honors Video Presentation - Now! (Winners Announced: December 4) - www.atihonors.automatedtestinginstitute.com
Special December Issue of AST - Now! - www.astmagazine.automatedtestinginstitute.com

Software Test &
Performance (STP)
Conference Revealed
A Conference Review
by Dion Johnson

The 2009 Software Test & Performance Conference (STPCon) had it all! Murder! Mayhem! Intrigue! Even the President of the United States was there! Well… Maybe that's a bit of a stretch. There wasn't actually any murder, and the only mayhem involved attempts to find good, inexpensive parking around town. And President Obama wasn't actually at the conference; he was around the corner at Massachusetts Institute of Technology (MIT) delivering an address on energy during one of the days on which the conference was held. If you're into test automation, however, the STPCon did deliver on being intriguing.

STPCon Structure
The conference, held from October 19 through October 23 in Cambridge, MA, was the organization's first since it went from being Software Test & Performance (STP) to the newer, sleeker Software Test & Performance Collaborative (STP Collaborative). The addition of 'Collaborative' did nothing to blunt the organization's focus on 'Software Test & Performance', however, evidenced by the conference's strong content that was delivered in two distinct parts: training courses and the core conference sessions. First, the conference offered 10 in-depth half, full and multi-day training sessions, with 60% of them being dedicated to automation/performance testing. Then, the core conference package offered a variety of track sessions organized into the following five tracks:
• Agile Testing (AG)
• Test Automation (AU)
• Performance Testing (P)
• Test Management (M)
• Future Test (FT)

While all of the tracks are important from a general testing perspective, we at the Automated Testing Institute attended for the sole purpose of assessing the conference from a test automation perspective. What else would you expect from the Automated Testing Institute? From this perspective, the STPCon offered much to be assessed, with 40% (2 out of 5) of its tracks being dedicated to test automation! 'Test Automation', the first of the two tracks, provided topics related to improving the efficiency of your organization through simple, low cost automated test tools and techniques. With the second of the two tracks, aptly named 'Performance Testing', STP stayed true to the 'Performance' portion of its moniker by offering a robust group of performance related topics. While performance topics can sometimes be focused on manual activities, ATI considers nearly all performance topics to be inherently associated with automation, because performance testing is almost completely dependent on tools in some way, shape or form (network resource monitors, load/stress tools, etc.). A third track, 'Future Test', held the potential for added focus on automation, given that the STP brochure asserted that it would cover topics addressing, "the latest tools, techniques and methodologies." With the exception of one presentation that was also jointly categorized in the 'Performance Testing' track ("Performance Testing: Cloud Computing's Killer App" by Daniel Bartow), however, the 'Future Test' track leaned more into the general 'Test Management' direction as opposed to the tools/automation direction.



2009 STPCon Automation Yearbook

Scott Barber, Daniel Bartow, Ross Collard, Dan Downing
Linda Hayes, Douglas Hoffman, Eric Pugh, Bj Rollison

The Automated Testing Institute presents the 2009 STPCon class of presenters from the 'Test Automation' and 'Performance Testing' Core Conference tracks.
With two full tracks dedicated to test automation, though, there was plenty of good content for automation practitioners to choose from throughout the conference. Delivering this content were several faces that might be familiar to anyone that has previously attended testing conferences and/or read testing literature. While there were many big names there (James and Jon Bach, Michael Bolton, Rex Black, etc.), the cast of characters that brought the automation content included Linda Hayes, Scott Barber, Bj Rollison, Dan Downing, Douglas Hoffman, Eric Pugh, Ross Collard, and Daniel Bartow.

Performance Track
The 'Performance Testing' track was pretty straight-forward in that it had a singular focus - performance. Each session approached this topic from a different dimension, however, including techniques for creating real world scenarios, effective use of live data for performance testing, effective generation and use of metrics, and how to streamline your performance testing through better decision making and more efficient use of time and resources. These topics helped to make the 'Performance Testing' track fairly robust, which is what you'd expect from a conference with "Performance" in its name.

Figure 1: 'Performance Testing' Topics and Titles

Automation Track
The 'Test Automation' track was slightly more nuanced than the 'Performance Testing' track because its focus was on more than one type of automation. When booking speakers for this track, STP seemed to subscribe to a similar philosophy held by ATI, which asserts that test automation is more than just scripting of functional regression tests; test automation is tool support for the entire lifecycle. This is inferred by the inclusion of presentations covering functional automation as well as Continuous Integration (CI). While CI does talk about the use of automated functional regression test scripts, its focus is on automation of the build process.

Figure 2: 'Test Automation' Topics and Titles

Overall the conference did a good job handling the nuanced nature of this track, but there were some hits and misses. The positives include the fact that a broad range of topics were covered, including the effective implementation of your automated tests, effective assessment of test results, creating a structured framework, and effective management of the automation effort.

'Test Automation' Track
Hits
• Good information
• Expert presenters
• Diverse content
Misses
• Misplaced tracks
• Could use more takeaways

Offering such a broad range of topics did not come without a price. While each topic provided interesting information, it is arguable whether all of them belonged in the 'Test Automation' track. For example, Hoffman's "A Model for Software Test Execution", while very informative and well delivered, did not offer much substantive test automation information. In the 'Test Management' track, this presentation would've been a winner, because it provided great general testing information, which may indirectly bolster a test automation effort. In the 'Test Automation' track, however, it fell a little flat, because it just didn't tackle test automation in a direct enough manner. Also, some of the presentations could've included more step-by-step 'take-aways' providing attendees with the ability to go back and implement their new-found knowledge on their own projects. Even with a few minor missteps, this track remained strong, bolstered especially by topics from Pugh and Hayes, and supported by Rollison and the ever brilliant and experienced Hoffman.

Conference Extras
In addition to the training and track sessions, the STPCon offered additional elements that served as icing on its conference cake. One such element was humorously named "Speed Geeking." This rapid fire set of short presentations all took place in one room, but instead of the attendees going to the presenters, the presenters went to the attendees! Conference attendees sat at one of a group of tables organized in a conference ballroom while each participating conference presenter rotated to each table, giving a five- to ten-minute version of one of their presentations with nothing more than a flip chart and their own creativity to help them out. When the time was over, the alarm sounded and the presenters rotated to a new table. Organized by Matt Heusser, this session proved to be an excellent way to get a "trailer" version of each talk to discover whether or not you wanted more. One minor issue, however, was with the noise level in the room. While it exemplified the great energy that was in the room, it made the softer spoken presenters almost seem as though they were on "mute". One other minor issue is that the session took place the morning of the final day of the conference, so in some instances it served as a post-session summary as opposed to a pre-session trailer. This should be at the beginning of the core conference so that it can be used by attendees to help them set their agendas.

The second special element was a keynote delivered by retired NASA Astronaut, Colonel Mike Mullane. Although it was a broader discussion of QA and testing as opposed to one that focused on test automation, we felt it was worth mentioning because…
Continue on page 22
Test Automation Map: An STPCon Sample Walkthrough
Experience or re-experience the conference by walking through some of its sessions with ATI

Eric Pugh
Session Title: Put Your Automated Tests to
the Test With Continuous Integration

Memorable Concepts/Elements
• Without automated tests, CI doesn’t make
any sense
• Do manual testing based on risks
• Lava lamps were the first CI systems

Ross Collard & Dan Downing
Session Title: Data Patterns for Performance & Robustness Testing

Memorable Concepts/Elements
• It's often important to refrain from "cleaning-up" data in order to keep the "messy richness" of the data
• Moving "Pristine" production data to the test environment
• See "At The STPCon with Dan Downing" on pg 20 for more information.

Linda Hayes
Session Title: The High Cost of Manual Testing: Why
Automation is No Longer Optional

Memorable Concepts/Elements
• Sole reliance on manual testing is not a viable option
• dBase provides a case for the "real cost" of manual testing
• Buying a test tool is like joining a health club; the only
weight you’ve lost is in your wallet. You have to get up and
do the work if you want to lose real weight.

Bj Rollison
Session Title: GUI Automation - Steps for Success

Memorable Concepts/Elements
• People often gravitate to the user interface when coming
up with an automation strategy, but the UI automation is
often prone to fail at achieving the desired results

Douglas Hoffman
Session Title: Why Tests Don’t Pass (or Fail)

Memorable Concepts/Elements
• Try to automate things that are simple
• Recognize that we often only see part of the picture when
it comes to the system. We see the “gozintas” (things that
go into the system) and the “gozoutas” (things that go out
of the system)

Scott Barber
Session Title: What to Performance Test -- Choose Wisely

Memorable Concepts/Elements
• Investigate or Validate - Determine if you’re trying to meet a
requirement, or provide extra information to development for
making the system faster
• When you go live with some “requirements” not being met,
that means they weren’t really ‘required’, they were ‘desired’.
• 60-80% of response time in production is due to front-end
processing. Hint for web: Load style sheets at the top and
scripts at the bottom



STPCon Numerology
The key numbers associated with STPCon '09

40% - Percentage of tracks devoted to test automation. Many test conferences barely have 1 track devoted to test automation, but STPCon has 2 out of 5 (40%) devoted to test automation ("Performance Testing" track and "Test Automation" track).

6 - The number of training courses devoted to test automation (including performance testing). There were 10 total training sessions conducted during the first 3 days of the conference. 60%, or 6 of 10, were associated with test automation and performance testing.

13 - The number of track presentations devoted to test automation. There were 32 total track presentations in the entire conference, and 13 belonged to the "Test Automation" and "Performance Testing" conference tracks.

140+ - The number of years of testing and test automation experience collectively held by the 9 test automation/performance presenters/trainers at the conference (the 8 core conference presenters are listed on the STP 2009 Automation Yearbook).

1 - The number of astronauts at the conference. The one and only retired NASA Astronaut, Colonel Mike Mullane, delivered an enlightening keynote address that was very well received.
At The STPCon with
Dan Downing
An ATI Interview by Dion Johnson

If you have a question about performance testing, there's a good chance that Dan Downing has the answer. With 28 years of technical and leadership experience as a programmer, sales engineer, product manager, senior manager, and consultant, he's a subject matter expert in load testing and creator of the Five Steps of Load Testing methodology.

Downing delivered two talks at the Software Test & Performance Conference this year, and after one of them, I was able to catch up with him and ask him a few questions. Those questions and a portion of his responses are provided in this article.

Dion Johnson: I just came out of a presentation that you gave on performance testing and data patterns and it was a pretty good presentation. I thank you for giving that. Would you like to talk a little bit about performance testing and these data patterns?

Dan Downing: Sure. I'd love to. I'm not exactly sure where to start. But anybody that's involved with performance testing knows that data is one of the biggest hurdles to get around. "What data am I going to use?" Typically we think of generating
our own data. Sometimes that's easy... But many times, that's not possible. There are complex systems where cultivating the data, or sort of generating on the fly isn't necessarily the easiest thing to do. [This was the topic of the presentation I did] with Ross Collard. He's a lead thinker in data patterns. He's published a dozen articles or so in STP Magazine and other places over the last year or so... And we talked about live data; the concept of live data being real data harvested from a production system. You have the benefit of doing this, of course, when you have a production system. If it's a brand new app, it's a little tougher to do that. So the notion is [that] live data is the best place to go to. And the notion of advanced patterns of data is [that] if you have the luxury of starting with harvested data from production, how do you then enhance it? Or what are some of the mechanisms for ensuring that you're truly covering the rich messiness of the real world so that your data is fine tuned to match your testing practice.

STPCon 2009: Dan Downing & Ross Collard

Dion Johnson: So you just mentioned the "rich messiness", which was an interesting concept that you talked about at the presentation; "rich messiness" and also "pristine data". Could you talk a little bit about what that is and the importance of this "rich messiness"?

Dan Downing: Absolutely. If you talk to some of the luminaries in functional testing, [for example] James Bach [who's] here at the conference, he's promoted a notion of exploratory testing as being way better and more effective at finding real bugs quicker than the more traditional [scripted] test cases that you plod through. And one of the things that you encounter in exploratory testing is the idea that you're looking for the corner cases, the boundary conditions, the unexpected data inputs. "Nobody would ever type a string of 32 characters for a password." But why not? Let's try that and see if we get a SQL injection error and get the web server to sort of cough up some information that we can use to do a security breach on that app. So the notion of exploratory testing brought into the performance testing realm - the idea is that if you have the luxury of cultivating real data from a production system without messing with it, that is what we would call pristine data. [You don't exclude certain data transactions because they're full of errors, or because they don't occur very often]. If you cultivate data from a production system it contains, by its inherent nature, the rich messiness of the real world. So people doing expected things, entering expected values and entering unexpected values. And when you can use that kind of data in a performance test, then the realism quotient that we strive for in performance testing is maximized. But keeping in mind that there's this other set pattern or set way to take this data and use it or enhance it, [so] you really always have to be very conscious of [the objective of the test] and what do I need to do to this cultivated data to enhance my test objective to ensure that I meet it. So that's kind of what we talked about today.

Dion Johnson: One other question that I have is, I think during the presentation there was the concept of spending about 20% of your time just dealing with test data. Do you feel that's a good amount of time? Do you need more time? Do you think it should be less time? I didn't really hear it [qualified] whether or not that was a good number, or the ideal number.

Dan Downing: I think Ross kicked that notion off by asking the audience, "so how many of you out there spend about 20% of your time dealing with the problem of data," and only a couple of hands went up. And then somebody in the back of the room said, "Sometimes I wish I had more time," because there's recognition, particularly in performance testing, of the
Continue on page 22
At The STPCon with Dan Downing (Continued from page 21)
importance of data. And yet, when you're dealing with the realities of time constraints and deadlines, and [limited] time before the go-live milestones that you're trying to achieve, and having enough reaction time to correct the problems that may surface, I think the audience was basically saying that they've discovered they don't have enough time to deal with the data, and that not too many people get to spend that much time really tuning the data to test.

STPCon Revealed: A Conference Review (Continued from page 13)
well… he was an astronaut! You have to admit, that's pretty cool. Plus he told a really funny story about testing toilets that were used in zero gravity conditions.

In summary, while we feel that greater attention to the "Test Automation" track and increased emphasis on the "tools" portion of the "Future Test" track would greatly improve this conference, we also feel that the conference is already great. With two full-fledged automation tracks, in-depth training, a host of automation experts, and an astronaut, ATI ranks this conference as out of this world.



1st Annual

ATI Automation
Honors
Celebrating Excellence in the
Discipline of Software Test
Automation
The 1st Annual ATI Automation Honors celebrate test automation excellence displayed by automated test tool developers, authors and more.
This celebration specifically pays tribute to:
• Those that have displayed leadership in moving and keeping test
automation in its proper place as a distinct IT discipline,
• Those that drive innovation within the field, and
• Those that display excellence in automation implementation, thus
playing a big role in the delivery of a high quality product to customers
and/or production.
The finalists and awardees were nominated and voted on by you and
your peers, so now you can read to find out who the winners are and
what they did to garner support from the test automation community.



Runner Up
cfix - http://www.cfix-testing.org

Description
cfix is an xUnit testing framework for C/C++, specialized for unmanaged Windows development (32/64 bit). cfix supports development of both user and kernel mode unit tests, and is hosted on SourceForge. cfix unit tests are compiled and linked into a DLL. The testrunner application provided by cfix allows selectively running tests of one or more of such test DLLs. Execution and behaviour in case of failing testcases can be highly customized. Moreover, cfix has been designed to work well in conjunction with the Windows Debuggers (Visual Studio, WinDBG).

Producer/Project Admin
Johannes Passing

Last Eligible Version
1.3.0

Fun Fact
3,857 SourceForge downloads to-date

Award Acceptance Excerpt
"I am pleased to hear that cfix finished 2nd." - Johannes Passing, cfix Representative

Finalist
JWebUnit - http://jwebunit.sourceforge.net
Producer/Project Admin: Julien Henry
Last Eligible Version: 2.1

Finalist
NUnit - http://www.nunit.org/index.php
Producer/Project Admin: Charlie Poole
Last Eligible Version: V2.5 Beta 2

Finalist
SimpleTest - http://www.lastcraft.com/simple_test.php
Producer/Project Admin: Marcus Baker, Perrick Penet
Last Eligible Version: 1.0.1.eclipse_0.2.5



Winner
JUnit - http://junit.org/

Description
JUnit is an open source unit testing framework for the Java programming language. Created by Kent Beck and Erich Gamma, JUnit has been important in the creation of test-driven development, and is one framework in a family of unit testing frameworks collectively known as xUnit that originated with SUnit.

JUnit features include:
• Assertions for testing expected results
• Test fixtures for sharing common test data
• Test runners for running tests

Currently this open source software is released under IBM's Common Public License Version 0.5 and is hosted on SourceForge. JUnit has been ported to other languages including Ada (AUnit), PHP (PHPUnit), C# (NUnit), Python (PyUnit), Fortran (fUnit), Delphi (DUnit), Free Pascal (FPCUnit), Perl (Test::Class and Test::Unit), C++ (CPPUnit), and JavaScript (JSUnit).

Producer/Project Admin
David Saff, Erich Gamma, Erik G. H. Meade, Kent Beck

Last Eligible Version
4.5

Fun Fact
JUnit has also won the Best Java Performance Monitoring/Testing Tool award in the 2001 and 2002 JavaWorld Editors' Choice Awards (ECA)

Figure 1: JUnit & Eclipse IDE
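
To make the three features above concrete, here is a minimal sketch of a JUnit 4-style test; the class and values are our own hypothetical illustration, not taken from the magazine:

    import static org.junit.Assert.assertEquals;

    import java.util.ArrayList;
    import java.util.List;

    import org.junit.Before;
    import org.junit.Test;

    public class ShoppingListTest {
        private List<String> items;

        // Test fixture: @Before rebuilds the shared test data before each test
        @Before
        public void setUp() {
            items = new ArrayList<String>();
        }

        // Assertion: the @Test method states the expected result explicitly
        @Test
        public void addShouldGrowListByOne() {
            items.add("milk");
            assertEquals(1, items.size());
        }
    }

A test runner, such as the console runner invoked with "java org.junit.runner.JUnitCore ShoppingListTest" or the Eclipse runner shown in Figure 1, discovers the @Test methods and reports which assertions pass or fail.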
Runner Up
JSystem - http://www.jsystemtest.org/

Description
JSystem is a professional open-source tool for writing and managing automated system tests. Written in Java, JSystem is based on several open source Java projects, including JUnit for writing tests, Ant used to execute scenarios, and Eclipse as the development environment. JSystem is comprised of the following components: JSystem Services (Java API), JSystem Drivers, JSystem GUI Interface (JRunner), JSystem Agent and the JSystem Eclipse plug-in.

Producer/Project Admin
Ignis Software

Last Eligible Version
5.2.5

Fun Fact
It started as open source, became a closed technology, and is now open source again

Award Acceptance Excerpt
"We thank you for your appreciation of our efforts and our vision" - Yoram Shamir, CEO, Ignis Software
Finalist
AutoIt - http://www.autoitscript.com/
Producer/Project Admin: Jonathan Bennett
Last Eligible Version: 3.3.0.0

Finalist
FunFX - http://funfx.rubyforge.org/
Producer/Project Admin: Aslak Hellesøy, Stefan Magnus Landrø, Peter Nicolai Motzfeldt
Last Eligible Version: 0.2.2

Finalist
Watir - http://watir.com/
Producer/Project Admin: Bret Pettichord
Last Eligible Version: 1.6.0



Winner
Selenium - http://seleniumhq.org/

Description
Selenium is a suite of tools used to test web applications. This suite includes:
• Selenium IDE
• Selenium RC
• Selenium Grid

Selenium provides a record/playback feature along with a test domain specific language (DSL) to write tests in a number of popular programming languages, including Java, Ruby, Groovy, Python, PHP, and Perl. Test playback runs in most modern web browsers. Selenium deploys on Windows, Linux, and Macintosh platforms. Selenium was originally developed by Jason Huggins, who was later joined by other programmers and testers at ThoughtWorks. It is open source software, released under the Apache 2.0 license, and can be downloaded and used without charge.

Producer/Project Admin
OpenQA

Last Eligible Version
0.8

Fun Fact
People often think of Selenium as one tool, but it is actually a suite of tools. There are three main tools in the suite, but in total, there are 8!

Figure 2: How Selenium Works
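
As a rough illustration of the programming-language side of the suite, here is a minimal sketch of a Selenium RC test driven from Java. The target site, element locators, and port are hypothetical, and it assumes a Selenium server is already listening on localhost:4444:

    import com.thoughtworks.selenium.DefaultSelenium;
    import com.thoughtworks.selenium.Selenium;

    public class SearchSmokeTest {
        public static void main(String[] args) {
            // Connect to the local Selenium RC server and launch Firefox
            Selenium selenium = new DefaultSelenium(
                    "localhost", 4444, "*firefox", "http://www.example.com/");
            selenium.start();

            selenium.open("/");                       // navigate to the app under test
            selenium.type("q", "test automation");    // "q" is a hypothetical textbox locator
            selenium.click("btnSearch");              // hypothetical button locator
            selenium.waitForPageToLoad("30000");      // wait up to 30 seconds

            System.out.println("Result page title: " + selenium.getTitle());
            selenium.stop();                          // end the browser session
        }
    }

The same open/type/click commands are what Selenium IDE records, which is why a recorded script can be exported to any of the supported languages.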
Runner Up
soapUI - http://www.soapui.org/

Description
soapUI is an open source Web Service Testing Tool for Service Oriented Architecture (SOA). Its functionality mainly covers web service inspection, invoking, development, simulation and mocking, functional testing, load and compliance testing. Productivity enhancement features can be found in the soapUI Pro version. soapUI was initially released on SourceForge during September 2005 and is distributed under the terms of the GNU Lesser General Public License, with the application and source code also provided.

Producer/Project Admin
eviware

Last Eligible Version
2.5.1

Fun Fact
With more than 950,000 downloads, it is the most used tool for SOA testing.

Award Acceptance Excerpt
"We're honored to be on that list... The other tools were really, really cool and we use them everyday." - Ole Matzura & Niclas Reimertz, soapUI Representatives

Finalist
OpenSTA - http://www.opensta.org/
Producer/Project Admin: Daniel Sutcliffe

Finalist
WebLOAD - http://www.webload.org/
Producer/Project Admin: Ram, Yam Shal-Bar

Finalist
The Grinder - http://grinder.sourceforge.net/
Producer/Project Admin: Philip Aston
Last Eligible Version: 3.2


Winner
JMeter - http://jakarta.apache.org/jmeter/

Description
Apache JMeter is an open source, 100% pure Java desktop application designed to load test functional behavior and measure performance. It was originally designed for testing Web Applications but has since expanded to other test functions. Apache JMeter may be used to test performance both on static and dynamic resources (files, Servlets, Perl scripts, Java Objects, Data Bases and Queries, FTP Servers and more). It can be used to simulate a heavy load on a server, network or object to test its strength or to analyze overall performance under different load types. You can use it to make a graphical analysis of performance or to test your server/script/object behavior under heavy concurrent load.

JMeter can load and performance test many different server types including:
• Web - HTTP, HTTPS
• SOAP
• Database via JDBC
• LDAP
• JMS
• Mail - POP3(S) and IMAP(S)

Producer/Project Admin
Apache Software Foundation

Last Eligible Version
2.3.2

Fun Facts
• JMeter uses JUnit (this year's ATI Automation Honors winner of the Best Open Source Unit Test Tool) for testing its code. We guess birds of a feather flock together.
• Stefano Mazzocchi of the Apache Software Foundation was the original developer of JMeter. He wrote it primarily to test the performance of Apache JServ (a project that has since been replaced by the Apache Tomcat project). JMeter was eventually redesigned to enhance the GUI and to add functional testing capabilities.

Figure 3: Apache JMeter Book
Runner Up
Certify - http://www.worksoft.com/

Description
Worksoft Certify is an automated functional testing solution for SAP lifecycle management and cross-platform business process validation. Certify eliminates custom coding and programming, a requirement of most test automation products, making it fast and easy to implement and maintain. Using an object-driven approach rather than generating code, Certify validates business process workflows using a data model of fields, screens, and transactions.

Producer/Project Admin
Worksoft

Last Eligible Version
8.2

Fun Fact
On September 30, 2009, Worksoft announced that it had been positioned in the Visionaries quadrant by analyst firm Gartner, Inc.

Award Acceptance Excerpt
"This validates our focus on lifecycle management and test automation." - Greg Davoll, VP Marketing, Worksoft

Finalist
Functional Tester - http://www-01.ibm.com/software/awdtools/tester/functional/index.html
Producer/Project Admin: IBM
Last Eligible Version: 8.0

Finalist
TestComplete - http://www.automatedqa.com/products/testcomplete/
Producer/Project Admin: AutomatedQA
Last Eligible Version: 6.52


Winner
QTP - https://h10078.www1.hp.com/cda/hpms/display/main/hpms_content.jsp?zn=bto&cp=1-11-15-24^1322_4000_100__

Description
QuickTest Professional delivers a complete automated testing solution for functional, graphical user interface, and regression testing that helps you reduce the risks of application failures. This software testing solution enables your QA team to identify and correct defects across a wide breadth of application environments, data sets and business processes. Also called HP Functional Testing, QTP also includes several add-ins.

This tool allows you to conduct both manual and automated testing for both Graphical User Interface (GUI)-based applications and non GUI-based services.

Producer/Project Admin
Hewlett Packard (HP)

Last Eligible Version
10.0

Fun Fact
QTP supports both a visual and a coding approach with the keyword view and expert view.

Figure 4: QTP, QC and Business Process Testing
Finalist
Performance Tester - http://www-01.ibm.com/software/awdtools/tester/performance/index.html
Producer/Project Admin: IBM
Last Eligible Version: 8.0

Finalist
AQTime - http://www.automatedqa.com/products/testcomplete/
Producer/Project Admin: AutomatedQA
Last Eligible Version: 6.11



Winner
LoadRunner - https://h10078.www1.hp.com/cda/hpms/display/main/hpms_content.jsp?zn=bto&cp=1-11-15-17^8_4000_100__

Description
Hewlett-Packard (HP) LoadRunner (LR) is a performance and load testing product for examining system behaviour and performance, while generating actual system load. LoadRunner can emulate hundreds or thousands of concurrent users to put the application through the rigors of real-life user loads, while collecting information from key infrastructure components including Web servers, database servers etc. The results can then be analysed in detail, to explore the reasons for particular behaviour. HP LoadRunner can help you prevent costly application performance problems in production by detecting bottlenecks before a new system or upgrade is deployed.

This software enables you to measure end-to-end performance, diagnose application and system bottlenecks and tune for better performance - all from a single point of control. The integrated load test, performance test and application stress test features help you reduce the costs and time required to test and deploy new applications and systems into your production environment.

Producer/Project Admin
Hewlett Packard (HP)

Last Eligible Version
9.5

Fun Fact
HP LR holds 77% load testing marketshare worldwide!

Figure 5: LR Response Time Graph
Runner Up
Web Security Testing Cookbook - http://websecuritytesting.com/

Description
The recipes in the Web Security Testing Cookbook demonstrate how developers and testers can check for the most common web security issues, while conducting unit tests, regression tests, or exploratory tests.

Authors
Paco Hope, Ben Walther

Publisher
O'Reilly Media

Book Excerpt
"In security testing, we consider the entire set of unacceptable inputs - infinity - and focus on the subset of those inputs that are likely to create significant failure..."

Award Acceptance Excerpt
"We tried to write a series of recipes that were accessible and automatable" - Paco Hope, Author
Finalist
The Art of Application Performance Testing
Author: Ian Molyneaux
Publisher: O'Reilly Media, Inc.

Finalist
Building a GUI Test Automation Framework Using Data Model
Author: Izzat Alsmadi
Publisher: VDM Verlag



Winner
IAST (Implementing Automated Software Testing)

Description
Implementing Automated Software Testing (IAST) is the winner. Testing accounts for an increasingly large percentage of the time and cost of new software development. Using automated software testing (AST), developers and software testers can optimize the software testing lifecycle and thus reduce cost. As technologies and development grow increasingly complex, AST becomes even more indispensable. This book builds on some of the proven practices and the automated testing lifecycle methodology (ATLM) described in Automated Software Testing and provides a renewed practical, start-to-finish guide to implementing AST successfully.

In Implementing Automated Software Testing, three leading experts explain AST in detail, systematically reviewing its components, capabilities, and limitations. Drawing on their experience deploying AST in both defense and commercial industry, they walk you through the entire implementation process - identifying best practices, crucial success factors, and key pitfalls along with solutions for avoiding them.

Authors
Elfriede Dustin, Thom Garrett, Bernie Gauf

Publisher
Addison Wesley

Book Excerpt
"It is possible to develop software that converts any type of manual software test existing today into an automated test."

Award Acceptance Excerpt
"At IDT our whole focus is automated software testing, so we're delighted the work that we've been able to put together is reaching those that are actually doing the testing" - Bernie Gauf, Author
Finalist
Autonomicon

http://autonomicon.blogspot.com/

Producer/Primary Blogger
Nick Olivio

Blogger Profile
I’m an SE, an automated tester and a
tech writer rolled into one. You can
follow me on twitter, my account name
is nickolivo.

Finalist
Corey Goldberg Blog

http://coreygoldberg.blogspot.com/

Producer/Primary Blogger
Corey Goldberg

Blogger Profile
• Age: 35
• Gender: Male
• Industry: Technology
• Occupation: Software Engineer
• Location: Boston : MA : United States



Winner
Google Testing Blog - http://googletesting.blogspot.com/

Description
Google Testing Blog is yet another step made by Google in the direction of world domination. This blog is just one in a family of blogs offered by the company; a family that includes, but is not limited to:
• The Official Google Blog
• Online Security Blog
• Open Source Blog

Unlike the others, however, the Google Testing Blog focuses strictly on software testing. Although this blog focuses on software testing as a whole, it offers a significant number of posts geared towards software test automation, which explains why the automated testing community nominated and voted it the Best Automated Testing Blog of the year.

The blog offers content on a wide range of topics, but groups many of its posts in the following categories:
• GTAC - Posts related to the Google Test Automation Conference (GTAC)
• Conferences - Posts related to various testing conferences (usually the conferences Google conducts)
• TotT - TotT stands for "Testing on the Toilet". That's right, "Testing on the Toilet". We're not even going to try to explain it here. See the Fun Facts header for a link that explains what this is.
• Misko - Posts written by Miško Hevery, a testing practitioner.
• Whittaker - Posts written by James A. Whittaker, a testing practitioner.
• chromeos - Posts related to the Chrome Operating System (OS)

Producer/Primary Blogger
Google

Blogger Profile
Google Inc. is an American public corporation specializing in Internet search.

Fun Facts
• Google has a flyer/blog series called "Testing on the Toilet"! It's designed to encourage developers to write more tests for their code. For more information visit http://googletesting.blogspot.com/2007/01/introducing-testing-on-toilet.html.



Finalist

TDForums

http://tdforums.com

Producer/Admin
Eric Schumacher

Target Audience
Users of HP Mercury products

Finalist

Tek-tips

http://www.tek-tips.com/

Producer/Admin
Tecumseh Group

Target Audience
Various technical software/hardware resources



Winner
SQAForums - http://www.sqaforums.com

Description
SQAForums is an internet forum site that was created by BetaSoft, Inc.

BetaSoft Inc. is a privately held corporation based in San Jose, California. Founded in 1995, its main business function is providing software testing and quality assurance services throughout the United States and Canada. BetaSoft's areas of expertise are in performance and automated testing, QA management, test planning and process improvement, and they also resell automated and performance test tools and solutions.

Since 1999 and while growing as a consulting company, BetaSoft developed QAForums (SQAForums), an online center for software testing and quality assurance resources, and soon added additional portals (QALinks, QADownloads, QAJobs, QANews, QATraining.net).

SQAForums features such software testing forums as:
• Unit Testing
• Automated Testing
• Performance & Load Testing
• Security Testing

It also features such tool-specific forums (some of which also happen to be winners and finalists in this year's ATI Automation Honors) as:
• QuickTest Professional
• TestComplete
• OpenSTA
• LoadRunner

Producer/Admin
BetaSoft Inc.

Target Audience
QA Practitioners

Award Acceptance Excerpt
"It's wonderful to see that some organization has awards focused on software testing categories" - AJ Alhait, Founder

