
SilkCentral Test Manager 2009

Help

8310 N. Capital of Texas Hwy
Bldg 2, Suite 100
Austin, TX 78731 USA
www.borland.com

Borland Software Corporation may have patents and/or pending patent applications covering subject matter in this document. Please refer to the product CD or the About dialog box for the list of applicable patents. The furnishing of this document does not give you any license to these patents.

Copyright 2009 Borland Software Corporation and/or its subsidiaries. All Borland brand and product names are trademarks or registered trademarks of Borland Software Corporation in the United States and other countries. All other marks are the property of their respective owners.

June 2009 PDF

Getting Started
  Concepts
    What's New in Borland SilkCentral Test Manager 2009
    Tour of the UI
    Help on Help
    Introduction to SilkCentral Test Manager
      Welcome to SilkCentral Test Manager
      Installing and Licensing Test Manager
      SilkCentral Issue Manager
      Working With SilkPerformer Projects
      Working with Silk Performance Explorer
      Software Quality Optimization
      SilkCentral Administration Module
        SilkCentral Architecture
        Access and Licensing
  Procedures
    Configuring Browser Settings
    Logging In and Out of Test Manager
      Logging into Test Manager
      Logging out of Test Manager
  Quick Start Tasks
  Glossary

Concepts
  Successful Test Management
    Settings Configuration
      Global Filters
      Attributes
      Custom Requirement Properties
      Custom Step Properties
      Change Notification
      Requirements Integration Configuration
      Data Sources for Data-Driven Tests
      Issue Tracking Profiles
      Source Control Profiles
    Requirements Management
      Requirements Tree
      Attachments
      Full Coverage and Direct Coverage Modes
      Test Coverage Status
      Requirements Reports
      Microsoft Office Requirement-Import Tool
      Test Plan Generation
      Requirement History
      Change-Notification Emails
      External Requirements Management Tools
        External Requirements Management Tools
        Synchronizing Requirements
        CaliberRM Integration with Test Manager
          Baseline Support for CaliberRM Integration
          Test Definition Assignment Handling
    Filtering
      Filters
      Recent Changes
    Test Plan Management
      Test Plan Management
      Test Plan Tree
      Test Plan Reports
      Data-Driven Tests
      Success Conditions
      Test Definition Parameters
      Test Packages
      Usage of External IDs
      Manual Tests
        Converting Manual Tests to Automated Tests
        Using External Tools to Create Manual Tests
        Test Definitions in the Manual Testing Client
      SilkTest Test Plans
        SilkTest Test Definitions
      Test Definitions
        Upload Manager
        Windows Script Host Tests
    Test Definition Execution
      VMware Lab Manager Integration
        VMware Lab Manager Virtual Configurations
      Execution Dependency Configuration
      Execution Definitions
      Execution Definition Run Results Dialog
      Execution Definition Schedules
      Setup and Cleanup Test Definitions
      Calculating the Test Definition Status
      Manual Test Definitions
        Manual Test Execution
        Tour of the Manual Testing Client UI
        Manual Testing Client
      SilkTest Tests
        SilkTest Logs
        SilkTest Time-out Settings
        Automated Execution of Data-Driven SilkTest Testcases
        Automated Execution of SilkTest Test Definitions
        Specifying Agent Under Test (AUT)
    Issue Management
    Project Management
      Build Information
      Build Information Updates
    Report Generation
      New Report Creation
        New Reports
        SQL Functions for Custom Reports
      Context-Sensitive Reports
      Project Overview Report
      Test Manager 8.0 Reports
      Requirements Reports
        Status Reports
        Progress Reports
        Document Reports
        All Related Issues Report
      Test Plan Reports
        Status Reports
        Progress Reports
        Manual Test Reports
      Execution Reports
        Run Comparison Reports
        Execution Definition Run Comparison Reports
        Test Definition Run Comparison Report
        Execution Definition Run Errors Report
      Code Coverage Reports
        Code Coverage Trend Report
        Method Coverage Comparison Report
      Performance Trend Reports
        Average Page-Time Trend Report
        Average Transaction Busy-Time Trend Report
        Custom Measure Trend Report
        Overall Page-Time Trend Report
        Overall Transaction Busy-Time Trend Report
      Issues Per Component Report
      Code-Change Impact Reports
    Code Coverage Analysis
      Test Manager Code Analysis
      Enabling Code Analysis for SilkCentral Test Manager
      Latest Builds and Build Versions
      Results Compilation
      Code Analysis and the Manual Testing Client

Procedures
  Quick Start Tasks
    Analyzing Test Results - Quick Start Task
      Creating New Reports
      Editing Report Properties
      Editing Report Parameters
      Writing Advanced Queries with SQL
      Customizing BIRT Report Templates
      Adding Subreports
      Viewing Reports
      Displaying Charts
      Generating Code-Change Impact Reports
    Configuring Projects - Quick Start Task
      Configuring Project Settings
      Creating Custom Attributes
      Creating Global Filters
      Enabling Change Notification
      Creating Custom Step Properties
    Managing Requirements - Quick Start Task
      Creating Requirements
      Configuring Requirement Types
      Attaching a File to a Requirement
      Creating Filters
      Creating Advanced Filters
      Generating Test Plans from Requirements View
    Managing Test Executions - Quick Start Task
      Adding Execution Definitions
      Manually Assigning Test Definitions to Execution Definitions
      Assign Test Definitions from Grid View to Execution Definitions
      Using a Filter to Assign Test Definitions to Execution Definitions
      Creating a Custom Schedule for an Execution Definition
      Configuring Setup and Cleanup Executions
      Adding Dependent Execution Definitions
      Assigning Keywords to Execution Definitions
      Executing Individual Tests
      Viewing Test Execution Details
    Managing Test Plans - Quick Start Task
      Creating Test Definitions
      Editing Test Definitions
      Creating a Test Package
      Creating Data-Driven Test Definitions
      Assigning Attributes to Test Definitions
      Adding Predefined Parameters to Test Definitions
      Creating Filters
      Assigning Requirements to Test Definitions
      Attaching Files to Test Plan Elements
  Managing a Successful Test
    Configuring Test Manager Settings
      Configuring Change Notification
        Disabling Change Notification
        Enabling Change Notification
      Configuring Custom Attributes
        Creating Custom Attributes
        Deleting Custom Attributes
        Editing Custom Attributes
      Configuring Custom Step Properties
        Creating Custom Step Properties
        Deleting Custom Step Properties
        Editing Custom Step Properties
      Configuring Data Sources for Data-Driven Tests
        Configuring JDBC Data Sources
        Configuring Microsoft Excel or CSV Data Sources
        Deleting Data Sources
        Downloading Excel Files from a Data Source
        Synchronizing Data Sources
        Uploading Updated Excel Files to a Data Source
      Configuring Global Filters
        Creating Global Filters
        Deleting Global Filters
        Editing Global Filters
      Configuring Issue Tracking Profiles
        Deleting Issue Tracking Profiles
        Managing SilkCentral Issue Manager Issue Tracking Profiles
          Adding SilkCentral Issue Manager Issue Tracking Profiles
          Mapping Issue States
          Editing SilkCentral Issue Manager Issue Tracking Profiles
          Deleting Issue Tracking Profiles
        Managing Borland StarTeam Issue Tracking Profiles
          Adding Borland StarTeam Issue Tracking Profiles
          Mapping Issue States
          Editing Borland StarTeam Issue Tracking Profiles
          Deleting Issue Tracking Profiles
        Managing Bugzilla Issue Tracking Profiles
          Adding Bugzilla Issue Tracking Profiles
          Mapping Issue States
          Editing Bugzilla Issue Tracking Profiles
          Deleting Issue Tracking Profiles
        Managing IBM Rational ClearQuest Issue Tracking Profiles
          Adding IBM Rational ClearQuest Issue Tracking Profiles
          Mapping Issue States
          Editing IBM Rational ClearQuest Issue Tracking Profiles
          Deleting Issue Tracking Profiles
      Configuring Source Control Profiles
        Deleting Source Control Profiles
        Managing Borland StarTeam Source Control Profiles
          Adding StarTeam Source Control Profiles
          Editing StarTeam Source Control Profiles
          Deleting Source Control Profiles
        Managing Serena Version Manager (PVCS) Profiles
          Adding PVCS Source Control Profiles
          Editing PVCS Source Control Profiles
          Deleting Source Control Profiles
        Managing CVS Profiles
          Adding CVS Source Control Profiles
          Editing CVS Source Control Profiles
          Deleting Source Control Profiles
        Managing Microsoft Visual SourceSafe (MSVSS) Profiles
          Adding MSVSS Source Control Profiles
          Editing MSVSS Source Control Profiles
          Deleting Source Control Profiles
        Managing Subversion Profiles
          Adding Subversion Source Control Profiles
          Editing Subversion Source Control Profiles
          Deleting Source Control Profiles
        Managing UNC Profiles
          Adding UNC Source Control Profiles
          Editing UNC Source Control Profiles
          Deleting Source Control Profiles
        Managing VFS Profiles
          Adding VFS Source Control Profiles
          Editing VFS Source Control Profiles
          Deleting Source Control Profiles
      Configuring Project Settings
    Managing Requirements
      Creating Requirements
        Managing Requirement Attachments
          Attaching a File to a Requirement
          Attaching a Link to a Requirement
          Deleting a Requirement Attachment
          Editing a Requirement Attachment Description
          Viewing a Requirement Attachment
        Configuring Requirement Types
        Creating Requirements
        Assigning Test Definitions from Grid View to Requirements
        Assigning Test Definitions to Requirements Manually
        Creating Child Requirements
        Editing Requirements
        Finding Requirement Properties
        Generating Test Plans from Requirements View
        Locating Assigned Test Definitions in the Test Plan Tree
        Marking Requirements as Obsolete
        Removing Test Definition Assignments
        Replacing Requirement Properties
        Sorting the Assigned Test Definitions Tab
        Tracking the History of a Requirement
      Customizing Requirement Properties
        Configuring Custom Requirement Properties
        Deleting Custom Requirement Properties
        Editing Custom Requirement Properties
      Integrating External RM Tools
        Enabling External Requirements Management Integration
          Enabling Integration with Borland CaliberRM
          Enabling Integration with IBM Rational RequisitePro
          Enabling Integration with Telelogic DOORS
        Working with CaliberRM
          Copying CaliberRM-Integrated Projects
        Working with External Properties
          Editing External Properties
          Viewing External Properties
        Deleting Property-Mapping Value Pairs
        Disabling Requirements-Management Integration
        Editing Property Mapping
        Removing Requirements-Management Integration
        Synchronizing Requirements Across Tools
      Collapsing or Expanding the Requirements tree
      Switching Between Full and Direct Coverage Modes
    Managing Test Plans
      Associating Requirements with Test Definitions
        Assigning Requirements to Test Definitions
        Locating Assigned Requirements
        Removing Requirement Assignments
        Sorting Requirements
      Configuring Test Definition Attributes
        Assigning Attributes to Test Definitions
        Deleting Attributes from Test Definitions
        Editing Test Definition Attributes
      Configuring Test Definition Parameters
        Editing Predefined Parameters
        Adding Predefined Parameters to Test Definitions
        Clearing Predefined Parameter Assignments
        Configuring SilkTest Plan Properties
        Configuring .Net Explorer Test Properties
        Configuring JUnit Test Properties
        Configuring Manual Test Properties
        Configuring NUnit Test Properties
        Configuring SilkPerformer Test Properties
        Configuring SilkTest Test Properties
        Configuring Windows Scripting Test Properties
        Creating Custom Parameters
      Creating Test Definitions
        Creating a Test Package
        Creating Test Definitions
        Editing Test Definitions
        Executing a Trial Run of a Test Definition
      Creating Test Plans
        Importing SilkTest Test Plans
      Editing Test Plan Elements
        Adding Links to Containers
        Adding Test Containers
        Adding Test Folders
        Copying, Pasting, and Deleting Test Plan Elements
        Editing SilkTest Tests
        Editing SilkPerformer Tests
        Editing JUnit Tests
        Editing NUnit Tests
        Editing Success Conditions
        Editing Windows Scripting Host Tests
        Finding and Replacing Test Definition Properties
        Modifying Test Containers
        Modifying Test Folders
        Set a Test Plan Node as Integration Default for External Agile Planning Tools
      Working with Attachments
        Deleting Attachments from Test Plan Elements
        Attaching Files to Test Plan Elements
        Attaching Links to Test Plan Elements
        Editing Attachment Descriptions
        Viewing Test Plan Attachments
      Working with Data-Driven Tests
        Adding a Data Source Value to a Manual Test Step
        Creating Data-Driven Test Definitions
        Downloading CSV Data From a Data Source
        Editing Data-Driven Properties
      Working with Manual Tests
        Converting Manual Test Definitions to Automated Tests
        Editing Manual Test Steps From Within Test Manager
      Working With Test Definitions in Grid View
        Creating an Execution Definition in Grid View
        Displaying/Hiding Columns in Grid View
        Filtering Test Definitions in Grid View
        Grouping Test Definitions in Grid View
        Linking to Test Definitions from Grid View
        Removing Grid View Filters
        Reordering Columns in Grid View
        Resizing Columns in Grid View
        Restoring Default Grid View Settings
        Sorting Test Definitions in Grid View
      Creating a Filter for a Folder or Container
      Expanding/Collapsing the Test Plan tree
      Tracking Test Plan History
      Updating Execution Definitions
      Using Upload Manager
      Viewing Assigned Executions
      Viewing Recent Changes
    Executing Test Definitions
      Analyzing Test Runs
        Changing the Status of a Test Execution Run
        Deleting Individual Test Run Results
        Deleting the Results of an Execution Definition
        Viewing Test Execution Details
      Assigning Test Definitions to Execution Definitions
        Locating Test Definitions Assigned to Execution Definitions
        Removing Test Definition Assignments
        Assign Test Definitions from Grid View to Execution Definitions
        Creating an Execution Definition in Grid View
        Manually Assigning Test Definitions to Execution Definitions
        Using a Filter to Assign Test Definitions to Execution Definitions
      Configuring Deployment Environments
        Adding a SilkTest AUT Host
        Removing a Tester Assignment from an Execution Definition
        Adding Manual Testers
        Assigning Keywords to Execution Definitions
        Creating New Keywords
        Removing Keywords from Execution Definitions
      Configuring Execution Dependencies
        Adding Dependent Execution Definitions
        Deleting a Dependency
        Editing a Dependency
      Defining Execution Definition Schedules
        Adding Definite Runs
        Adding Exclusions
        Creating a Custom Schedule for an Execution Definition
        Deleting Definite Runs
        Editing Definite Runs
        Specifying Global Schedules for Execution Definitions
Specifying No Schedule for Execution Definitions ............................................................ Deleting Exclusions ............................................................................................................ Editing Exclusions .............................................................................................................. Executing Manual Tests ............................................................................................................ Using the Manual Testing Client ....................................................................................... Configuring the Manual Testing Client ....................................................................... Configuring Connection Parameters .................................................................... Configuring Other Settings ................................................................................... Configuring Package Upload Preferences ........................................................... Managing Attachments with the Manual Testing Client ............................................. Pasting Screen Captures ..................................................................................... Uploading Attachments to the Manual Testing Client .......................................... Viewing Attached Images Within the Manual Testing Client ................................ Viewing Attachments Within the Manual Testing Client ....................................... Adding an Internal Issue with the Manual Testing Client ............................................ Changing a Test Definitions Status ............................................................................ Downloading Execution Definition Packages .............................................................. Editing Package Build Numbers .................................................................................. Editing Test Definitions Within the Manual Testing Client ........................................... Enabling Code Analysis Within the Manual Testing Client ......................................... Executing Manual Tests with the Manual Testing Client ............................................. Exporting and Importing Execution Packages ............................................................ Installing SilkCentral Manual Testing Client ................................................................ Uploading Test Results to Test Manager .................................................................... Viewing and Editing Test Definitions in Test Manager ................................................ Working Offline with the Manual Testing Client .......................................................... Aborting Manual Test Executions ....................................................................................... Executing Manual Tests ..................................................................................................... Executing Manual Tests in the Current Run Page ............................................................. Running Automated Tests ......................................................................................................... Executing Individual Tests .................................................................................................. Working with Execution Definitions ........................................................................................... 
Adding Execution Definitions .............................................................................................. Copying Execution Definitions ............................................................................................ Deleting Execution Definitions ............................................................................................ Editing Execution Definitions .............................................................................................. Working with SilkPerformer Projects ......................................................................................... Analyzing SilkPerformer Test Results ................................................................................ Downloading SilkPerformer Test Result Packages ............................................................ Downloading SilkPerformer Projects .................................................................................. Editing SilkPerformer Test Properties ................................................................................ Executing Attended SilkPerformer Tests ............................................................................ Opening SilkPerformer Projects ......................................................................................... Uploading SilkPerformer Test Results ............................................................................... Collapsing or Expanding the Execution Tree ............................................................................ Configuring Setup and Cleanup Executions .............................................................................. Creating Data-Driven Execution Definitions .............................................................................. Managing Issues ............................................................................................................................... Tracking Issues .......................................................................................................................... Viewing Issue Statistics in Details View ............................................................................. Viewing Issue Statistics in Document View ........................................................................ Working with Issues ................................................................................................................... Assigning External Issues .................................................................................................. Creating New Issues .......................................................................................................... Deleting Issues ...................................................................................................................


Specifying a Calendar Range ............................................................................................. Synchronizing Internal/External Issue States ..................................................................... Managing Projects ............................................................................................................................ Managing Folders ...................................................................................................................... Copying Folders ................................................................................................................. Cutting Folders ................................................................................................................... Deleting Folders ................................................................................................................. Editing Folders ................................................................................................................... Pasting Folders .................................................................................................................. Pasting Folders as Child Folders ....................................................................................... Sorting Folders ................................................................................................................... Adding Folders ................................................................................................................... Creating Build Information Files ................................................................................................ Selecting Projects ...................................................................................................................... Managing Activities ........................................................................................................................... Deleting Last Executions Runs ................................................................................................ Displaying/Hiding Columns on the Activities Page .................................................................... Entering Issues From the Activities Tab .................................................................................... Filtering Test Runs on the Activities Page ................................................................................. Grouping Test Runs on the Activities Page ............................................................................... Removing Activities Filters ......................................................................................................... Reordering Columns on the Activities Page .............................................................................. Resizing Columns on the Activities Page .................................................................................. Restoring Default Activities Page View Settings ....................................................................... Sorting Test Runs on the Activities Page .................................................................................. Managing Reports ............................................................................................................................ Creating Reports ....................................................................................................................... 
Creating New Reports ........................................................................................................ Writing Advanced Queries with SQL .................................................................................. Customizing Reports with BIRT ................................................................................................. Customizing BIRT Report Templates ................................................................................. Downloading Report Templates ......................................................................................... Generating Reports ................................................................................................................... Using Context-Sensitive Reports ....................................................................................... Accessing Context-Sensitive Reports ......................................................................... Accessing Context-Sensitive Execution Reports ................................................. Accessing Context-Sensitive Requirements Reports .......................................... Accessing Context-Sensitive Test-Definition Reports .......................................... Enabling Context-Sensitive Reports ........................................................................... Enabling Context-Sensitive Execution Reports ................................................... Creating New Reports ................................................................................... Writing Advanced Queries with SQL ............................................................ Enabling Context-Sensitive Requirements Reports ............................................. Creating New Reports ................................................................................... Writing Advanced Queries with SQL ............................................................ Enabling Context-Sensitive Test-Plan Reports .................................................... Creating New Reports ................................................................................... Writing Advanced Queries with SQL ............................................................ Removing Report Templates .............................................................................................. Saving Reports ................................................................................................................... Uploading Report Templates .............................................................................................. Viewing a Report as a PDF ................................................................................................ Viewing Reports ................................................................................................................. Adding Subreports ..................................................................................................................... Deleting Subreports ...................................................................................................................


Displaying Charts ...................................................................................................................... Accessing MRU (Most Recently Used) Reports ........................................................................ Editing Report Parameters ........................................................................................................ Editing Report Properties .......................................................................................................... Printing Charts ........................................................................................................................... Removing Charts ....................................................................................................................... Working with Filters ........................................................................................................................... Applying Filters .......................................................................................................................... Creating Advanced Filters ......................................................................................................... Creating Filters .......................................................................................................................... Deleting Filters ........................................................................................................................... Editing Filters ............................................................................................................................. Analyzing Code Coverage ................................................................................................................ Enabling Code Analysis for Execution Definitions ..................................................................... Generating Code-Change Impact Reports ................................................................................ Viewing Code-Coverage Information for Packages ................................................................... Enabling Code Analysis Within the Manual Testing Client ........................................................


Reference User Interface Reference ......................................................................................................................... Projects Unit Interface ...................................................................................................................... Projects tab ................................................................................................................................ Overview tab .............................................................................................................................. Activities Page ........................................................................................................................... Cross-Project Activities Page .................................................................................................... Test Definition Run Results Dialog ............................................................................................ Settings Unit Interface ...................................................................................................................... Project Settings tab ................................................................................................................... Filters tab ................................................................................................................................... Attributes tab ............................................................................................................................. Requirement Properties Page ................................................................................................... Step Properties Page ................................................................................................................ Notifications Page ...................................................................................................................... Integrations Configuration tab ................................................................................................... Data Sources Configuration Page ............................................................................................. Issue Tracking Profiles Page ..................................................................................................... Source Control Profiles Page .................................................................................................... Requirements Unit Interface ............................................................................................................. Requirements Document View .................................................................................................. Requirements Toolbar Functions .............................................................................................. Requirement Properties tab ....................................................................................................... Requirement Attachments tab ................................................................................................... Assigned Test Definitions tab .................................................................................................... Requirement Coverage tab ....................................................................................................... Requirement History tab ............................................................................................................ 
Test Plan Unit Interface .................................................................................................................... Test Plan Document View ......................................................................................................... Test Plan Grid View ................................................................................................................... Test Plan Properties tab ............................................................................................................ Test Plan Steps Page ................................................................................................................ Test Plan Contents Tab ............................................................................................................. Test Plan Attributes tab ............................................................................................................. Test Plan Parameters tab .......................................................................................................... Test Plan Assigned Requirements tab ...................................................................................... Test Plan Attachments tab ........................................................................................................ Test Plan Assigned Executions tab ........................................................................................... Test Plan Runs tab .................................................................................................................... Test Plan Issues Page ............................................................................................................... Test Plan History tab ................................................................................................................. Test Plan Data Set tab .............................................................................................................. Test Plan Toolbar Functions ...................................................................................................... Test Definition Run Results Dialog ............................................................................................ Execution Unit Interface .................................................................................................................... Execution Document View ......................................................................................................... Execution Properties tab ........................................................................................................... Execution Assigned Test Definitions Tab .................................................................................. Execution Setup/Cleanup tab .................................................................................................... Execution Schedule tab ............................................................................................................. Execution Deployment tab ......................................................................................................... Execution Dependencies tab ..................................................................................................... Execution Notifications Page ..................................................................................................... 
Execution Runs Tab ..................................................................................................................

Current Run Page ...................................................................................................................... Run Dialog ................................................................................................................................. Execute Test Dialog Box ........................................................................................................... Test Definition Run Results Dialog ............................................................................................ Code Analysis Unit Interface ............................................................................................................ Code Analysis Details tab .......................................................................................................... Select Classes for Report Dialog ............................................................................................... Issues Unit Interface ......................................................................................................................... Issues Document View .............................................................................................................. Issues tab .................................................................................................................................. Calendar Tool ............................................................................................................................ Reports Unit Interface ....................................................................................................................... Report Properties tab ................................................................................................................ Report Parameters tab .............................................................................................................. Report Data tab ......................................................................................................................... Report Chart tab ........................................................................................................................ Report tab .................................................................................................................................. Reports Toolbar Functions ........................................................................................................ General Reference ................................................................................................................................... HTML Support for Description Text Boxes ....................................................................................... Multi-Select Functionality for Test Plan Elements ............................................................................ SQL Functions for Custom Reports .................................................................................................. APIs .......................................................................................................................................................... Database Schemas ..................................................................................................................................


Getting Started
This section explains the concepts and procedures related to getting started using Test Manager.

In This Section

Concepts
This section explains the concepts that are required for getting started with Test Manager.

Procedures
This section explains the tasks that must be performed before you can begin using SilkCentral Test Manager.

Quick Start Tasks
Quick Start Tasks are high-level overviews of the main tasks that you will likely need to perform with SilkCentral Test Manager.

Glossary
Guide to Test Manager terminology.


Concepts
This section explains the concepts that are required for getting started with Test Manager.

In This Section

What's New in Borland SilkCentral Test Manager 2009
New functionality and updates in SilkCentral Test Manager 2009.

Tour of the UI
An overview of the main elements of Test Manager's user interface.

Help on Help
How information is organized in Test Manager Help.

Introduction to SilkCentral Test Manager
Test Manager is a complete testing solution, from requirements and test-plan management, to test-execution management, code-coverage analysis, issue tracking, and reporting.


What's New in Borland SilkCentral Test Manager 2009


SilkCentral Test Manager 2009 provides significant enhancements that support teams working in Agile software development environments. It also offers new features, enhancements, and changes related to general product maintenance.

Support of Agile Teams


The following enhancements support teams working in Agile software development environments.

Usability Enhancements
Usability enhancements for Agile teams have been made throughout SilkCentral Test Manager.

New Manual Test Web UI (Current Run Page)
The new Current Run page facilitates the administration and execution of manual tests by Agile teams in Test Manager's GUI. All information related to the current manual execution is consolidated onto a single page acting like a list of test tasks. This approach provides each Agile team member with an overview of the tests that are finished, in progress, and not yet executed. The former Manual Test Detail view, Step-by-Step view, and Step-by-Step Only view are consolidated onto a single page. Three separate grids on the page show detailed information related to the active execution definition run, the assigned test definitions, and the test steps.
The new page enables you to easily keep test-management data updated with changing testing needs. For example, the page displays additionally assigned test definitions, changes to test definition specifications, and changes to test steps. Such changes can be entered directly in the Current Run page. For enhanced usability the Current Run page offers standard keyboard support for easy navigation and grid functionality for customizing and interacting with the displayed data. The Manual Testing Client is still available and remains the best choice for traditional manual testing for testers working with assigned packages of manual test definitions.
Note: For automated tests, the Current Run page shows the progress of executions.

Seamless Automation of Manual Test Definitions
Going Agile typically involves increased test automation. Test Manager supports automation efforts with seamless transformation of manual test definitions into automated test definitions. All test results and historical data are retained after transformation.

Integration Enhancements
Test Manager now integrates with Agile management tools.

Agile Project Template
SilkCentral Test Manager provides an Agile project template that facilitates integration with Agile management tools, for example VersionOne. This template also suggests a possible workflow for using SilkCentral Test Manager in your Agile software development environment.

VersionOne
VersionOne is a leading Agile management tool. It allows you to manage your user stories in an Agile way. Its integration with Test Manager extends its user-story management capabilities with a powerful testing component that brings both test result and status information to VersionOne. This integration enables you to stay up-to-date with the status of your user stories. Test Manager supports both editions of VersionOne, Agile Enterprise and Agile Team. To integrate VersionOne with a Japanese Test Manager, change the start options of the Application Server service in the registry to -Dfile.encoding=utf-8.

Set Integration Default Node for Agile Planning Tools
You can now set a folder or container in the test plan tree as the integration default node where you can create tests through a Web Service call from an external Agile planning tool.

General Product Enhancements and Maintenance


The following are the significant enhancements, features, and changes related to general product maintenance.

Usability Enhancements
Usability enhancements have been made throughout SilkCentral Test Manager.

Master/Detail Grid Views of Execution Definition Runs and Test Definition Runs
The new grid views on the Test Manager Execution Runs page offer view settings, including resizing and reordering of columns, filtering, sorting, and grouping options that are configurable on a per-user basis. You can display or hide columns, adjust the width of columns, and move columns around by clicking on a column and dragging it to the desired location. You can use the keyboard to navigate through the runs. The page is now split into two separate sections, one listing the execution definition runs and a second listing the test definition runs that are included in the selected execution definition run.

Additional Triggering Information for Execution Definition Runs
SilkCentral Test Manager provides additional information related to how execution definition runs are started, for example through the Web or through a schedule.

Email Notification of Finished Execution Definition Runs
You can now configure email notification for finished execution definition runs of specific execution definitions. Notifications vary based on the result of the execution definition run.

Enhanced Requirement Synchronization
The time involved in synchronizing CaliberRM requirements has been dramatically reduced. SilkCentral Administration Module now uses enhanced CaliberRM functionality to return only those requirements that include changes. It no longer needs to check through all requirements to identify changes.

Project-Context Management of Source Control and Issue Tracking Profiles
Source control profiles and issue tracking profiles are now maintained independently for each project in Test Manager ➤ Settings.

Test-Overload Restriction
When a new execution definition run is triggered through a schedule or dependency, Test Manager now skips this new run if another run of the same execution definition, also triggered by a schedule, is currently executing. Test Manager then writes a warning to the application server log file. This restriction prevents an overload of Test Manager with a large number of current runs when schedule intervals are too short, especially for schedules on the folder level.

Integration Enhancements
SilkCentral Test Manager has been enhanced to better integrate with other applications.


Access to Execution-Definition Run Properties
You can now get Test Manager execution definition run properties, for example the name or description of the execution definition, during the execution of SilkPerformer tests. Use the AttributeGet method to access the properties in SilkPerformer scripts.

Enhanced VMware Lab Manager Integration
Test Manager now enables you to specify the organization to which a user belongs when you configure access to a VMware Lab Manager (Lab Manager) server. Lab Manager uses organizations to determine which resources a user can access. A user with administrator privileges for Lab Manager creates organizations and adds users to organizations. If a user is not assigned to the selected organization in Lab Manager, an error message displays in Test Manager. For more information on the use of organizations in Lab Manager, refer to Lab Manager documentation. Refer to the SilkCentral Administration Module 2009 Help for information on Lab Manager integration with Test Manager.

API Enhancements
The Test Manager API has been enhanced to support additional features.

Updating Manual Test Definitions During Upgrade
When upgrading to a newer version of SilkCentral Test Manager, you can now update your manual test definitions, including all test steps, through the Test Manager Web Service API. Refer to the Javadoc for full details regarding available Java classes and methods. To access the Javadoc, click Help ➤ Documentation ➤ Test Manager API Specification.

Web Service Demo Client
The Web Service Demo Client is now shipped with Test Manager. Download the client from Test Manager ➤ Help ➤ Tools. Refer to the Test Manager API Help for more information.

New getProductNameById(long sessionId, int productId) Method
You can now use the ID of a product to get the name of the product. The new method is located in the sccentities Web Service. Refer to the Javadoc for full details regarding available Java classes and methods. To access the Javadoc, click Help ➤ Documentation ➤ Test Manager API Specification.

Database Model Enhancements


Test Manager 2009 has been enhanced to deliver results faster and with lower memory usage.

Refactored Execution and Test Plan Components in the Test Manager Database Model
The new enhanced execution and test-plan components of the Test Manager database model better facilitate the writing of reports on results. Upgrading the database to the current version requires additional time due to these changes. When upgrading, ensure that you have sufficient disk space for the database transaction log. Refer to the Database Model Help for details.

Copy Projects Faster With Less Memory
Large projects can now be copied faster and with greatly reduced memory consumption. To use the enhanced copy-project functionality when you use a Microsoft SQL Server as your database server, you have to enable the "snapshot" isolation level. To enable snapshot isolation, log in to your database server as an administrator and execute the following command: ALTER DATABASE <your database name> SET ALLOW_SNAPSHOT_ISOLATION ON.
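For example, the following minimal JDBC sketch issues the same command against a SQL Server instance. The connection URL, the login, and the database name SCTM_DB are illustrative placeholders rather than values taken from this documentation, and the Microsoft JDBC driver must be on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class EnableSnapshotIsolation {
    public static void main(String[] args) throws Exception {
        // With older (pre-JDBC 4) drivers, load the driver class explicitly first:
        // Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver");
        // Placeholder connection details: point the URL at your SQL Server host and
        // connect as a login that has ALTER DATABASE permission.
        Connection con = DriverManager.getConnection(
                "jdbc:sqlserver://dbserver:1433", "sa", "<password>");
        try {
            Statement stmt = con.createStatement();
            // SCTM_DB is a placeholder for the name of your Test Manager database.
            stmt.executeUpdate("ALTER DATABASE SCTM_DB SET ALLOW_SNAPSHOT_ISOLATION ON");
            stmt.close();
        } finally {
            con.close();
        }
    }
}

Running the statement from a SQL administration console, as described above, works just as well; the sketch only illustrates that any administrative connection can issue the command.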


Documentation Enhancements
The documentation for Test Manager 2009 is also enhanced.

Eclipse Help for Test Manager Help Systems
The Installation, API, Database Model, and Office Import Tool Help systems are now available in Eclipse Help format. You can now view Test Manager Help topics alongside other Borland product Help topics in a single, integrated Eclipse Help browser. Having all Help systems on a common delivery platform greatly improves the consistency of Help topics across the tools and makes it easier to find answers to your questions.

Improved Documentation Page
The Help ➤ Documentation page has been improved: You can now access each Help system as Eclipse Help by clicking the Help system's name. Click the PDF link located to the right of the Help system's name to access the Help as a PDF.

Technology Updates
SilkCentral Test Manager now ships with new versions of third-party software.

Microsoft SQL Server 2008
SilkCentral Test Manager now supports Microsoft SQL Server 2008. Be sure to set up case-insensitive SQL 2008 servers, because case-sensitive SQL 2008 servers are not supported.

Sunset Microsoft SQL Server 2000
SilkCentral Test Manager no longer supports Microsoft SQL Server 2000.

Java
SilkCentral Test Manager now ships and runs with Java 1.6.0_13.

Microsoft Windows Server 2008
SilkCentral Test Manager now supports Microsoft Windows Server 2008 (32-bit).

Microsoft Internet Information Services 7
SilkCentral Test Manager now supports Microsoft Internet Information Services (IIS) 7 as its Web server. IIS 7 was tested on Windows Server 2008 Enterprise Service Pack 1 (English) and Windows Server 2008 Enterprise Service Pack 2 (English).

Related Concepts Synchronizing Requirements Working With SilkPerformer Projects Converting Manual Tests to Automated Tests Execution Definition Schedules Related Procedures Set a Test Plan Node as Integration Default for External Agile Planning Tools Related Reference Execution Runs Tab Current Run Page Execution Notifications Page


Tour of the UI
Here is an overview of the main elements of Test Manager's user interface.

Basic UI Structure
Test Manager's GUI has four main components:

A: Workflow bar - Facilitates the primary actions related to test management. Click an icon to display the corresponding test management area in the unit window below. The workflow bar is designed around the natural progression of test management activities, from the establishment of new projects and requirements all the way through to issue management and reporting. See the section below for further details.

B: Navigation Tree - Provides the same functionality offered by the workflow bar. Additionally offers you access to Administration functions and Help. This tree can be hidden/displayed by clicking the separator bar along the Navigation Tree's right side. The hide/display setting of the navigation tree is saved for each user account.

C: Unit window - Shows the functional work area of the currently selected Test Manager unit. This view changes based on the unit you are working in and your activities.

D: Environmental Info - Displays your user name and the active project. Click the logout button to log out of Test Manager.

Workflow Bar
Test Manager's workflow bar gives you quick access to Test Manager's core functional units (Projects, Settings, Requirements, Test Plan, Execution, Activities, Code Analysis, and Reports). Buttons for each of these units are available on the workflow bar:


Workflow Button: Description

Projects: Click Projects to go to the Projects unit, which offers a high-level test manager's view of all projects in your Test Manager installation. The Projects unit enables you to move between projects, see high-level project status details, and view current execution statistics.

Settings: Click Settings to configure system settings (available functionality varies based on your user role) such as filters, project settings, change notification, and more.

Requirements: Click Requirements to go to the Requirements unit, which enables you to maintain control over your project's requirements during development: managing the creation, modification, and deletion of requirements; association of test definitions with requirements; change history tracking; and the ability to generate test plans directly from requirement lists.

Test Plan: Click Test Plan to go to the Test Plan unit, which enables you to create and manage test plans, including the creation of test definitions of both automated tests (SilkPerformer, SilkTest, JUnit, NUnit & WSH) and manual tests.

Execution: Click Execution to go to the Execution unit, which enables you to configure test execution definitions, schedule test executions, assign test definitions to test executions, set up execution-definition dependencies, and configure execution-server deployment.

Activities: Click Activities to go to the Activities tab in the Projects unit. The Activities tab displays recently-executed, current, and upcoming execution definition activity on a per-project basis.

Code Analysis: Click Code Analysis to go to the Code Analysis unit where you can evaluate the degree to which the code in your AUT (Application Under Test) is covered by test cases. You can then make informed estimates regarding effort/cost and risk associated with specific code changes.

Reports: Click Reports to go to the Reports unit where you can generate reports with SilkCentral Test Manager, download report templates, edit report parameters, and create new reports based on pre-installed templates.

Online Help: Click the Online Help icon in the lower-right corner of the workflow bar to view context-sensitive help for the current page.

Bookmark page: Click the Bookmark page icon in the lower-right corner of the workflow bar to bookmark the current Test Manager page. This is especially useful for bookmarking reports, where the current parameters are saved in the bookmarked URL.

Print page: Click the Print page icon in the lower-right corner of the workflow bar to print any Test Manager page.

Context Menu Commands


Test Manager supports Windows-style context menus across many test management elements (test definitions, requirements, execution definitions, folders, containers, reports, and more). Available through right mouse-click, context menu commands typically include those commands that are available from each unit's toolbar. For elements listed in tree views, context menus offer commands for expanding and collapsing tree view elements. Commands that are not available to selected elements are grayed out.


Related Concepts Getting Started Successful Test Management Related Procedures Logging into Test Manager Managing a Successful Test Related Reference User Interface Reference


Help on Help
This topic explains how information is organized in Test Manager Help.

Test Manager Help


Test Manager Help includes conceptual overviews and procedural tasks. This allows you to navigate from general to more specific information as needed. Additionally, the persistent navigation panes in the Help window make it easier to locate information.

Concepts
The conceptual overviews provide information about product architecture, components, and other information you need to help you work with Test Manager. At the end of most of the overviews, you will find links to related, more detailed information. A Web icon indicates that a link leads to an external Web site.

Procedures
Procedures provide step-by-step instructions. All procedures are located under Procedures in the Content pane of the Help window. Additionally, most of the conceptual overviews provide links to related procedures.

Typographic Conventions Used in Help


The following typographic conventions are used throughout Test Manager Help.
Convention: Used to indicate

Monospace type: Source code and text that you must type.
Boldface: References to dialog boxes and other user interface elements.
Italics: Identifiers, such as variables. Italicized text is also used to emphasize new terms.
Web icon: A link to Web resources.

Related Concepts Getting Started Successful Test Management Related Procedures Managing a Successful Test Related Reference User Interface Reference


Introduction to SilkCentral Test Manager


SilkCentral Test Manager delivers a complete testing solution, from requirements and test-plan management, to test-execution management, code-coverage analysis, issue tracking, and reporting. Test Manager allows you to utilize existing SilkTest and SilkPerformer scripts, and creates scheduled tests based on those scripts. In a few simple steps your test cases can be run automatically, based on individually configurable schedules.

In This Section

Welcome to SilkCentral Test Manager
Manage your testing, from requirements management, through test planning, test execution, code analysis, and issue management with SilkCentral Test Manager.

Installing and Licensing Test Manager
Installing and licensing SilkCentral Test Manager.

SilkCentral Issue Manager
SilkCentral Issue Manager, SilkCentral's issue-tracking tool, is fully integrated with Test Manager.

Working With SilkPerformer Projects
SilkPerformer projects can be integrated into Test Manager test plans and directly executed through Test Manager.

Working with Silk Performance Explorer
Silk Performance Explorer is used for in-depth analysis of test runs.

Software Quality Optimization
Test Manager promotes product quality throughout the development cycle.

SilkCentral Administration Module
Help for SilkCentral Administration Module administrators.


Welcome to SilkCentral Test Manager


As a key component of your complete testing solution, Test Manager's Requirements unit enables you to maintain control over system requirements during development: managing the creation, modification, and deletion of requirements; association of test definitions with requirements; change-history tracking; and the ability to generate test plans directly from requirement lists. As with all Test Manager functionality, the Requirements unit is 100% Web enabled and accessible through your Web browser.

Test Manager's Test Plan unit enables you to maintain control over test planning across the system development lifecycle. The Test Plan unit allows you to create and manage test plans, including the definition of both automated (SilkPerformer, SilkTest, JUnit, NUnit, and Windows Scripting) tests and manual tests. Files and links can be uploaded and associated with test containers and definitions as attachments. Issues that are uncovered can easily be associated with the test definitions that led to their discovery. Full history of all changes to test plans is also tracked.

The Test Execution unit enables you to configure test scenarios from the Test Plan unit and to schedule those scenarios for execution on your execution servers. Test definitions can be statically assigned to execution definitions, or test definitions can be grouped dynamically using predefined filters on the Test Plan tree. Custom schedules can be defined for execution definitions, or predefined schedules can be used.

The Code Analysis unit enables you to evaluate the degree to which the code of your AUT is covered by test cases. Intuitive histograms display the percentage of coverage provided for products, code packages, lines of code, class files, and methods. Code-coverage analysis enables you to make informed estimates regarding effort/cost and risk associated with specific code changes.

Related Concepts Getting Started Successful Test Management Related Procedures Managing a Successful Test Related Reference User Interface Reference


Installing and Licensing Test Manager


For information regarding the installation and licensing of Test Manager, please refer to the Test Manager Installation Help, which is available from both the Test Manager installation CD and the Test Manager download site at Borland.com. Related Concepts Getting Started Successful Test Management Related Procedures Managing a Successful Test Related Reference User Interface Reference


SilkCentral Issue Manager


SilkCentral Issue Manager, SilkCentral's issue-tracking tool, is fully integrated with Test Manager, enabling you to correlate issues with system requirements and executed tests. Test-definition issues can be added and managed through the Issues tab in the Test Plan unit (Test Plan View). See Issue Manager documentation for details. Note: Borland StarTeam and IBM Rational ClearQuest are also supported by Test Manager out of the box. Additional issue tracking systems can be configured by installing a custom plug-in. See SilkCentral Test Manager API documentation for detailed information. Related Concepts Issue Management Getting Started Successful Test Management Related Procedures Tracking Issues Managing a Successful Test Related Reference User Interface Reference


Working With SilkPerformer Projects


SilkPerformer is fully integrated with Test Manager's test planning and test execution functionality. SilkPerformer projects can be integrated into Test Manager test plans and directly executed through Test Manager. This allows for powerful test-result analysis and reporting. It also enables unattended testing (tests that are run automatically by Test Manager based on pre-configured schedules). See SilkPerformer Help for details on configuring SilkPerformer's integration with Test Manager. SilkPerformer project files can be directly opened in SilkPerformer from Test Manager, where scripts and settings can be edited. Edited SilkPerformer projects can subsequently be checked back into Test Manager to make them available for future test executions.

Test Manager provides information on execution definition run properties during SilkPerformer test executions. Use the AttributeGet methods to access execution definition run properties in the SilkPerformer script. You can access the following properties in the script:

#sctm_execdef_id
#sctm_execdef_name
#sctm_product
#sctm_version
#sctm_build
#sctm_keywords


Note: The term Project is used differently in SilkPerformer than it is in Test Manager. A SilkPerformer project, when uploaded to Test Manager, becomes the core element of a Test Manager test definition. Test Manager projects are high-level entities that may include multiple SilkPerformer projects, test definitions, execution definitions, and requirements. Related Concepts Getting Started Successful Test Management Related Procedures Working with SilkPerformer Projects Creating Test Definitions Managing a Successful Test Related Reference User Interface Reference


Working with Silk Performance Explorer


Silk Performance Explorer is used for in-depth analysis of test runs. Performance Explorer results analysis can be started directly from Test Manager's Executions unit and Test Plan unit through execution runs on the Runs tab or from Performance Explorer itself. See Performance Explorer documentation for details regarding Performance Explorer's integration with Test Manager. The results of load-test runs in SilkPerformer can also be uploaded to Test Manager and associated with test definitions. See SilkPerformer Help for more details. For additional information about Test Manager's integration with SilkPerformer, see SilkPerformer Help and the Performance Explorer User Guide. Related Concepts Getting Started Successful Test Management Related Procedures Working with SilkPerformer Projects Creating Test Definitions Managing a Successful Test Related Reference User Interface Reference


Software Quality Optimization


Today's e-business systems are increasingly complex, and reliability is more important than ever. Therefore, assuring product quality throughout the development cycle is an important key to success. The best way to check a product's quality over time is to perform key tests on a daily basis. Test Manager helps in this regard by automating test executions that follow freely configurable schedules, both during product development and after deployment. Because Test Manager provides reports with different levels of detail, checking the status of products in development is as straightforward as checking an HTML report in a Web browser. With complex software projects, thorough testing of new builds is critically important. Test Manager saves time and man-hours by automating this process. Related Concepts Getting Started Successful Test Management Related Procedures Managing a Successful Test Related Reference User Interface Reference


SilkCentral Administration Module


SilkCentral products are based on SilkCentral Architecture (SCA), which allows for common administration of Web-based products. These administrative tasks are performed through the SilkCentral Administration Module.

In This Section

SilkCentral Architecture
This section provides an overview of SilkCentral's architecture.

Access and Licensing
This section provides a brief overview of how to license SilkCentral.


SilkCentral Architecture
SilkCentral products are based on SilkCentral Architecture (SCA), which allows for common administration of Web-based products. The following sections describe the SilkCentral components.

Overview

The SilkCentral architecture comprises the following components: front-end server, application server, execution server, chart server, and database server.

Front-end server
The front-end server is responsible for the graphical user interface. This server is based on HTML and is accessible from any Web browser, such as Internet Explorer or Firefox. A user sends an appropriate HTTP request to the front-end server and receives a login page for authentication. After successful login, the user can use the corresponding application based on the respective user rights. The front-end server can operate as a stand-alone HTTP server, or it can be attached to a Web server, such as IIS via ISAPI filter.

Application server
The application server synchronizes tasks such as the distribution of schedules, control of execution servers, and management of database configuration. These tasks require a centralized agency to ensure the consistent, reliable behavior of the application. The application server also evaluates results, saves them to the database, and sends alerts based on success conditions.

Execution server
The execution server executes SilkTest and/or SilkPerformer tests that are scheduled by authorized users. Users are responsible for the proper configuration of execution servers and additional resources that are required for test executions. The system allows for the installation and configuration of multiple execution servers working independently of one another.

Chart server
The chart server is used to generate charts that are viewed in reports. The system allows for the configuration of a pool of chart servers. A built-in load balancing mechanism uses the pool to distribute chart generation. The chart server is also used to generate reports and deliver them directly to the end-user for viewing within a browser.

Database server
System persistency is implemented using a RDBMS (Relational Database Management System). SilkCentral supports MS SQL Server 2005 and 2008 (including Express), Oracle 9i (version 9.2.0.8 or later), and Oracle 10g (Borland recommends version 10.2).

Agent computers
SilkPerformer and SilkTest agent computers are assigned to particular SilkPerformer / SilkTest projects from the pool of agent computers that are available to the controller computer. In combination with SilkCentral Test Manager, the controller computer acts as an execution server.

SilkPerformer agents
SilkPerformer agent computers host the virtual users that are run during load tests. As many agent computers as necessary can be added to a SilkPerformer project so that the required quantity of virtual users can be run. Configuration of agents is done through SilkPerformer. See SilkPerformer documentation for details on configuring agents.

SilkTest agents
The same rules that apply to SilkPerformer agents apply to SilkTest agents, except SilkTest agents host SilkTest tests.


Related Concepts Getting Started Successful Test Management Related Procedures Managing a Successful Test Related Reference User Interface Reference


Access and Licensing


SilkMeter, the licensing software that accompanies Borland's products, determines the SilkCentral-application functionality that you may access. For more information on licensing, see the respective product's installation instructions. Related Concepts Getting Started Successful Test Management Related Procedures Managing a Successful Test Related Reference User Interface Reference


Procedures
This section explains the tasks you must perform before you can begin using Test Manager.

In This Section

Configuring Browser Settings
How to optimize your browser settings for use with Test Manager.

Logging In and Out of Test Manager
How to log into and out of SilkCentral Test Manager.


Configuring Browser Settings


For optimal performance when working with Test Manager, ensure that your browser does not reload Web pages after each update.

To cache information with Microsoft Internet Explorer:


1. Select Start ➤ Settings ➤ Control Panel ➤ Internet Options.
2. On the Internet Properties dialog box, select the General tab, if not already selected.
3. In the Temporary Internet files area (Browsing history area in Microsoft Windows XP), click Settings. The Settings dialog box displays.
4. In the Check for newer versions of stored pages section, select Automatically.
5. Click OK.
6. Click OK again on the following dialog box.

Note: When running Internet Explorer 6 with Service Pack 1 on a Windows 2003 system, Web page contents may appear black when you open a dialog box. This is caused by a security feature that was introduced with Internet Explorer Service Pack 1. To remedy this issue, add the Test Manager server to your list of trusted sites:

Configuring IE for Windows 2003


1. Select Start ➤ Settings ➤ Control Panel ➤ Internet Options.
2. On the Internet Properties dialog box, select the Security tab.
3. Select the Trusted sites icon.
4. Click Sites.
5. Enter the URL of your SilkCentral Test Manager host in the Add this Web site to the zone field (for example, http://MyTestManagerHost).
6. Click Close.
7. Click OK to complete the configuration.

Related Concepts Getting Started Successful Test Management Related Procedures Managing a Successful Test Related Reference User Interface Reference


Logging In and Out of Test Manager


Procedures explaining how to log into and out of SilkCentral Test Manager.

In This Section

Logging into Test Manager
How to log into Test Manager.

Logging out of Test Manager
How to log out of Test Manager.


Logging into Test Manager


To log into Test Manager:
1. Navigate to the IP address or URL of your Test Manager installation.
Note: Speak to your system administrator about the URL, username, and password you should use to log into SilkCentral Test Manager.
2. On the Test Manager login page, enter your Username and Password.
3. Check the remember login check box to have Test Manager auto-complete usernames and remember your password when you begin typing your username.
4. Click Login to begin working with Test Manager.

Note: When logging in for the first time, you will be directed to the Project Overview in the Projects unit. Upon subsequent login, you will automatically be directed to the URL you were visiting when you logged out of Test Manager during your previous visit. For example, if when you previously logged out of Test Manager you had a certain test definition selected, you will automatically be directed to that test definition upon login. Related Concepts Getting Started Successful Test Management Related Procedures Managing a Successful Test Related Reference User Interface Reference


Logging out of Test Manager


To log out of Test Manager:
1. Click Log Out in the upper-right corner.
2. Your user session will then be terminated.

Related Concepts Getting Started Successful Test Management Related Procedures Managing a Successful Test Related Reference User Interface Reference


Quick Start Tasks


Quick Start Tasks are high-level overviews of the main tasks that you will likely need to perform with SilkCentral Test Manager. These procedures can serve as tutorials in guiding you step-by-step through best practice use of Test Manager's core functionality.

In This Section

Analyzing Test Results - Quick Start Task
How to analyze test results in SilkCentral Test Manager.

Configuring Projects - Quick Start Task
How to configure projects in SilkCentral Test Manager.

Managing Requirements - Quick Start Task
How to manage requirements in SilkCentral Test Manager.

Managing Test Executions - Quick Start Task
How to manage test executions in SilkCentral Test Manager.

Managing Test Plans - Quick Start Task
How to manage test plans in SilkCentral Test Manager.


Glossary
General Terminology
Following are definitions of general Test Manager terms.

Application server: Server that handles all internal processing.
Assigned test definition: Test definition that is assigned to an execution definition for execution, or to a test requirement.
Attribute: Attributes are used to tag test plan elements. Tags can later be used to filter test plan elements.
Component: The part of a product that a test plan is created for. Component parameters can be used to categorize test definitions and consequently raise the expressiveness of test plans.
Deployment: Determination as to where your execution definition should be run. Deployment is realized by assigning an execution server to an execution definition.
Execution server: The server where your execution definition will be run. Execution servers are defined during deployment.
Filter: Used to select specific tree elements. Filter criteria may include properties such as attributes and types.
Front-end server: Server that handles communication between the application and the user through the GUI.
Location: A physical location that has one or more execution servers where execution definitions can be run.
Log: A file to which all of a server's activity is recorded for the purposes of diagnostics.
Parameter: An input type that is required by a test definition.
Product: An end-product on which your company's efforts are focused. Products typically consist of one or more components.
Project: A Test Manager entity within which associated efforts are consolidated. All user actions are associated with projects. Defining a project is the first step in test management with Test Manager.
Reports: Graphic- and table-based documents that present data in a meaningful way. Specific reports are available for requirements management, test planning, and test-execution management.
Report server: Server that provides report processing and report presentation.
Schedule: Specifies the times and frequency at which test executions will be run.
Toolbar: A graphical display of buttons that reflects available functionality. The appearance of the toolbar varies based on the selected workflow unit and tree element. This is not the same as the workflow bar.
User: Person who works with Test Manager. User types endow users with specific rights.
Workflow bar: The bar at the top of the page that represents Test Manager's core functional units.

Requirements Management Terminology


Following are definitions related to requirements management.
Requirement: Represents a customer demand or market need. It is the basic unit of requirements management and ultimately serves as the basis for testing.
Child requirement: An element of the Requirements tree that is subordinated to a parent requirement.
Parent requirement: An element of the Requirements tree that has one or more child requirements.
Requirement tree: Tree-shaped interface for organizing requirements.


Test Plan Terminology


Following are definitions related to test plan management.
Test definition: The basic test plan unit. Test definitions can be of various types: SilkTest, SilkPerformer, Manual, JUnit, NUnit, and WSH. Test definitions may be children of test containers and/or test folders. Test definitions may have ordered parameter lists for execution, references to attributes, requirements, and other fields for user information (e.g., description, author, dates, status, product, component, platform). Manual test definitions define all the steps required for execution of a manual test.
Test folder: Second-level test-plan structuring element. Test folders may contain test definitions and/or additional test folders. Unlike test containers, test folders are optional test plan elements.
Test container: The required top-level structuring element of test plans. Test containers may contain test folders and test definitions. This object may also have references to attributes, symbols, requirements, or other fields for user information (e.g., description, author, dates).
Test Plan tree: Tree-shaped interface used to organize test plan elements.
Attribute: A site-specific characteristic that has one or more values. You can define attributes as test-selection filters by assigning them to test definitions, test folders, or test containers. Each attribute has a set of values. Attributes are useful for grouping tests, in that you can filter for specific test folder contents that share a given attribute value. Examples: Level=Smoke, Full, Regression; Browser=IE6.0, IE5.5, Mozilla1.7.
Source control profile: Source control profiles allow you to define where Test Manager's execution servers are to retrieve program sources from external source control systems for test execution. Currently, the following source control systems are supported by Test Manager out of the box: Borland StarTeam, Serena Version Manager (PVCS), Concurrent Version System (CVS), Microsoft Visual SourceSafe (MSVSS), and Universal Naming Convention (UNC). Additional source control systems can be configured by installing custom plug-ins. See the SilkCentral API documentation for details.
Symbol: A variable reference in the test container tree. This is used to specify a value for a test parameter that is to be used in the test parameter list for future execution. Symbols, which are preceded in SilkTest plan files by a "$" character, may have one value at a time, and can be of any type (represented as a string). Examples: $HTMLControl=HTMLText; $BT=IE5.5; $OS=Win2000.
Testcase: In a script file, a testcase is an automated test that ideally addresses one test requirement, for example, a SilkTest 4Test function that begins with the testcase keyword and contains a sequence of 4Test statements. A testcase drives an application to the state to be tested, verifies that the application works as expected, and returns the application to its base state. In a test plan, a testcase is a keyword whose value is the name of a testcase defined in a script file. It is used in an assignment statement to link a test description in a test plan with a 4Test testcase defined in a script file. Testcase names can have a maximum of 127 characters. When you create a data-driven testcase, SilkTest truncates testcase names that are greater than 124 characters.

Execution Terminology
Following are definitions related to test definition executions.
Execution definition: The basic unit of the Execution tree that has one or more test definitions from the Test Plan unit assigned for execution. Executions may be children of folders or they may be standalone.
Execution tree: Tree-shaped interface used to organize execution definitions.


Folder: An Execution tree structuring element. Folders are used to store related execution definitions.

Related Concepts Getting Started Successful Test Management Related Procedures Managing a Successful Test Related Reference User Interface Reference


Concepts
This section contains all the conceptual topics associated with using SilkCentral Test Manager. In This Section Successful Test Management This section includes all the conceptual topics that are related to the operation of SilkCentral Test Manager.


Successful Test Management


This section includes all the conceptual topics that are related to the operation of SilkCentral Test Manager.
In This Section
Settings Configuration: This section explains how to configure settings in Test Manager.
Requirements Management: This section explains how to manage requirements in Test Manager.
Filtering: This section explains how to filter requirements, test definitions, or execution definitions in Test Manager.
Test Plan Management: This section explains how to manage test plans in Test Manager.
Test Definition Execution: This section explains how to execute test definitions in Test Manager.
Issue Management: The Issues unit helps you track the issues that are associated with the selected project.
Project Management: This section explains how to manage projects in Test Manager.
Report Generation: This section explains how to generate and view SilkCentral Test Manager reports.
Code Coverage Analysis: This section explains how to analyze code coverage with SilkCentral Test Manager.


Settings Configuration
This section explains how to configure settings in Test Manager. If you have SuperUser, Administrator, or Project Manager privileges, you can specify project-wide settings for SilkCentral Test Manager projects. Once global settings are defined, they are available to all users who have access to those projects. Global project settings include the definition of filters, attributes, external product integrations, change notifications, and more. You can also specify project-wide settings for build information, source files, file extensions, and more.
In This Section
Global Filters: Filters provide an efficient means of finding exactly the information you need, while excluding extraneous detail.
Attributes: Custom attributes can be used to customize information for test definitions.
Custom Requirement Properties: You can add custom property fields across all requirements in a selected project.
Custom Step Properties: You can add custom property fields across all manual test steps in a selected project.
Change Notification: Test Manager can notify you by email when requirements or test plans are changed by other users.
Requirements Integration Configuration: External requirements-management integration enables you to coordinate Test Manager's requirements-management features with other requirements-management tools.
Data Sources for Data-Driven Tests: Data-driven tests are tests that are derived from values in an existing data source, such as a spreadsheet or a database.
Issue Tracking Profiles: Issue tracking profiles enable SilkCentral Test Manager to integrate with external issue tracking systems.
Source Control Profiles: Source Control profiles enable Test Manager to integrate with external source control systems.


Global Filters
Filters provide an efficient means of finding exactly the information you need, while excluding extraneous detail. By defining global filters, you can create complex filter criteria that are available throughout Test Manager without requiring you to define filter criteria each time you need to filter a list. Related Concepts Settings Configuration Filters Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Related Reference Filters tab


Attributes
Custom attributes can be used to further customize information for test definitions (in the Test Plan unit). While some attributes are made available by Test Manager's integrated functionality, such as priority, components, and platforms, you may want to define custom attributes to categorize test definitions according to your needs, or to make test definitions compatible with SilkTest test cases. Related Concepts Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Custom Attributes Configuring Test Manager Settings Related Reference Attributes tab


Custom Requirement Properties


You can add custom property fields across all requirements in the selected project through Test Managers Settings unit. Custom properties can subsequently be edited alongside the default properties on the Edit Requirements dialog box. Custom properties are displayed (on the Properties tab) in Requirements View, but not in Document View. Related Concepts Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Custom Requirement Properties Configuring Test Manager Settings Related Reference Requirement Properties Page


Custom Step Properties


You can add custom property fields across all manual test steps in a selected project using Test Managers Settings unit. Custom step properties can subsequently be edited alongside the default properties on the Edit Manual Test Definition Step dialog. Related Concepts Settings Configuration Related Procedures Configuring Projects - Quick Start Task Creating Custom Step Properties Configuring Test Manager Settings Related Reference Step Properties Page


Change Notification
Test Manager can notify you by email when requirements or test plans are changed by other users. Each user has the option of activating change notification. So that you are not bombarded with numerous notifications, only a single email alert is sent to you when a change is made, regardless of how many changes other users may have made since your last acknowledgment. Email alerts include links that take you directly to a view of recent changes. Before you can activate requirements or test plan change notification, you must configure your email address in Test Manager's user settings. Please see the SilkCentral Administration Module documentation for details. Note: Change notification only works if an email server has been configured by your administrator. If change notification has not been enabled, please contact your SilkCentral administrator. Note: Once notification has been enabled, you can view and acknowledge changes that have occurred since your last acknowledgment. Related Concepts Settings Configuration Related Procedures Configuring Projects - Quick Start Task Enabling Change Notification Configuring Test Manager Settings Related Reference Notifications Page


Requirements Integration Configuration


External requirements-management integration enables you to coordinate Test Manager's requirements-management features with other tools you may already be working with. External requirements-management integration is currently pre-installed for Borland CaliberRM, IBM Rational RequisitePro, and Telelogic DOORS. Integration is configured through the Integrations Configuration tab in Test Manager's Settings unit. The tab view is divided into a separate section for each installed plug-in; pre-installed, the view is divided into three sections, one for Borland CaliberRM, one for IBM Rational RequisitePro, and one for Telelogic DOORS.
Test Manager supports integration with external requirements-management systems (RMS) through Test Manager's open interface. Creating a plug-in and integrating it into Test Manager allows you to integrate any RMS into Test Manager.
Before you can configure CaliberRM integration, you must install the CaliberRM client on the SilkCentral application server and on the front-end server. Additionally, make sure that MPX support is enabled in CaliberRM. Before you can configure RequisitePro integration, you must install the IBM Rational RequisitePro client on the SilkCentral front-end server. Before you can configure DOORS integration, you must install the Telelogic DOORS client on the SilkCentral front-end server.
Note: Test Manager currently supports CaliberRM versions 2005 and 2006, RequisitePro versions 2002.05.00 and 2003.06.12, and DOORS version 8.0.
Note: The Tools unit (Help module) enables download of add-ins for Borland CaliberRM, Rational RequisitePro, and Telelogic DOORS. These add-ins can be installed on the appropriate server and client computers. For details, see the corresponding ReadMe files (included in the downloadable archives).

The Add-In for RequisitePro enhances the RequisitePro menu with an entry providing a link to the Test Manager front-end server's project selection.

The Add-In for CaliberRM enables CaliberRM with external traceability to Test Manager requirements. The add-in must be installed on each CaliberRM server and client.

The Add-In for DOORS enables Test Manager to communicate with DOORS. This add-in must be installed on the DOORS client on the SilkCentral front-end server.
Note: When using CaliberRM 2006 or higher, the integration with Test Manager is set up out of the box. It is still recommended to install the add-in that is available from the Help Tools menu in Test Manager to make sure that you have the latest version of the integration installed. Note: Configuring integration with CaliberRM requires the definition of CaliberRM login credentials. Whenever requirements are synchronized between Test Manager and CaliberRM, these credentials are used to login to CaliberRM, thus checking out a CaliberRM license. The license is set free as soon as the synchronization process has completed. We recommend creating a dedicated CaliberRM user for synchronization purposes, which should be used by all Test Manager integration configuration. This ensures that only a single CaliberRM license is used for the process of synchronization.


Related Concepts External Requirements Management Tools Settings Configuration Related Procedures Configuring Projects - Quick Start Task Integrating External RM Tools Configuring Test Manager Settings Related Reference APIs


Data Sources for Data-Driven Tests


Data-driven tests are tests that are derived from values in an existing data source, such as a spreadsheet or a database. Data sources are managed in a project-specific scope. Tip: To make Test Manager aware of changes in your data source, you must synchronize your data source profile with your data source whenever your data source is updated or changed. See the related Synchronizing Data Sources procedure for details. Related Concepts Data-Driven Tests Related Procedures Configuring Microsoft Excel or CSV Data Sources Configuring JDBC Data Sources Synchronizing Data Sources Uploading Updated Excel Files to a Data Source Downloading Excel Files from a Data Source Deleting Data Sources Related Reference Data Sources Configuration Page


Issue Tracking Profiles


Issue tracking profiles enable SilkCentral Test Manager to integrate with external issue tracking systems. The issue tracking software packages that are currently supported by Test Manager out of the box are:

SilkCentral Issue Manager - see the related Managing SilkCentral Issue Manager Issue Tracking Profiles topic.
Borland StarTeam - see the related Managing Borland StarTeam Issue Tracking Profiles topic.
IBM Rational ClearQuest - see the related Managing IBM Rational ClearQuest Issue Tracking Profiles topic.
Issue Tracking Web Service - see the Test Manager API Help.
Bugzilla - see the related Managing Bugzilla Issue Tracking Profiles topic.
Additional issue tracking systems can be configured by installing a custom plug-in (see the Test Manager API Help for detailed information). Defining issue tracking profiles allows you to link test definitions within the Test Plan unit to issues in third-party issue-tracking systems. Linked issue states are updated periodically from the third-party issue tracking system.

SilkCentral Issue Manager Profiles


SilkCentral Issue Manager is a robust issue tracking tool that manages bug fixes and enhancement issues related to your company's software projects. Being fully customizable, SilkCentral Issue Manager meets the challenges of your business environment, working across multiple products, releases, and locations. Because SilkCentral Issue Manager's flexible defect tracking workflow allows development and quality assurance to work more closely together, it increases productivity, resulting in an improved development process. Integration is supported for Issue Manager 2006 or higher.

Borland StarTeam Profiles


Borland StarTeam, a software change management and configuration management tool, enables coordination and management of the software delivery process. Integration is supported for StarTeam 2005 R2 or higher. Tip: To work with StarTeam profiles and use the go to link functionality for change requests in StarTeam, you must have the StarTeam Cross-Platform Client software installed on the computer where the browser is running.

IBM Rational ClearQuest Profiles


IBM Rational ClearQuest products provide flexible defect/change tracking and automated workflow support. The two key products are IBM Rational ClearQuest and IBM Rational ClearQuest MultiSite. Integration is supported for ClearQuest versions 2002.05.20 and 2003.06.15. Tip: To work with Rational ClearQuest profiles, you must have the Rational ClearQuest client software installed on the computer where the SilkCentral Front-end Server is running. Please refer to Rational ClearQuest documentation for detailed information about installing Rational ClearQuest.


Related Procedures Managing SilkCentral Issue Manager Issue Tracking Profiles Managing Borland StarTeam Issue Tracking Profiles Managing IBM Rational ClearQuest Issue Tracking Profiles Managing Bugzilla Issue Tracking Profiles Related Reference Issue Tracking Profiles Page


Source Control Profiles


Source Control profiles enable Test Manager to integrate with external source control systems. Defining source control profiles allows you to define where Test Manager's execution servers should retrieve program sources for test execution. You need double the amount of free disk space on the execution server to accommodate the source files, because SilkCentral checks out the source control tree and then generates a working directory with the executable source files. The source control systems that are currently supported by Test Manager out of the box are:

Borland StarTeam - see the related Managing Borland StarTeam Source Control Profiles topic.
Serena Version Manager (PVCS) - see the related Managing Serena Version Manager (PVCS) Profiles topic.
Concurrent Version System (CVS) - see the related Managing CVS Profiles topic.
Microsoft Visual SourceSafe (MSVSS) - see the related Managing Microsoft Visual SourceSafe (MSVSS) Profiles topic.
Universal Naming Convention (UNC) (file-system access) - see the related Managing UNC Profiles topic.
Subversion - see the related Managing Subversion Profiles topic.
Apache Commons Virtual File System (VFS) - see the related Managing VFS Profiles topic.
Additional source control systems can be configured by installing a custom plug-in. Refer to the Test Manager API Help for detailed information.

Borland StarTeam
StarTeam promotes team communication and collaboration through centralized control of all project assets. Protected yet flexible access ensures that team members can work whenever and wherever they like through an extensive choice of Web, desktop, IDE, and command-line clients. StarTeam offers a uniquely comprehensive solution that includes integrated requirements management, change management, defect tracking, file versioning, threaded discussions, and project and task management. Integration is supported for StarTeam version 2005 R2 or higher.

Serena Version Manager (PVCS)


Serena Version Manager, from the makers of PVCS, is the full-featured solution for version control and revision management in software projects. More than simply storing code revisions, Version Manager is a robust, full-featured solution with security, high performance, and varying levels of support for distributed teams. Integration is supported for PVCS version 8 or higher.

Concurrent Version System (CVS)


CVS is a powerful source control tool that handles complete source code trees. It can be customized using scripting languages such as PERL and Korn. CVS is decentralized so that users can maintain their own source directory trees. It also enables concurrent file editing. Integration is supported for CVSNT Client version 2.5.

Microsoft Visual SourceSafe (MSVSS)


Microsoft Visual SourceSafe is a version-control system for managing software and Web-site development. Fully integrated with the Visual Basic, Visual C++, Visual J++, Visual InterDev, and Visual FoxPro development environments, as well as with Microsoft Office applications, MSVSS provides easy-to-use, project-oriented version control. Visual SourceSafe works with any file produced with any development language, authoring tool, or application. Users can work at both the file and project level while promoting file reuse. Integration is supported for MSVSS versions 6 and 2005.

Universal Naming Convention (UNC)


Short for Universal Naming Convention or Uniform Naming Convention, UNC is a PC format for specifying the location of resources on a local-area network (LAN). UNC uses the following format: \\server-name\shared-resource-pathname So, for example, to access the file test.txt in the directory examples on the shared server silo, you would write: \\silo\examples\test.txt You can also use UNC to identify shared peripheral devices, such as printers. The idea behind UNC is to provide a format so that each shared resource can be identified with a unique address. UNC is only supported on Microsoft Windows operating systems. If you plan to use a non-windows execution server you can use the Apache Commons VFS source control profile instead.

Subversion (SVN)
Subversion (SVN) is the successor to the Concurrent Versions System (CVS). Subversion manages versions using transaction numbers. With each commit, the transaction number is incrementally increased. What other source control systems call labels, Subversion refers to as tags. These tags are encoded in the Subversion URL. For example, http://MyHost/svn/MyApp/trunk is a Subversion URL and http://MyHost/svn/MyApp/tags/build1012 is a Subversion tag. Test Manager supports Subversion tags. If a Subversion URL contains the trunk directory, you can define a label such as tags/build1012. This label replaces trunk in the Subversion URL, as shown in the example below. Note: If a Subversion URL does not contain trunk and you define a label, Test Manager throws an error.
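To illustrate the label substitution using the URL and tag from the text above, the effective URL used for checkout would be:

Subversion URL: http://MyHost/svn/MyApp/trunk
Label:          tags/build1012
Effective URL:  http://MyHost/svn/MyApp/tags/build1012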

Apache Commons Virtual File System (VFS)


A Virtual File System (VFS) is an abstraction layer on top of a more concrete file system. The purpose of a VFS is to allow client applications to access different types of concrete file systems in a uniform way. Apache Commons VFS provides a single API for accessing various file systems. It presents a uniform view of the files from various sources. The protocols that are currently supported for VFS by Test Manager are:
http - Copies the given file. This protocol type is also supported for copying and unpacking ZIP, JAR, or other zipped files. It is required to specify a .zip file on an http server, for example zip:http://myTestServer/myTests.zip. The .zip file will be extracted on the execution server.
ftp - Copies the given file. This protocol type is also supported for copying and unpacking ZIP, JAR, or other zipped files.
smb - Server Message Block (smb) copies all files and folders.
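As a rough sketch, location strings for these protocols usually follow the standard Apache Commons VFS URI forms; the server and path names below are hypothetical, and the exact form accepted by a Test Manager source control profile may differ:

http://myTestServer/tests/myTests.zip
zip:http://myTestServer/tests/myTests.zip
ftp://myTestServer/tests/
smb://myTestServer/share/tests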


Related Procedures Managing Borland StarTeam Source Control Profiles Managing CVS Profiles Managing Microsoft Visual SourceSafe (MSVSS) Profiles Managing Serena Version Manager (PVCS) Profiles Managing UNC Profiles Managing Subversion Profiles Managing VFS Profiles Related Reference Source Control Profiles Page


Requirements Management
This section explains how to manage requirements in Test Manager. SilkCentral Test Manager's Requirements unit enables you to maintain control over system requirements during development: managing the creation, modification, and deletion of requirements; association of test definitions with requirements; change history tracking; and the ability to generate test plans directly from requirement lists. As with all Test Manager functionality, the Requirements unit is 100% Web enabled and accessible through a Web browser.
In This Section
Requirements Tree: Requirements are displayed, organized, and maintained through a hierarchical tree structure.
Attachments: You can upload multiple files or links as attachments to requirements.
Full Coverage and Direct Coverage Modes: Full Coverage mode offers a cumulative view of test-definition-to-requirement coverage. In Direct Coverage mode, requirement status is calculated only by considering the test definitions that are assigned directly to requirements.
Test Coverage Status: Shows the status of all tests that have been assigned to the requirement (number and percentage of Passed, Failed, Not Executed, and Not Covered tests).
Requirements Reports: This section explains the requirements-related reports.
Microsoft Office Requirement-Import Tool: Assists you in importing requirements from Microsoft Word and Microsoft Excel.
Test Plan Generation: Test plans can be generated directly from the Requirements tree and test definitions can be assigned to specific requirements.
Requirement History: Test Manager provides a complete history of all changes that are made to requirements.
Change-Notification Emails: You can configure email notifications to alert you to changes that are made to requirement settings and/or test-plan settings for specified projects.
External Requirements Management Tools: This section explains how to work with external requirements management tools.


Requirements Tree
Requirements are displayed, organized, and maintained through a hierarchical tree structure, the Requirements tree. Each node in the Requirements tree represents a requirement. Each requirement can have any number of child requirements associated with it. The Requirements tree enables you to organize requirements in any number of hierarchical levels. Note: When the Requirements tree includes more elements than can be displayed at once without impacting response time, elements are displayed in increments. Page number links at the bottom of the tab allow you to browse through the elements included in the tree one page at a time. To display all elements as a single list, select the [All] link. Related Concepts Requirements Management Related Procedures Managing Requirements - Quick Start Task Collapsing or Expanding the Requirements tree Managing Requirements Related Reference Requirements Unit Interface


Attachments
You can upload multiple files or links as attachments to requirements. You can edit the descriptions of attachments or delete attachments. When you cut and paste requirements that have attachments, the attachments are automatically included with the copies. The following attachment types are available:

Uploaded Files (.gif, .png, .jpg, .doc, .rtf, .txt, .zip, .xls, .csv, and more)
References to UNC paths
References to URLs, including StarTeam URLs
Related Concepts Requirements Management Related Procedures Managing Requirements - Quick Start Task Attaching a File to a Requirement Managing Requirements Related Reference Requirement Attachments tab


Full Coverage and Direct Coverage Modes


Full Coverage mode offers a cumulative view of test-definition-to-requirement coverage that considers the status of all child requirements of parent requirements. If one or more child requirements has a status of Not Covered, then the full coverage status of the selected requirement will also be Not Covered, even if the coverage status of the parent requirement is Covered. Full Coverage mode enables easy evaluation of whether or not requirements are covered by test definitions. In Direct Coverage mode requirement status is calculated only by considering the test definitions that are assigned directly to requirements. Child requirements are not factored into calculations. Related Concepts Requirements Management Related Procedures Managing Requirements - Quick Start Task Switching Between Full and Direct Coverage Modes Managing Requirements Related Reference Requirement Coverage tab


Test Coverage Status


The Coverage tab (Requirements View only) displays basic properties of the selected requirement (Name, Priority, Risk) in addition to the status of all tests that have been assigned to the requirement (number and percentage of Passed, Failed, Not Executed, and Not Covered tests). A summary of all assigned tests is listed under Total tests. To view the status of all tests that are assigned to child requirements of the selected requirement, in addition to all tests that are directly assigned to the requirement, check the Full coverage check box. Document View displays the same coverage status information in a heat field chart, with green indicating passed tests; red indicating failed tests; brown indicating tests that have not yet been executed; and gray indicating tests of other status. Requirements that are not covered by test definitions are listed as Not covered. Note: With both Document View and Requirements View's Test Coverage tab, test definition totals accumulate to the parent level (for example, requirement totals include test definitions from child requirements; project totals include test definitions from all requirements). Related Concepts Requirements Management Related Procedures Managing Requirements - Quick Start Task Switching Between Full and Direct Coverage Modes Managing Requirements Related Reference Requirement Coverage tab


Requirements Reports
This section explains the requirements-related reports that ship with SilkCentral Test Manager. Requirements reports detail the status of functional requirements (for example, compatibility requirements, GUI requirements, feature requirements) that must be met during development. Requirements may also relate to product management objectives such as reliability, scalability, and performance. Test Manager's requirement-management reports help managers determine if adequate test coverage has been established to verify that system requirements are met during development. When a report references a requirement that includes HTML-formatted content, that content is rendered in the report. The following reports come pre-installed with Test Manager.
In This Section
Status Reports: Here are the status reports that are available for Test Manager's Requirements unit.
Progress Reports: Here are the progress reports that are available for Test Manager's Requirements unit.
Document Reports: Here are the document reports that are available for Test Manager's Requirements unit.
All Related Issues Report: Provides a detailed list of all issues related to the assigned test definitions for a requirement.


Microsoft Office Requirement-Import Tool


For detailed information about importing requirements from Microsoft Word and Microsoft Excel, see Test Manager Office Import Tool documentation. Related Concepts Requirements Management Related Procedures Managing Requirements - Quick Start Task Managing Requirements Related Reference Requirements Unit Interface


Test Plan Generation


With Test Manager, test plans can be generated directly from the Requirements tree; and test definitions can be assigned to specific requirements. The Requirements tree serves as a template for the test-folder/test-definition structure of the Test Plan tree. Related Concepts Generating Test Plans from Requirements View Requirements Management Related Procedures Managing Requirements - Quick Start Task Managing Requirements Related Reference Requirements Unit Interface


Requirement History
Test Manager provides a complete history of all changes that are made to requirements. History information is read-only, and cannot be edited or permanently deleted. The following actions generate requirement history entries:

Adding requirements
Editing requirements
Marking of requirements as Obsolete
Adding attachments
Deleting attachments
Importing/updating requirements through MS Word or MS Excel

The deletion of a requirement using the Destroy permanently option, or the deletion of a requirement that has already been marked as obsolete, generates a history entry at the project level (on the project node), because the requirement to which the history relates has been deleted from the database. Each requirement-revision entry includes:

Revision number (1-n)
Change date/time
User who changed the requirement
Notes describing the revision

Project-Level History

Note: When a requirement has been deleted, or if you acknowledge all recent changes, a change history entry is added to that project's history file. Note: The Recent Changes filter (accessible through the toolbar) enables you to efficiently view and acknowledge the latest changes and additions that have been made to requirements. Related Concepts Tracking the History of a Requirement Requirements Management Related Procedures Managing Requirements - Quick Start Task Managing Requirements Related Reference Requirement History tab


Change-Notification Emails
You can configure email notifications to alert you to changes that are made to requirement settings and/or test-plan settings for specified projects since you last confirmed changes with a change acknowledgement function. Following your logout, an email alert is sent to you each time one of the following settings is changed:

Requirements unit
A requirement is created or deleted.
The name or description of a requirement is edited.
A system property is edited.
A requirement is set as obsolete.
A requirement is recovered.
A test definition is assigned to or removed from a requirement.
A custom property of a requirement is created, edited, or deleted.

Test plan unit


A container is created or edited.
A product is edited.
A source control profile is edited.
"Clear working folder" is edited.
The root node is edited.
The custom data directory is edited.
The include directory is edited.
The hidden test properties are edited.
The SilkTest interface is edited.
A test folder is created, edited, or deleted.
A test definition is created, edited, or deleted.
The planned time is edited.
A test step is added or edited.

Speak to your system administrator regarding the setup of emailed change notifications. Email alerts include links that take you directly to the changes that have been made. Change notification is configured through the Settings unit. Speak to your administrator about configuring the appropriate email address for your user profile.


Related Concepts Tracking the History of a Requirement Requirements Management Related Procedures Managing Requirements - Quick Start Task Enabling Change Notification Managing Requirements Related Reference Requirement History tab


External Requirements Management Tools


This section explains how to work with external requirements management tools.
In This Section
External Requirements Management Tools: Test Manager's external RM integration enables you to exchange projects between Test Manager and other external RMS.
Synchronizing Requirements: Ensure that requirement-property fields are synchronized between Test Manager and external requirements management tools.
CaliberRM Integration with Test Manager: This section explains how to work with CaliberRM's integration with Test Manager.


External Requirements Management Tools


Test Managers external RM integration enables you to exchange projects between Test Manager and other external RMS (Borland CaliberRM, IBM Rational RequisitePro, and Telelogic DOORS are pre-installed). Test Managers Requirements unit supports integration with external requirements-management systems (RMS) through Test Managers open interface. Creating a plug-in and integrating it into Test Manager allows integration of any RMS into Test Manager. Refer to the Test Manager API Help for information about the interfaces that enable proper integration of external RMS. Borland CaliberRM, IBM Rational RequisitePro, and Telelogic DOORS requirements-management tools come pre-installed with Test Manager. You can set up Requirements Management (RM) integration for specific projects using the Integrations Configuration tab in the Settings unit. Additionally, you can upload requirements and tests from Caliber DefineIT to Test Manager without the need to configure any integrations within Test Manager. Please refer to the Caliber DefineIT documentation for detailed information. Note: For information on supported versions, please refer to the Release Notes. Related Concepts Requirements Integration Configuration External Requirements Management Tools Requirements Management Related Procedures Managing Requirements - Quick Start Task Integrating External RM Tools Copying CaliberRM-Integrated Projects Managing Requirements Related Reference Requirement Properties tab


Synchronizing Requirements
Enabling synchronization of requirements between Test Manager and an external requirements management system (RMS) enables Test Manager to receive changes that occur in the external RMS whenever a synchronization is executed. If a project has external RMS integration enabled, the master system for requirements is automatically the external system. This means that synchronization is always from the external RMS tool to Test Manager, and requirements can no longer be edited in Test Manager. An exception is newly created requirements that don't exist in the external tool; these are uploaded to the external (master) system only if the option Enable upload of requirements is enabled in Settings > Integrations Configuration. Property mapping functionality allows you to map property fields between Test Manager and external requirement tools (for example, a custom field in Test Manager called User might be equivalent to a property field in CaliberRM called Field_2). The property mapping feature ensures that changes to requirement-property fields are accurately refreshed between projects. Requirements can be synchronized in one of several ways:

Manual synchronization: Available through button click on the Properties tab at the root folder level.
Automatic scheduled synchronization: Based on globally defined Test Manager schedules.
Automatic online synchronization: Changes to requirements are automatically propagated between tools. This is available for CaliberRM. It requires CaliberRM client installation on the application server and MPX enabled. Requirement data is automatically updated in Test Manager when changes are made in CaliberRM, and traces in CaliberRM are updated when test definition assignment changes are performed in Test Manager. This type of online synchronization is only available when projects are configured with the current baseline.

Automatic synchronization of requirements between Test Manager and external requirements management tools can be configured to occur based on global schedules. See the SilkCentral Administration Module documentation for details on configuring global schedules.
Note: The Open CaliberRM buttons open whatever program is registered as the default program for opening files of extension .crm. On some machines, this may be the requirement viewer, rather than CaliberRM. This behavior can be changed by your administrator. The client program is called caliberrm.exe. When properly configured, the program opens to the requirement that is selected in Test Manager.
The binder icon on the project node of the Requirements tree indicates the status of RM integration for the project:
No configuration - RM integration is not available.
Manual configuration - Requirement import, upload, and synchronization can be done only by clicking the corresponding buttons on the project node in Requirements View (Properties tab).
At the project level, the Properties tab includes the following properties:

Status - Whether or not integration has been enabled.
Associated With - The external tool with which integration has been enabled.
Project Name - The name of the external project that the Test Manager project is associated with.
Requirement Types - The requirement types that are shared between projects.
Note: When integration between CaliberRM and Test Manager has been enabled, the project node displays the current status of the online requirements change listener. The three possible statuses for such projects are: Connected (synchronized), Reconnected (synchronization recommended), and Disconnected .


Related Concepts External Requirements Management Tools Requirements Management Related Procedures Managing Requirements - Quick Start Task Synchronizing Requirements Across Tools Managing Requirements Related Reference Requirement Properties tab


CaliberRM Integration with Test Manager


This section explains how to work with CaliberRM's integration with Test Manager. In This Section Baseline Support for CaliberRM Integration Test Manager's support for CaliberRM integration enables you to select the current baseline or existing userdefined baselines. Test Definition Assignment Handling How Test Manager test definition assignments are handled in CaliberRM.


Baseline Support for CaliberRM Integration


Test Manager's support for CaliberRM integration enables you to select the current baseline or existing user-defined baselines. When user-defined baselines are selected, the Map Requirement button is disabled (Requirements Properties) and requirements are not updated in the integrated CaliberRM project.

Modified Baselined Requirements Can Not be Imported into Test Manager


Requirements that are not of the current baseline can only be changed in CaliberRM if the version of the requirement that is used for the baseline is changed. Such changes are only updated within Test Manager requirements when a manual or scheduled synchronization is performed.

Baselines Can be Changed After Import into Test Manager


It is possible to change the configured baseline to a different user-defined baseline or the current baseline. After such a change, the next synchronization of the baseline (either manual or scheduled) will update the Test Manager project and update/create/delete requirements as required. When a baseline is changed, you will be presented with a message stating that the changes will take effect after the next synchronization. When a baseline is changed from the current baseline to a user-defined baseline, a message is displayed informing you that, for user defined baselines, upload of requirements is disabled.

Using the Same Requirement Type Within Different Projects


You can use the same requirement type of a CaliberRM project more than once when it is used with user-defined baselines. However, a requirement type of a CaliberRM project can be used only once within the same Test Manager database when current baselines are in effect. Related Concepts Requirements Integration Configuration External Requirements Management Tools Requirements Management Related Procedures Managing Requirements - Quick Start Task Integrating External RM Tools Copying CaliberRM-Integrated Projects Managing Requirements Related Reference Requirement Properties tab


Test Definition Assignment Handling


Assigned test definitions are displayed, managed, and created as traces (Trace to) of synchronized requirements in CaliberRM. When synchronized requirements are assigned to test definitions in Test Manager, the test-definition assignments are pushed out to and displayed within CaliberRM. Conversely, when test definitions are assigned to synchronized requirements from within CaliberRM, the assignments are pushed out to and displayed within Test Manager. Related Concepts Requirements Integration Configuration External Requirements Management Tools Requirements Management Related Procedures Managing Requirements - Quick Start Task Integrating External RM Tools Managing Requirements Related Reference Requirement Properties tab


Filtering
This section explains how to filter requirements, test definitions or execution definitions in Test Manager. In This Section Filters Filters enable you to quickly sort through test plan elements and execution definitions, highlighting only those elements that are relevant to your needs. Recent Changes The Recent Changes filter enables you to efficiently view and acknowledge changes and additions that other users have made to requirements, test definitions, or execution definitions project-wide since your last change acknowledgement.


Filters
Filters enable you to quickly sort through test plan elements and execution definitions, highlighting only those elements that are relevant to your needs. Based on your needs, you can create new custom filters, edit existing filters, or turn filtering off at the project level. The toolbar includes buttons for creating filters, editing filters, deleting filters, and selecting existing filters. Projects do not contain any default filters. Note: Filters can be accessed and edited from the Test Manager tool bar and Settings unit (through the Settings link on the menu tree). Note: Filters are not applied to reports. The Recent Changes filter enables you to efficiently view and acknowledge changes and additions that other users have made to test definitions project-wide since your last change acknowledgement. In the Test Plan unit, two buttons at the far-right of the toolbar, the Show Changes/Show All toggle button and the Acknowledge button, help you to find out what changes other users have made. Your system administrator can configure email notifications that alert you to changes that are made to test definition settings. Email alerts include links that take you directly to a view of recent changes. Related Concepts Recent Changes Related Procedures Working with Filters Configuring Global Filters Filtering Test Runs on the Activities Page


Recent Changes
The Recent Changes filter enables you to efficiently view and acknowledge changes and additions that other users have made to requirements, test definitions, or execution definitions project-wide since your last change acknowledgement. The two buttons at the far-right of the toolbar, the Show Changes/Show All toggle button and the Acknowledge button, help you to find out what changes other users have made. Note: Your system administrator can configure email notifications that alert you to changes that are made to test definition settings. Email alerts include links that take you directly to a view of recent changes. Related Concepts Change-Notification Emails Filters Related Procedures Managing Requirements - Quick Start Task Managing Test Plans - Quick Start Task Viewing Recent Changes Related Reference Requirements Toolbar Functions Test Plan Toolbar Functions


Test Plan Management


This section explains how to manage test plans in Test Manager.
In This Section
Test Plan Management: Test Manager's Test Plan unit enables you to maintain control over test planning across the system development lifecycle.
Test Plan Tree: Test plans are displayed, organized, and maintained through a hierarchical tree structure known as the Test Plan tree.
Test Plan Reports: This section explains the test-plan reports that ship with SilkCentral Test Manager.
Data-Driven Tests: Data-driven tests are tests that are derived from values in an existing data source, such as a spreadsheet or a database.
Success Conditions: Success conditions are used to determine if a test is successful or if it has failed.
Test Definition Parameters: Parameters are freely configurable input values that can be assigned to different test types and used in a variety of ways. They help to define test definitions by defining test data.
Test Packages: Test packages mirror the structure of any third-party test type to Test Manager.
Usage of External IDs: External IDs are used to uniquely identify test methods in test packages.
Manual Tests: This section explains manual tests in Test Manager.
SilkTest Test Plans: SilkTest test plan files can be uploaded from SilkTest directly into Test Manager.
Test Definitions: This section explains certain test definition types and Test Manager's Upload Manager.


Test Plan Management


Test Manager's Test Plan unit enables you to maintain control over test planning across the system development lifecycle. The Test Plan unit enables you to create and manage test plans, including the creation and scheduling of test definitions for both automated (SilkPerformer and SilkTest) tests and manual tests. Files and links can be uploaded and associated with test containers and definitions as attachments. When issues are encountered, they can easily be associated with the test definitions that led to their discovery. A full history of all changes to test plans is also tracked. Related Concepts Test Plan Management Managing Test Plans Related Procedures Managing Test Plans - Quick Start Task Related Reference Test Plan Unit Interface


Test Plan Tree


As with requirements, test plans are displayed, organized, and maintained via a hierarchical tree structure, the Test Plan tree. The Test Plan tree enables you to organize test definitions in any number of hierarchy levels. Each node in the tree represents either a test definition, a test folder, or a test container. Using the Contents tab (Test Plan Contents), you can view, cut, copy, and paste the child elements of any selected test plan element. Standard Windows Explorer style multi-select functionality is supported on the Contents tab. Note: When the Test Plan tree includes more elements than can be displayed at once without impacting response time, elements are displayed in increments. Page number links at the bottom of the tab allow you to browse through the elements included on the tab one page at a time. To display all elements as a single list, select the [All] link. Related Concepts Test Plan Management Managing Test Plans Related Procedures Managing Test Plans - Quick Start Task Finding and Replacing Test Definition Properties Related Reference Test Plan Unit Interface Test Plan Contents Tab Multi-Select Functionality for Test Plan Elements


Test Plan Reports


This section explains the test-plan reports that ship with SilkCentral Test Manager. Test Plan reports give you an overview of the progress of your test definitions and the status of defects over a period of time or over a range of builds. The following reports come pre-installed with Test Manager.
In This Section
Status Reports: Here are the status reports that are available for Test Manager's Test Plan unit.
Progress Reports: Here are the progress reports that are available for Test Manager's Test Plan unit.
Manual Test Reports: Manual-test reports that are available for Test Manager's Test Plan unit.


Data-Driven Tests
Data-driven tests are tests that are derived from values in an existing data source, such as a spreadsheet or a database. Before you can work with data-driven tests, you need to configure a data source. See SilkCentral Administration Module documentation for details on data sources.

Single vs. Multiple Data-Driven Test Definition Instances


When planning data-driven tests, you should first be aware of the two different data-driven test types that are available in Test Manager:

Single data-driven test definition instance: A single test definition result is generated for all data rows of your data source. This means that the test definition is only successful if the execution with every single data row is successful. If the execution with one data row fails, the whole test definition is marked as failed.

Multiple data-driven test definition instance: Each data row of your data source is represented by a test definition of its own. This means that each data row produces a failed or passed test definition result. For example, if your data source is a spreadsheet with four rows, you will have the original test definition you created (a parent test definition) in addition to four new child definitions, one for each of the data rows.

Note: The parent test definition created in this process does not have parameters associated with it, since it only represents a structuring instance for its child test definitions and no longer functions as an actual test definition. All values found in the data source will be listed on the parent test definition's Data Set tab. Note: When assigning a parent test definition to a requirement, note that links to requirements are only inherited when using single data-driven test definition instances. Note: You cannot assign the parent test definition of a multiple data-driven test definition instance to a setup or cleanup test execution, as such a parent node is treated as a folder. You can assign one of its child nodes, though, and you can also assign a single data-driven test definition instance to a setup or cleanup test execution.

Worksheet Handling
If your data source is a Microsoft Excel worksheet, you should follow these guidelines to ensure a successful and maintainable data-driven test definition setup:

Make sure that your column names are self-explanatory. This will allow for easier maintenance of your data source setup within Test Manager.
If you use multiple worksheets, make sure to use consistent column names across the worksheets. This will make it easier for you to apply filters for selecting columns for your data source setup.
Keep in mind that you'll want to use certain columns as key columns. Key columns allow you to maintain your data source file while Test Manager is still able to identify specific data rows by the value in the key column, despite changes in row order. Values within a key column should be unique. A short example follows this list.
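To illustrate the key-column guideline, here is a minimal hypothetical worksheet; the column names (TestID, UserName, ExpectedResult) are invented for this example. TestID serves as the key column: its values are unique and remain stable even if rows are later reordered or new rows are inserted.

TestID   UserName   ExpectedResult
T-001    alice      login succeeds
T-002    bob        login fails
T-003    carol      login succeeds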

Data Import Considerations


When importing data rows from an external data source, Test Manager does not account for row sorting functionality used in the external data source. Due to this, the order of data rows in Test Manager may differ from the row order in the external data source. Test Manager also ignores any format settings that have been applied in the external data source. For example, if you formatted date cells in an Excel worksheet to display the date in a certain way, Test Manager will ignore this setting and import any date values in the base format "YYYY.MM.DD HH:MM:SS.M". Related Concepts SilkTest Test Definitions Test Plan Management Managing Test Plans Related Procedures Managing Test Plans - Quick Start Task Working with Data-Driven Tests Related Reference Test Plan Data Set tab


Success Conditions
One or more success conditions can be assigned to each test node or suite node in the test plan of Test Manager. If a success condition is not met during the execution of a test definition it is assigned to, the test definition execution is marked "failed". For a test package, all success conditions except the execution time-out are disabled and hidden.
The success conditions table shows the names of all success conditions that have been configured for a selected test definition. This table can be found at: Test Manager > Test Plan > Properties > <Test Plan Tree Node>. A success condition is only evaluated when it is active. To activate and deactivate success conditions, see the related Editing Success Conditions procedure. The available types of success conditions differ depending on the test definition type. All currently available success conditions in Test Manager are listed below:

Errors Allowed (Active by default) - Maximal number of errors allowed for the test.
Warnings Allowed - Maximal number of warnings allowed for the test.
Execution Time-Out [s] - Maximal time-out allowed for the test in seconds.
Page Time: Avg. Page Time [s] - Maximal allowed average time to load a page.
Page Time: Max. Page Time [s] - Maximal allowed maximum time to load a page.
Transaction Response Time: Avg. Trans(Busy)ok [s] - Maximal allowed average response time for a transaction in the test.
Transaction Response Time: Max. Trans(Busy)ok [s] - Maximal allowed maximum response time for a transaction in the test.

Inheritance of success conditions is similar to inheritance of properties. Success conditions that are assigned to a parent node are inherited throughout all sub-folders and child test definitions. Related Concepts Test Plan Properties tab Related Procedures Editing Success Conditions


Test Definition Parameters


Parameters are freely configurable input values that can be assigned to different test types and used in a variety of ways. They help to define test definitions by defining test data.
For example, SilkPerformer test definitions use pre-defined parameters that represent the project attributes that are defined in a selected SilkPerformer test definition. For JUnit test definitions, any JUnit test class can access a custom parameter of the underlying test definition as a Java system property; the launcher passes these parameters to the executing virtual machine using the -D VM argument. For SilkTest test definitions, parameters serve as symbols within test data properties. You can also use parameters to parameterize input data for manual test steps.
For all other test definition types, including custom test types (refer to the Test Manager API Help for details), the following applies:
Note: Only custom parameters (freely defined by the user) are supported.
How parameters are applied for each test definition type varies depending on how the test types (or the plug-ins for external test types) are implemented. The built-in external test types are currently only of relevance for JUnit. The JUnit test type passes all parameters to the executing Java VM as Java system properties.
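For illustration only, a JUnit test class could read such a custom parameter as a Java system property. The parameter name MyParameter used here is hypothetical; it stands for any custom parameter defined on the test definition and passed to the VM as -DMyParameter=<value>:

import static org.junit.Assert.assertNotNull;
import org.junit.Test;

public class ParameterAccessTest {
    @Test
    public void readsCustomParameter() {
        // Read the custom parameter that Test Manager passed as a Java system property.
        String value = System.getProperty("MyParameter");
        assertNotNull("Parameter was not passed to the VM", value);
    }
}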

Parameters within Parameter Values


You can use parameter values that contain parameters. The evaluation result of such parameter values is shown in a bold font in the GUI. Here is an example:
parameterA := aaa

parameterB := bbb + ${parameterA}

Evaluated values:
parameterA = aaa

parameterB = bbb + aaa

Parameter Notations
The following parameter notations are supported:
For all test definitions:

${<parameter>}

All characters are allowed for parameter names, except $, {, }, and #.
Deprecated notation for manual test definitions:


#<parameter>#

The following characters are allowed for parameter names: 0-9, a-z, A-Z, and _.
Additional notation for SilkTest test definitions:

$<parameter>

The following characters are allowed for parameter names: 0-9, a-z, A-Z, and _.

Parameter-Token Replacement Upon Execution


Any string input for a property of a test definition may contain placeholders in the following form: ${parametername}. parametername must match the name of a parameter defined or inherited for the test definition. At execution time, the placeholder is replaced by the value entered for the parameter with the denoted name. This makes recurring strings in properties more customizable and facilitates the editing of common definitions.
When Test Manager finds a parameter with the notation ${<parameter>}, it first checks if the parameter is included in the defined parameters, and if not, it checks if the parameter is an environment variable.
Example: Value of a JUnit classpath property: junit.jar;${MyWorkingDir}/myclasses. Where there is a parameter MyWorkingDir with the value C:/Temp/MyWorking, the resulting effective property value is: junit.jar;C:/Temp/MyWorking/myclasses.
Note: The value of a parameter may also contain other parameter placeholders, which allows nesting based on the same principle.
Related Concepts
Test Plan Management
Test Definitions in the Manual Testing Client
Managing Test Plans
Related Procedures
Managing Test Plans - Quick Start Task
Configuring Test Definition Parameters
Related Reference
Test Plan Parameters tab


Test Packages
Test packages provide support for the structure of third-party test types in SilkCentral Test Manager, and consist of a package root as well as an arbitrary hierarchy of suite nodes and test nodes. Test packages also provide users with detailed information about a test execution run. Test packages, suite nodes, and test nodes can be individually assigned, along with their issues and attachments, to requirements. This functionality is similar to the functionality of every other test definition. After a third-party test definition is converted into a test package, all tests contained in the package can be run individually. Test nodes and suite nodes contained in a test package are provided with an additional property, the External ID. An advantage of test packages is that the structure can be maintained automatically with every test execution. The structure of a test package can be updated according to the results of its runs. The file <Test Manager installation folder>\wwwroot\silkroot\xsl\output.xsd contains an XML schema for the structure of the output XML files of test packages. Test packages enable all functionalities of the individual tests, with the following exceptions:

Test containers that contain test packages cannot be linked.
Test packages cannot be data-driven because they do not possess data-driven properties.
All success conditions except the execution time-out are disabled and hidden for test package nodes.
Note: SilkPerformer test definitions, SilkTest test definitions, and manual test definitions cannot be converted to test packages, because the structure of these tests is supported in Test Manager by default.
Related Concepts
Test Plan Management
Managing Test Plans
Usage of External IDs
Related Procedures
Managing Test Plans - Quick Start Task
Finding and Replacing Test Definition Properties
Editing Test Definitions
Creating a Test Package
Related Reference
Test Plan Unit Interface
Test Plan Contents Tab
Multi-Select Functionality for Test Plan Elements


Usage of External IDs


External IDs are used to uniquely identify test nodes and suite nodes in test packages. An External ID is provided as a property for each test node and each suite node. The automatically generated External ID identifies a unique test method by the fully qualified name of the class and the method, with a "~" prepended. For JUnit tests, the following schema is used for the automatically generated External ID: ~<package name>.<class name>#<method name>.
When refactoring JUnit test classes, the automatically generated External ID is not sufficient, because the result information of tests from before the refactoring would be lost when a new test is created. In this case, the External ID for the test must be defined manually. The refactored method remains identifiable, because the External ID stays unchanged when a JUnit test is moved or renamed. The External ID can be set manually in the source code as an annotation. The following code example shows such an annotation for JUnit tests:
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

@Retention(RetentionPolicy.RUNTIME)
public @interface ExternalId {
    String externalId();
}

The annotation can be used in a JUnit test to annotate classes and test methods as shown:
import static org.junit.Assert.*;
import org.junit.Test;
import com.borland.runner.ExternalId;

@ExternalId(externalId="JUnit4test")
public class JUnit4test {
    @Test
    @ExternalId(externalId="MyExtId1")
    public void test1() { ... }

    @Test
    @ExternalId(externalId="MyExtId2")
    public void test2() { ... }
}

Be aware that using External IDs with the JUnit runner 'org.junit.runners.Parameterized' is not supported for test methods, because the External ID is not unique for repeated runs of a method with different parameters. As a workaround, an External ID can be specified at class level, but must be omitted at method level. An example follows:
import java.util.Arrays;
import java.util.Collection;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
import org.junit.runners.Parameterized.Parameters;
import com.borland.runner.ExternalId;

@RunWith(Parameterized.class)
@ExternalId(externalId="parameterizedWithExtId")
public class TestCaseParameterizedWithExternalId {
    @Parameters
    public static Collection<Object[]> parameterFeeder() {
        return Arrays.asList(new Object[][] {
            { "param_name1", "param_value1" }, // set of parameters per run, type matching constructor must exist!
            { "param_name3", "param_value3" },
            { "param_name2", "param_value2" },
        });
    }

    private String paramName;
    private String paramValue;

    public TestCaseParameterizedWithExternalId(String paramName, String paramValue) {
        this.paramName = paramName;
        this.paramValue = paramValue;
    }

    @Test
    public void testWithParams() {
        System.out.println(String.format("run with parameter: name='%s', value='%s'", paramName, paramValue));
    }
}

Note: The setting of the External ID for a JUnit test is only possible for tests using JUnit 4.4 or higher.
Related Concepts
Test Plan Management
Managing Test Plans
Test Packages
Related Procedures
Managing Test Plans - Quick Start Task
Finding and Replacing Test Definition Properties
Editing Test Definitions
Creating a Test Package
Related Reference
Test Plan Unit Interface
Test Plan Contents Tab
Multi-Select Functionality for Test Plan Elements


Manual Tests
This section explains manual tests in Test Manager.
In This Section
Converting Manual Tests to Automated Tests
Convert a manual test definition to an automated test of any of the supported automated test types.
Using External Tools to Create Manual Tests
Test Manager's open interface allows you to create manual test definitions outside of Test Manager's user interface.
Test Definitions in the Manual Testing Client
While in Edit mode, the SilkCentral Manual Testing Client offers a full range of test definition editing functionality, including the addition, reordering, and removal of test steps and the insertion of custom step properties.


Converting Manual Tests to Automated Tests


You can convert a manual test to an automated test of any of the supported automated test types (SilkPerformer, SilkTest, NUnit, and JUnit) and all installed plug-ins. The process carries manual test parameters over to the automated test, and adds automated parameters to the new automated test definition. The manual test parameters that are carried over to automated test definitions are:

Name
Description
Assigned requirements
Assigned execution definitions
Assigned issues
Attachments
Test steps
Related Concepts
Test Plan Management
Managing Test Plans
Related Procedures
Managing Test Plans - Quick Start Task
Converting Manual Test Definitions to Automated Tests
Related Reference
Test Plan Unit Interface


Using External Tools to Create Manual Tests


Test Manager's open interface allows you to create manual test definitions outside of Test Manager's user interface. You can create your own solutions and automatically create manual test definitions by using Test Manager's Web Service calls. The following calls in Test Manager's tmplanning Web Service assist you in creating manual test definitions:

getTestContainers
addManualTest
getCustomStepPropertyNames
getChildNodes
getNodeDetails
addNode
updateNode
startExecution


For a detailed explanation of these Web Service calls, please refer to the Test Manager API Help.
Related Concepts
Test Plan Management
Managing Test Plans
Related Procedures
Managing Test Plans - Quick Start Task
Related Reference
Test Plan Unit Interface


Test Definitions in the Manual Testing Client


While in Edit mode, the SilkCentral Manual Testing Client offers a full range of test definition editing functionality, including the addition, reordering, and removal of test steps and the insertion of custom step properties. Manual test definition properties can be edited in both online and offline modes. Changes made in offline mode can be synchronized with the server whenever an Internet connection is available.
Edit mode enables editing of all test definition elements that can be edited through the Test Manager Web client. On the Details tab this includes the following values: Planned Time, Step Names, Custom Step Property Values, Step Description, and Expected Result. On the Description tab the Test Definition Description field can be edited, including the insertion of custom step property parameters. In Edit mode, multiple test steps can be selected within the Test Definitions window using standard Windows keyboard shortcuts. To apply a status change to selected test steps, right-click the selection and select a new status value.
In normal mode, you can only enter test results into the Result column and edit the status of test definitions and individual test steps. Statuses are changed by right-clicking status values and selecting an alternative status value.
Note: Editing of data-driven test definitions is not supported.

Parameters
Custom step property parameters can be inserted into test definition and test step descriptions. Parameters can be inserted into the Test Definition Description, Step Description, and Expected Result fields. In normal mode, parameter values are resolved (their parsed values are displayed in place of the parameters themselves). In Edit mode parameters are not resolved; the parameters themselves are displayed. When in Edit mode, using the Parameters list box on the Description tab toolbar, you can select preconfigured Test Manager parameters for insertion.

Change Conflict Handling


With the Manual Testing Client's offline editing functionality it's possible for multiple users to edit the same test definition at the same time. Test Manager automatically merges all uploaded changes into the Test Plan tree unless change conflicts arise (for example, if two users simultaneously edit the same manual test step). If your uploaded changes conflict with recent changes made by another user, upon upload, you will be presented with the Test Definition Conflicts dialog box so that you can manage the conflict. Using the Test Definition Conflicts dialog box you can specify if your changes should be saved as a part of the test definition and included in future runs of the test definition, or if they should be ignored. If you opt to have your changes ignored, they will still remain a part of the test results from the test run in which they were included.


Related Concepts
Test Definition Parameters
Manual Test Definitions
Test Definition Execution
Tour of the Manual Testing Client UI
Related Procedures
Managing Test Executions - Quick Start Task
Executing Manual Tests Using the Manual Testing Client
Working with Manual Tests
Editing Test Definitions Within the Manual Testing Client
Executing Test Definitions


SilkTest Test Plans


SilkTest test plan files can be uploaded from SilkTest directly into Test Manager. Exported test plan trees are then displayed in Test Manager's tree view. See the SilkTest documentation for full details on exporting test plan files.
In This Section
SilkTest Test Definitions
Unique characteristics of SilkTest test definitions.


SilkTest Test Definitions


When you observe running SilkTest executions on the Activities page (Projects unit, Activities tab), the currently running execution definition offers a hyperlink that opens a Details View. This view allows you to closely monitor the state of the currently running execution definition. For SilkTest test definition executions, the center component of this view consists of two parts: the upper part shows general information about the test definition, script, test case, and test data; the lower part shows all output messages generated by SilkTest (along with their severity).
Test Manager's SilkTest interface offers SilkTest users a reliable means of automation. Each test case of a SilkTest script executes within its own test definition execution and produces its own results. In previous versions of Test Manager, SilkTest invocation was implemented through a command-line interface. The new interface works using interprocess communication. You can specify whether or not Test Manager's SilkTest interface should be used by configuring test-container settings.
Defining SilkTest Test Properties
For all test parameters that are not defined through SilkTest test properties in the Test Manager GUI, SilkTest default settings are used (for example, from partner.ini). The following SilkTest test properties may be defined:

Test script - The test script (.t, .g.t) is defined relative to the test container's root node in the source control profile. This setting is required for all SilkTest test definitions.

Testcase - The test case can be selected from a list box or entered manually. If the test definition is not defined as data driven, the test case is required. If the custom test case field is already populated, the SilkTest test definition was automatically created (using the export functionality within SilkTest). If the custom field is used for specifying the test case, the test case name can be terminated by parentheses "()". In between the parentheses, test data may be specified (defined test data can also include parameters). Please note that this will override the values of the Test data property (see below).

Test data - Specification of test data is optional. If several arguments are passed to SilkTest, they have to be separated by a comma (,). If a String argument is passed to SilkTest, the argument must be enclosed in quotation marks (""). When test data is more complex, it is recommended that you use parameters in the test data, for example ${ParameterName}. Parameters are replaced automatically within test definition executions.

Data driven - When a SilkTest test requires input data from an external data source, this flag must be enabled. The default execution mode for data-driven tests is plan-based. If script-based execution mode is to be used for a data-driven test, change the DataDrivenScriptMode setting in the SilkTest element of SccExecServerBootConf.xml.

Option set - Specification of an option set file is optional. By default, Test Manager closes all open SilkTest option set files. To specify an option set file, specify the file name relative to the test container's root node in the source control profile.
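For example, a hypothetical Test data value that passes a string and a parameterized value to a test case might look like this:

"SomeUser", ${Password}

Here, SomeUser and Password are illustrative names only; ${Password} would be replaced at execution time by the value of a parameter named Password defined for the test definition.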


Related Concepts
Test Plan Management
Managing Test Plans
Related Procedures
Managing Test Plans - Quick Start Task
Creating Test Definitions
Editing SilkPerformer Tests
Adding Test Containers
Related Reference
Test Plan Unit Interface


Test Definitions
This section explains certain test definition types and Test Manager's Upload Manager.
In This Section
Upload Manager
SilkCentral's Upload Manager offers a convenient means of uploading files (typically test scripts) to the SilkCentral file pool.
Windows Script Host Tests
Windows Script Host (WSH) is part of the Windows platform and creates an environment for hosting scripts.


Upload Manager
SilkCentral's Upload Manager offers a convenient means of uploading files (typically test scripts) to the SilkCentral file pool where they are accessible to Test Manager. For information about uploading files to Issue Manager, please see the SilkCentral Issue Manager Help. With Issue Manager's integration with TechSmith's SnagIt screen capture utility, Issue Manager users can easily capture screen images of error conditions and upload them to Issue Manager where they can be attached to existing issues or serve as the basis for new issues.
Upload Manager can be accessed in one of two ways:

By starting its executable from the program directory to which it has been installed (for example, C:\Program Files\Borland\SC Test Manager <version>\Upload Manager\UploadManager.exe).

(Issue Manager users only) Using a hotkey keyboard combination (for example, Ctrl+Print Screen). This option is enabled when the SnagIt screen capture utility is configured for use with Issue Manager. See the Issue Manager documentation for details.

Note: For information about uploading files to Issue Manager, please see Issue Manager documentation.

Command Line Options
Upload Manager's settings can be defined through command line options. Defining command line options optimizes SnagIt's integration with SilkCentral by automating formatting and configuration operations that you would otherwise have to execute manually with each file upload.
Note: Settings configured through command line options override any conflicting settings that may have been manually configured using the SnagIt GUI.
Here is the format to follow when constructing command line options:
UploadManager.exe {options} {UploadFileNameList}
The order in which option settings are listed does not affect application behavior. Option parsing is not case-sensitive. Option parameters must not be separated by blank spaces. The following command line options are supported by Upload Manager:
-HOSTNAME:<HostName> - The target server. Notice that the server name is not preceded by a protocol. For example, -HOSTNAME:tm.borland.com
-USERNAME:<UserName> - The user name to be used for server login. For example, -USERNAME:admin
-PASSWORD:<Password> - The password to be used for server login. For example, -PASSWORD:secret
-PORT:<PortNo> - The targeted server port. The default http:// port is 80, the default https:// port is 443, and the default test server port is 19120. For example, -PORT:19120
-SECURE:<0 or 1> - For https://myhost:myport/ connections: 0 sets a standard HTTP connection, 1 sets a secure HTTPS connection. For example, -SECURE:1
-CONFIG:<ConfigurationId> - The configuration ID of the targeted server, used for uploading files to Issue Manager and SilkCentral: 2 is used for uploading files to Issue Manager, 3 is used for uploading files to the SilkCentral server file pool. For example, -CONFIG:2
-PROJECT:<ProjectId> - For example, -PROJECT:0, where 0 is the default demo project in Issue Manager. To identify an Issue Manager project's ID#, pass your cursor over an active project name on the Issue Manager Projects page, then look for the following string in your browser's status bar: imPrj=<project ID #>.
-DEFID:<DefectId> - ID of a specific issue in Issue Manager to which a file is to be attached. For example, -DEFID:156 attaches the file to issue #156. 0 is used to create a new issue.
-DESC:<AttachmentFileDescription> - Description of the attached file (Issue Manager only). For example, -DESC:Screen Shot Attachment
-AUTOCLOSE - Checks the Upload Manager dialog check box that instructs Upload Manager to close after it successfully completes an upload.
-AUTOUPLOAD:<WizardStepIndex> - Instructs Upload Manager to attempt automatic file upload up through the specified wizard step (WizardStepIndex). Upload Manager then prompts the user for manual input at the following wizard step. Automatic file upload stops at any step within which a configuration failure is detected. For example, -AUTOUPLOAD:10
-VERBOSE - Causes a dialog box to be displayed if a dependent component is missing (for example, if a SilkPerformer compiler is unavailable when a SilkPerformer project is to be uploaded to SilkCentral).
<UploadFileNameList> - One or more absolute file paths to upload, for example C:\TEMP\MyScreenShot.jpg. Paths can be used when uploading multiple files to the SilkCentral file pool. Any parameter passed to Upload Manager that is not preceded by a dash (-) is recognized as an absolute file path.
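For example, the following invocation (the values shown are illustrative only) uploads a screenshot to the SilkCentral server file pool:

UploadManager.exe -HOSTNAME:tm.borland.com -USERNAME:admin -PASSWORD:secret -PORT:19120 -CONFIG:3 C:\TEMP\MyScreenShot.jpg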

Related Concepts
Test Plan Management
Managing Test Plans
Related Procedures
Managing Test Plans - Quick Start Task
Using Upload Manager
Related Reference
Test Plan Unit Interface


Windows Script Host Tests


Windows Script Host (WSH) is part of the Windows platform and creates an environment for hosting scripts. That is, when a script is to be run at the execution server, WSH plays the role of host. It makes objects and services available for the script and provides a set of guidelines within which the script is executed. Among other things, Windows Script Host manages security and invokes the appropriate script engine.
Here are some online WSH resources that you may find valuable:
WSH Resources: http://labmice.techtarget.com/scripting/WSH.htm , http://www.winguides.com/links.php?guide=scripting
Downloadable Script Interpreters: http://aspn.activestate.com/ASPN/

Supported Script Languages


WSH is language-independent for WSH-compliant scripting engines. Natively, the Windows platform supports Visual Basic scripts (file extension .vbs) and scripts written in the JScript language (file extension .js). For other scripting languages, a dedicated script interpreter must be installed on the execution server. For example, if a customer installs a Perl interpreter on an execution server, this registers a Perl scripting engine in the WSH environment for the extension .pls. Whenever a file with the extension .pls is passed to the WSH tool (cscript.exe), the tool invokes the appropriate interpreter based on the file extension. So the client of WSH (in this case the SilkCentral Execution Server) does not need to know about the installation of the Perl interpreter.
Note: After installing a script interpreter (for example, ActivePerl), before executing a script in Test Manager, try to execute the script locally on the execution server by calling the WSH command line tool with a sample script. To do so, open a command shell on the execution server and enter:

cscript <somescript>
...where <somescript> is the path to a script of your choice that is available on your execution server.

This is exactly what SilkCentral Test Manager will call when executing a WSH test definition on an execution server. If the script is executed, then the scripting engine has been registered successfully. These are the scripting languages that are known to be WSH compatible:

Perl (extension .pls)
Python (extensions .py, .pyw)
REXX
TCL (extension .tcl)

WSH Test Properties


For the Script property, you can define any file where a script engine is registered for the script language the file contains. Script files under source control are deployed automatically to execution servers (comparable to test sources for other test definition types).


Switches
For the Switches test definition property, the following settings can be entered and passed to cscript.exe during test definition execution:
//B - Batch mode; suppresses all non-command-line console UI requests from the script. It is recommended that you use this option to prevent a script from waiting for user input during unattended executions at the execution server.
//U - Unicode is used for redirected I/O from the console (recommended).
//T:nn - Time-out, in seconds. Maximum time the script can run (default = no limit). This option is used to prevent excessive execution of scripts; it sets a timer. When execution time exceeds the specified value, Cscript interrupts the script engine using the IActiveScript::InterruptThread method and terminates the process. There is a callback hook: if the time-out is invoked, the OnTimeOut function is called to permit cleanup. Although it is possible to create infinite loops using this feature, it is more useful than harmful.
//logo - Displays an execution banner at execution time that is visible at the beginning of the log.txt log file. This is the default setting.
//nologo - Prevents display of the execution banner at execution time.
//D - Enables active debugging.
//E:engine - Uses the specified engine to execute the script.
//Job:xxxx - Executes a WSF job.
//X - Executes the script in the debugger.
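For example, a Switches value of //B //T:60 (the values are illustrative) would run the script in batch mode and limit its execution time to 60 seconds.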

Parameter Usage in WSH Tests


Parameters that are defined for a WSH test definition automatically add a name/value pair to the command line as an additional argument and set the parameters as environment variables for the called process. This functionality allows you to access all parameters defined for your test definition within the WSH script. Example: A WSH test definition is defined with myscript.js as script and //B as switch. The test requires a parameter called IPAddress with the value 192.168.1.5 and another parameter called Port with the value 1492. The resulting command line for the WSH execution in this example is:
cscript myscript.js //B IPAddress=192.168.1.5 Port=1492

Returning Success Information


To collect the results of a WSH execution, the WSH script must generate a file called output.xml in the current working directory of the WSH test. All files residing in this directory are stored in the database and are downloadable through the list of files for the test definition execution. Files are excluded from storage when their extensions are defined under the File extensions to ignore in results property in Test Manager's Projects unit (accessible through the Test Manager GUI).
Note: The current working directory is dynamically created for each WSH execution. Do not use an absolute path when creating the file. Any relative path used will correctly refer to the current working directory.


Log Information
Any information that a script writes to the WSH standard output goes into the log.txt text file that resides in the current working directory. This file is stored in the database and can be viewed as it is included in the file list of the test definition execution. Example for printing log information:

WScript.Echo "This info will be written to the log.txt file"

Structure of Output.xml
The XML structure begins with an element ResultElement that defines an attribute named TestItem, which specifies the name of the ResultElement. The ResultElement must contain an element named ErrorCount, optionally an element named WarningCount, and a list of Incident elements. The ErrorCount and WarningCount elements must contain a positive number or zero. In SilkCentral Test Manager, the ErrorCount and WarningCount of the top-level* ResultElement are used for evaluating success conditions, which determine if a test has passed or failed. The Incident element represents an event that happened during the execution of the WSH test. Message and Severity are shown in the messages list of test definition executions in SilkCentral Test Manager's GUI. An Incident element must contain a Message and a Severity element. The Severity element must hold one of the following values:

Info
Warning
Error (or Exception)
Failure


Note: Up through SilkCentral Test Manager 8.1, the value of the Message element had to be URL encoded (ISO-8859-1). Since version 8.1.1, URL encoding is no longer allowed.
*The XML file may contain additional elements that are not visible in the SilkCentral Test Manager GUI. The output.xml file is, however, stored in the database and is viewable, as it is included in the file list of the executed test definition.

Storing Additional Information in the Result File


The ResultElement may contain any number of sub-ResultElements, so information can be easily grouped. Sub-ResultElements make the result file easier to read. For compatibility reasons related to unit tests (JUnit, NUnit), ResultElement may be named TestSuite or Test. The ResultElement may contain additional elements:

FailureCount (treated the same way as the error count)
RunCount (if a test is run multiple times)
Timer (for example, for the duration of the test)
WasSuccess (compatibility with NUnit result files)
Asserts (compatibility with NUnit result files)
The Incident element may contain a list of Detail elements. The Detail element represents detailed information about an Incident. It must define a TestName element and an Info element. The TestName is used to give detailed information about where the Incident happened. The Info element holds detailed information about the Incident (for example, a stack trace).
Note: Up through SilkCentral Test Manager 8.1, the value of the Info element had to be URL encoded (ISO-8859-1). Since version 8.1.1, URL encoding is no longer allowed.

Sample Result File


<ResultElement TestItem="WshOutputTest">
  <ErrorCount>1</ErrorCount>
  <WarningCount>1</WarningCount>
  <Incident>
    <Message>some unexpected result</Message>
    <Severity>Error</Severity>
    <Detail>
      <TestName>function main()</TestName>
      <Info>some additional info; eg. stacktrace</Info>
    </Detail>
  </Incident>
  <Incident>
    <Message>some warning message</Message>
    <Severity>Warning</Severity>
    <Detail>
      <TestName>function main()</TestName>
      <Info>some additional info; eg. stacktrace</Info>
    </Detail>
  </Incident>
</ResultElement>

Java Script Sample


The following script was used to generate the sample result file shown above. To try this script, save it with the extension .js.

function dumpOutput(dumpFile) {
    dumpFile.WriteLine("<ResultElement TestItem=\"WshOutputTest\">");
    dumpFile.WriteLine(" <ErrorCount>1</ErrorCount>");
    dumpFile.WriteLine(" <WarningCount>1</WarningCount>");
    dumpFile.WriteLine(" <Incident>");
    dumpFile.WriteLine(" <Message>some unexpected result</Message>");
    dumpFile.WriteLine(" <Severity>Error</Severity>");
    dumpFile.WriteLine(" <Detail>");
    dumpFile.WriteLine(" <TestName>function main()</TestName>");
    dumpFile.WriteLine(" <Info>some additional info; eg. stacktrace</Info>");
    dumpFile.WriteLine(" </Detail>");
    dumpFile.WriteLine(" </Incident>");
    dumpFile.WriteLine(" <Incident>");
    dumpFile.WriteLine(" <Message>some warning message</Message>");
    dumpFile.WriteLine(" <Severity>Warning</Severity>");
    dumpFile.WriteLine(" <Detail>");
    dumpFile.WriteLine(" <TestName>function main()</TestName>");
    dumpFile.WriteLine(" <Info>some additional info; eg. stacktrace</Info>");
    dumpFile.WriteLine(" </Detail>");
    dumpFile.WriteLine(" </Incident>");
    dumpFile.WriteLine("</ResultElement>");
}

function main() {
    var outFile;
    var fso;
    fso = WScript.CreateObject("Scripting.FileSystemObject");
    outFile = fso.CreateTextFile("output.xml", true, true);
    outFile.WriteLine("<?xml version=\"1.0\" encoding=\"UTF-16\"?>");
    dumpOutput(outFile);
    outFile.Close();
    WScript.Echo("Test is completed");
}

main();
WScript.Quit(0);

Visual Basic Script Sample


The following script generates an Output.xml result file as shown in "Sample Result File". To try this script, save it with the extension .vbs.

WScript.Echo "starting" Dim outFile Dim errCnt Dim warningCnt outFile = "output.xml" errCnt = 1 ' retrieve that from your test results warningCnt = 1 ' retrieve that from your test results Set FSO = CreateObject("Scripting.FileSystemObject") Set oTX = FSO.OpenTextFile(outFile, 2, True, -1) ' args: file, 8=append/2=overwrite, create, ASCII oTX.WriteLine("<?xml version=""1.0"" encoding=""UTF-16""?>") oTX.WriteLine("<ResultElement TestItem=""PerlTest"">") oTX.WriteLine(" <ErrorCount>" & errCnt & "</ErrorCount>") oTX.WriteLine(" <WarningCount>" & warningCnt & "</WarningCount>") oTX.WriteLine(" <Incident>") oTX.WriteLine(" <Message>some unexpected result</Message>") oTX.WriteLine(" <Severity>Error</Severity>") oTX.WriteLine(" <Detail>") oTX.WriteLine(" <TestName>function main()</TestName>") oTX.WriteLine(" <Info>some additional info; eg. stacktrace</Info>")

114

oTX.WriteLine(" </Detail>") oTX.WriteLine(" </Incident>") oTX.WriteLine(" <Incident>") oTX.WriteLine(" <Message>some warning message</Message>") oTX.WriteLine(" <Severity>Warning</Severity>") oTX.WriteLine(" <Detail>") oTX.WriteLine(" <TestName>function main()</TestName>") oTX.WriteLine(" <Info>some additional info; eg. stacktrace</Info>") oTX.WriteLine(" </Detail>") oTX.WriteLine(" </Incident>") oTX.WriteLine("</ResultElement>")

Related Concepts
Test Plan Management
Test Definition Parameters
Managing Test Plans
Related Procedures
Managing Test Plans - Quick Start Task
Creating Test Definitions
Editing Windows Scripting Host Tests
Adding Test Containers
Related Reference
Test Plan Unit Interface


Test Definition Execution


This section explains how to manage execution definitions, including assigning test definitions, scheduling test runs, setting up dependencies, configuring dynamic hardware provisioning with keywords, and configuring a deployment environment.
Test Manager's Execution unit enables you to maintain control over test executions during development and testing. The Execution unit enables you to configure execution definitions, schedule execution definitions, assign test definitions to execution definitions, set up execution-definition dependencies, configure execution-server deployment, and configure dynamic hardware provisioning with keywords. Executions are displayed, organized, and maintained through a hierarchical tree structure in the Execution tree. Each execution may have any number of child test definitions associated with it. The Execution tree enables you to organize executions within folders, in any number of hierarchy levels.
In This Section
VMware Lab Manager Integration
This section explains Test Manager's integration with VMware Lab Manager.
Execution Dependency Configuration
Enables you to configure the automatic execution of one execution definition based on the results of another execution definition.
Execution Definitions
Execution definitions are collections of assigned test definitions that are stored in a single test container.
Execution Definition Run Results Dialog
Lists run details of an execution definition.
Execution Definition Schedules
You can define the schedules by which execution definitions are executed.
Setup and Cleanup Test Definitions
Setup test definitions prepare testing environments in anticipation of tests. Cleanup test definitions restore test environments to their original state following tests.
Calculating the Test Definition Status
Describes how the status of a test definition is calculated from the statuses of the test steps included in the test definition.
Manual Test Definitions
This section explains the execution of manual tests in Test Manager.
SilkTest Tests
This section explains how to execute test definitions in SilkTest.


VMware Lab Manager Integration


This section explains Test Manager's integration with VMware Lab Manager.
In This Section
VMware Lab Manager Virtual Configurations
VMware Lab Manager configurations offer an effective means of virtualizing complex software-testing lab environments.


VMware Lab Manager Virtual Configurations


VMware images are virtual computer systems. VMware Lab Manager is used to manage VMware images, or "configurations", which are combinations of images (for example, database server, application server, and execution server). VMware configurations offer an effective means of virtualizing complex software-testing lab environments. Configurations are typically deployed from VMware Lab Manager libraries. Configurations are turned on and off just like individual VMware images. Multiple instances of the same configuration can be deployed simultaneously, with separate tests run in each instance. VMware configurations are network-fenced, meaning that they do not influence each other's network behavior. VMware LiveLink technology enables you to take snapshots of complete configurations that can later be recreated (or restored) on demand.
VMware Lab Manager's integration with SilkCentral Test Manager enables users to manage VMware Lab Manager directly from Test Manager's UI. Integrated functionality includes configuration deployment, test execution, result collection, and automatic undeployment of configurations. Test Manager can support multiple VMware Lab Manager installations and configurations. Configurations captured using LiveLink technology are viewed using VMware Lab Manager.
Note: See the VMware Lab Manager documentation for full details regarding LiveLink configuration captures and other VMware Lab Manager functionality.
Note: At least one Test Manager execution server must exist within each configuration. These execution server instances control test execution within configurations and retrieve test results.
Note: See the SilkCentral Administration Module documentation for details on configuring Test Manager's integration with VMware Lab Manager.
Note: VMware Lab Manager users must have administrator rights to access the VMware Lab Manager API.
Related Concepts
Test Definition Execution
Related Procedures
Managing Test Executions - Quick Start Task
Executing Individual Tests
Configuring Deployment Environments
Executing Test Definitions
Related Reference
Execution Deployment tab
Activities Page


Execution Dependency Configuration


An execution dependency allows you to configure the automatic execution of one execution definition based on the results of another execution definition (for example, if execution definition 'A' fails, automatically execute execution definition 'B').
Related Concepts
Test Definition Execution
Execution Definition Schedules
Related Procedures
Managing Test Executions - Quick Start Task
Adding Dependent Execution Definitions
Executing Test Definitions
Related Reference
Execution Dependencies tab


Execution Definitions
Execution definitions are collections of assigned test definitions that are stored in a single test container. Execution definitions can be run at configurable schedules and deployed on specified execution servers. The process of adding and editing execution definitions is the same for both automated execution definitions and manual execution definitions.

Dynamic Hardware Provisioning with Keywords


Test Manager's hardware-provisioning technology helps you manage test environments that include numerous execution servers. Rather than having to configure a one-to-one static execution-server assignment for each automated execution definition, keywords enable Test Manager to automatically select the most appropriate execution server for each execution definition. This is done through dynamic comparison of each execution definition's keyword list with the keyword lists of all active execution servers. Keywords typically describe your execution environment requirements (for example, platform, operating system, and pre-installed applications). There are different uses for keywords, depending on whether the execution definition is executed automatically or manually.
Automated execution definitions - When an automated execution definition is executed, Test Manager compares the execution definition's keywords with the keywords of all available execution servers. The execution is then run on the first-identified execution server that has a matching keyword list.
Manual execution definitions - For manual execution definitions, the manual tester can reflect the test environment by using keywords.
If you require an automated execution definition to be run on multiple execution servers, create a copy of the execution definition and assign additional keywords to the execution definition that match other execution servers.

Reserved Default Keywords


If you do not require hardware provisioning to execute automated execution definitions, you can use the reserved keywords that are created automatically for each execution server. In such cases, it is not necessary to manually assign keywords to your execution servers. Instead, you can configure a one-to-one static execution-server assignment for each execution definition.
A reserved keyword is assigned automatically to each newly created execution server. Reserved keywords are structured in the following form: #<execution server name>@<location name>. Reserved keywords are only available when assigning keywords to execution definitions. They are neither available nor applicable when assigning keywords to execution servers.
In addition to the reserved keywords that are set up automatically for each defined execution server, reserved keywords are also set up for each execution server type:
#PHYSICAL - Limits execution-server provisioning to physical execution servers.
#VIRTUAL - Limits execution-server provisioning to virtual execution servers.
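Following the pattern shown above, an execution server named WinXP-Exec01 at a location named Vienna (both names are hypothetical) would automatically receive the reserved keyword #WinXP-Exec01@Vienna.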

Keywords and Virtual Execution Servers


Keywords are assigned to virtual execution servers in the same way that they are assigned to physical execution servers. When you configure at least one virtual execution server, the #VIRTUAL keyword is dynamically created and made available for assignment to all execution definitions. If you prefer that an execution occur on a virtual machine, select the #VIRTUAL keyword for the execution definition. When an execution definition has neither the #VIRTUAL nor the #PHYSICAL keyword, the execution may occur on either a virtual or a physical execution server, assuming the settings of the execution environments are the same. When an execution definition's keywords match multiple virtual execution servers, the first matching virtual execution server that is identified is selected.

Folder Execution
Execution definitions can be combined into execution folders, where a folder can include execution subfolders and execution definitions. The options for an execution definition execution are also available for an execution folder execution. When executing a folder, the contained subfolders and execution definitions are treated as follows:
The Relation of Execution and Keywords
If the executed folder has no keywords and the contained execution definition/subfolder has no keywords: execution definitions without keywords obtain the status Not Executed after execution.
If the executed folder has no keywords and the contained execution definition/subfolder has keywords: execution servers are assigned based on the execution definition/subfolder keywords.
If the executed folder has keywords and the contained execution definition/subfolder has no keywords: execution servers are assigned based on the folder keywords.
If the executed folder has keywords and the contained execution definition/subfolder has keywords: execution servers are assigned based on the folder keywords.

Note: When a folder is executed manually and there are no keywords assigned, or no execution server exists for the assigned keywords, the default execution server is used for execution. If the default execution server is not available, these execution definitions are marked as "Not Executed".

Upgrading Execution Definitions


Automated execution definitions created with Test Manager 2008 or lower could be assigned to multiple execution servers. With the new dynamic hardware provisioning, introduced in Test Manager 2008 R2, automated execution definitions are assigned to one physical execution server only. If you upgrade to SilkCentral Test Manager 2008 R2, all automated execution definitions that previously had multiple execution servers assigned will be assigned to one of the previously defined execution servers. Modified automated execution definitions are marked with an exclamation mark (!) in front of them in the Execution tree. Additionally, a log file is generated on the application server, listing the exact changes to each of the modified automated execution definitions. The generated log file (dbupgrade6001.xml) resides on the application server in the Documents and Settings\<username>\Application Data\Borland\SCC35\log\ folder. If the application server runs under the system user, the default path is Documents and Settings\All Users\Application Data\Borland\SCC35\log\.

Test Status Calculation


Each execution definition has one of the following status conditions:
Passed - All considered test definition executions have the status Passed.
Failed - At least one considered test definition execution has the status Failed, but none of the test definition executions has the status Not Executed.
Not Executed - At least one considered test definition has the status Not Executed.

A test definition gets its status from the result of the latest test execution run. If you manually change the status of the latest test execution run, the test definition status changes also.


Note: If the latest test execution run is deleted, the status of the test definition resets to the status of the latest existing test execution run. Only test execution runs with the status Passed or Failed are used to reset the test definition status; test execution runs with the status Not Executed are ignored.
Note: If the deleted test execution run was the only existing test execution run, the status of the test definition is set to Not Scheduled, as if the test definition were newly created.
Related Concepts
Test Definition Execution
Execution Definition Schedules
Related Procedures
Managing Test Executions - Quick Start Task
Adding Execution Definitions
Assigning Keywords to Execution Definitions
Executing Test Definitions
Related Reference
Execution Unit Interface


Execution Definition Run Results Dialog


The Execution Definition Run Results dialog lists the run details of an execution definition. The dialog can be accessed from the following location: Test Manager > Projects > Activities > Last Executions > Run ID. You can also access the Execution Definition Run Results dialog from Test Manager > Execution: select the execution definition for which you want to see details, click the Runs tab, right-click the run, and choose View details. The Execution Definition Run Results dialog shows detailed information about the following items:
Execution Definition Name - Name of the execution definition.
Execution Definition ID - Unique identifier of the execution definition.
Execution Definition Run ID - Identifier of the execution definition run.
Start Time - Time the run was started.
Duration - Duration of the run.
Execution Server - Execution server assigned to the execution definition.
Warnings/Errors - Number of warnings and errors generated during the run.
Status - Status of the execution definition after the run.
Version/Build - Version and build of the product specified for the run.
SilkTest AUT Host Name - Name of the SilkTest AUT (Application Under Test) host.
Setup Test Definition - Test definition that prepared the testing environment in anticipation of the test. Click the name of the test definition to view or edit it. Click the ID of the test definition run to open the Test Definition Run Results dialog box.
Cleanup Test Definition - Test definition that restored the testing environment to its original state following the test. Click the name of the test definition to view or edit it. Click the ID of the test definition run to open the Test Definition Run Results dialog box.

The Execution Definition Run Results dialog provides additional information about the files included and the messages generated during the execution definition run. It also lists all the assigned test definitions for the execution definition. For manual tests, click Manual Test Results to get a read-only version of the current run's page, with detailed information on the manual test. Uncheck the Hide passed test definition runs check box to show all test definitions. The Hide passed test definition runs check box is checked by default so that only the test definitions that did not pass are shown. The Assigned Test Definitions section lists all test definitions that are assigned to this execution definition. Click the name of a test definition to view or edit it, or click the Run ID of a test definition to open the Test Definition Run Results dialog box.
Related Concepts
Test Definition Execution
Related Procedures
Managing Test Executions - Quick Start Task
Creating Test Definitions
Working with Data-Driven Tests
Executing Test Definitions
Related Reference
Test Definition Run Results Dialog
Execution Runs Tab
Current Run Page


Execution Definition Schedules


Once you have defined the test definitions that are to be included in an execution definition, you can define the schedule by which the execution definition is to be executed. This is done using the Schedule tab (Execution unit, Execution View). Three scheduling options are available:

No schedule (None)
Use a pre-defined schedule (Global)
Define a custom schedule (Custom)
Note: Schedules can be defined for entire folders as well as individual execution definitions. If a schedule is defined for a folder, all execution definitions that are included in the selected folder will be executed at the specified schedule. Execution definitions with no keywords assigned get the status "Not Executed" when executed in a schedule.

Schedule Exclusions
Exclusions enable you to define weekdays and time-of-day intervals during which test definitions are not to be executed, regardless of configured schedules. For example, you may not want tests to be executed on weekends.

Definite Runs
Definite runs enable you to define times at which test definitions will be executed regardless of configured schedules.
Related Concepts
Test Definition Execution
Execution Definitions
Related Procedures
Managing Test Executions - Quick Start Task
Creating a Custom Schedule for an Execution Definition
Adding Definite Runs
Adding Exclusions
Adding Execution Definitions
Executing Test Definitions
Related Reference
Execution Schedule tab


Setup and Cleanup Test Definitions


Test Manager's pre-test setup and post-test cleanup functionality enables you to define a setup test definition and a cleanup test definition for each execution definition. Setup test definitions are typically built upon scripts or manual procedures that prepare testing environments in anticipation of tests. Cleanup test definitions typically include scripts or manual procedures that restore test environments to their original state following tests.
You must create your setup and cleanup test definitions before you can assign them to execution definitions. Any test definition can serve as a setup or cleanup test definition, except parent test definitions of multiple data-driven test definition instances. Setup and cleanup test definitions require no special configuration and can be either automated or manual. The only requirement is that they perform the required setup and cleanup processes within your test environment. In the case of automated test definitions, these are scripts that perform the required setup and cleanup tasks. In the case of manual test definitions, these are manual setup and cleanup tasks.
Note: The challenge in executing setup and cleanup test definitions is preventing their results from being aggregated with the results of the regular test definitions that they support. Test Manager addresses this concern by running setup and cleanup test definitions (both automated and manual) in independent execution definitions, thereby isolating actual test results from incidental performance fluctuations that may be caused by setup and cleanup test definitions.

Combining Automated and Manual Test Definitions


Test Manager supports execution definitions that include combinations of automated test definitions and manual test definitions. Test Manager withholds execution of regular test definitions (both automated and manual) until setup test definitions are complete. Test Manager also ensures that all regular test definitions are complete before cleanup test definitions are run. When manual test definitions are combined with automated tests, automated tests (on all execution servers) do not begin until the setup processes are complete. In the case of manual setup test definitions, regular automated tests begin only after manual setup routines are complete.

Aborting Execution Definitions


When setup test definitions are aborted, regular tests do not execute; however, cleanup test definitions do execute to restore the testing environment to its original state.
Related Concepts
Test Definition Execution
Execution Definitions
Related Procedures
Managing Test Executions - Quick Start Task
Configuring Setup and Cleanup Executions
Creating Test Definitions
Adding Execution Definitions
Working with Data-Driven Tests
Executing Test Definitions
Related Reference
Execution Setup/Cleanup tab


Calculating the Test Definition Status


When you execute a test, the statuses of the test steps determine the overall status of the test definition. The following statuses are available for a test step or test definition:

Not Executed
Passed
Failed
Unsupported
Unresolved
In Progress (test definitions only)
The calculation of the test definition status is based on the following rules, where the rules with lower numbers override the rules with higher numbers:
1. If the status of the test definition is Not Executed, and you change the status of a test step, the test definition status is set to In Progress.
2. As long as there is at least one step with the status Not Executed, the test definition status remains In Progress.
3. If there is at least one step with the status Failed or Unresolved, the test definition status is set to Failed.
4. If the status of every test step is Passed or Unsupported, the test definition status is set to Passed.
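The following sketch is for illustration only and is not Test Manager's actual implementation; it simply expresses the four rules above in Java, assuming the step statuses are given as a list:

import java.util.List;

enum Status { NOT_EXECUTED, PASSED, FAILED, UNSUPPORTED, UNRESOLVED, IN_PROGRESS }

class StatusCalculation {
    static Status calculateTestDefinitionStatus(List<Status> stepStatuses) {
        // Rules 1 and 2: any step that is still Not Executed keeps the test definition In Progress.
        if (stepStatuses.contains(Status.NOT_EXECUTED)) {
            return Status.IN_PROGRESS;
        }
        // Rule 3: at least one Failed or Unresolved step fails the test definition.
        if (stepStatuses.contains(Status.FAILED) || stepStatuses.contains(Status.UNRESOLVED)) {
            return Status.FAILED;
        }
        // Rule 4: every step is Passed or Unsupported.
        return Status.PASSED;
    }
}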

Related Procedures
Executing Manual Tests in the Current Run Page
Related Reference
Current Run Page


Manual Test Definitions


This section explains the execution of manual tests in Test Manager. Manual test results integrate seamlessly with automated test results. Test Manager's project status and test definition status analysis features operate the same with manual test results as they do with automated test results. The results of manual test definition executions are saved alongside the results of automated test definition executions in a central repository. Result files (for example, screengrabs) can also be attached to manual test definitions.
In This Section
Manual Test Execution
Understanding manual-test execution in Test Manager.
Tour of the Manual Testing Client UI
An overview of the main elements of the Manual Testing Client user interface.
Manual Testing Client
The Manual Testing Client offers the full manual-test execution functionality through an Eclipse-based client tool. The Manual Testing Client is the recommended tool for executing manual tests with Test Manager.


Manual Test Execution


Although manual tests are scheduled and managed alongside automated execution definitions, their actual execution is quite different, as they must be physically executed by a human tester. Test Manager's Execution unit includes a manual testing interface that facilitates the running of manual tests. The scheduling of manual tests and the association of manual test definitions with execution definitions work the same as with automated tests.
Tip: The Manual Testing Client, an Eclipse-based client tool, is the recommended tool for executing manual tests with Test Manager.
Related Concepts
Test Definition Execution
Execution Definitions
Tour of the Manual Testing Client UI
Related Procedures
Managing Test Executions - Quick Start Task
Executing Manual Tests Using the Manual Testing Client
Working with Manual Tests
Executing Test Definitions
Related Reference
Current Run Page
Execute Test Dialog Box

128

Tour of the Manual Testing Client UI


Test Manager's Manual Testing Client is an Eclipse-based client tool that provides the functionality you need to efficiently execute manual tests. Here is an overview of the main elements of the Manual Testing Client user interface. Note: Eclipse does not initialize views until they are activated for the first time, so certain tabs within the Manual Testing Client will not display their contents until they are selected. For example, the Attachments tab will not display the number of attachments it tracks until it is selected.

Basic UI Structure
Manual Testing Client's GUI features eight main areas:

Menu Bar
The menu bar provides editing options for test definitions and views and access to Help. You can use the menu bar to edit user, appearance, or test settings, to navigate between execution packages, or to access Help. The available menus are listed in the following:

129

File: Provides the same functionality as the workflow bar and additional functionality for package import and export, changing users, switching the Manual Testing Client between online and offline mode, and storing executed package results to Test Manager.
Edit: Set the execution status of an execution package or test definition, add a result file to an execution package or test definition, set the build number for an execution package, or edit the code analysis settings of an execution package. The Edit menu contains the following items:
    Set as <Status>: Set the execution status of the execution package or test definition to the selected status.
    Add Result File...: Add a result file to an execution package or test definition.
    Edit Build Number...: Select the build number for an execution package.
    Edit Code Analysis Settings...: Specify hostnames to include in code coverage runs for an execution package. The hostnames must be separated by commas. For example: labmachine1, 192.168.0.1:19129.
    Delete: Delete an execution package. You can delete only execution packages with completed runs.
Go: Navigate between the available execution packages and test definitions.
Window: Edit the view and the preferences of the Manual Testing Client. The Window menu contains the following items:
    Show View: Activate and deactivate the available views in Manual Testing Client's Workspace.
    Reset Perspective...: Reset the current Manual Testing Client perspective to the default state.
    Preferences...: Set the preferences for the Manual Testing Client.
Help: Access the documentation and About page of the Manual Testing Client.

Workflow Bar
The workflow bar gives you quick access to the basic functions you can perform with the Manual Testing Client. The following buttons are available in the workflow bar:
Download: Download the execution packages from SilkCentral Test Manager.
Execute: Execute a test in the Manual Testing Client.
Finish: Stop the execution of a test before the test is fully executed.
Upload: Upload the execution packages to SilkCentral Test Manager.

Inbox
The Inbox lists all execution packages that have been downloaded to the Manual Testing Client for manual execution. Multiple execution packages can be selected within the Inbox window using standard Windows keyboard shortcuts. The Inbox includes the following properties for the selected test definition:

130

Package Name: Name of the execution package that has been downloaded from Test Manager.
ID: Execution package number that has been generated for this execution package.
Status: Status of the execution package (available values include Not Executed, Passed, Failed, Unresolved, and Unsupported). Execution package status values can not be edited. Overall package status is determined by the composition of the test definition statuses (and, by extension, the statuses of the test steps) that are contained within a package. For example, overall package status remains Not Executed as long as a test step of one of the contained test definitions has a status of Failed. So, even if some of a test definition's steps have passed, the overall status of the package will remain Not Executed until the package is finished and all unexecuted test definitions are assigned a status on the Finish Run dialog. The overall status will be considered Passed when one or more test steps or test definitions in a package are Passed and all unexecuted steps and test definitions are resolved through the Finish Run dialog.
Priority: Priority of the execution package.
Keywords: All keywords that are assigned to the execution package.
Started At: When testing of the execution package began.
Project: Name of the Test Manager project from which the execution package was derived.
Version: Product version from which the execution package was derived.
Build: Product build from which the execution package was derived.
Execution Path: File path where this execution definition resides in Test Manager's Executions Tree.

Completed Runs
The Completed Runs tab lists all execution packages for which testing is complete. The Completed Runs tab includes the following properties for the selected execution package:
This column offers a status icon that indicates the upload status of corresponding execution packages. A red arrow icon indicates that a package's results have not yet been uploaded. A checkmark icon over a faint arrow icon indicates that a package's results have already been uploaded to the server. Double-click this icon to open the Execute Test dialog for the first test definition of this execution package.
Package Name: Name of the execution package that has been downloaded from Test Manager.
ID: Execution package number that has been generated for this execution package.
Status: Status of the execution package. Status values can not be edited on the Completed Runs tab. Overall package status is determined by the composition of the test definition statuses (and, by extension, the statuses of the test steps) that are contained within a package. For example, overall package status remains Not Executed as long as a test step of one of the contained test definitions has a status of Failed. So, even if some of a test definition's steps have passed, the overall status of the package will remain Not Executed until the package is finished and all unexecuted test definitions are assigned a status on the Finish Run dialog. The overall status will be considered Passed when one or more test steps or test definitions in a package are Passed and all unexecuted steps and test definitions are resolved via the Finish Run dialog.
Priority: Priority of the execution package.
Keywords: All keywords that are assigned to the execution package.
Started At: When testing of the execution package began.
Finished At: When testing of the execution package ended.
Project: Name of the Test Manager project from which the execution package was derived.
Version: Product version from which the execution package was derived.

131

Build: Product build from which the execution package was derived.
Execution Path: File path where the execution definition resides.

Test Definitions
The Test Definitions tab includes all information related to the manual test definition that is selected above in the Inbox or Completed Runs tab. Multiple test definitions can be selected within the Test Definitions window using standard Windows keyboard shortcuts. To apply a status change to selected test definitions, right-click the selection and select a new status value from the context menu. The Test Definitions tab includes the following properties for each test definition:
#: Number that has been automatically generated for the test definition.
Name: Test definition name.
Status: Status of the test definition (available values include Not Executed, Passed, Failed, Unresolved, and Unsupported). This value can be changed by right-clicking the current value and selecting an alternative value from the context menu.
Last Status: Status that this test definition held before the current status.
Steps: Number of steps in the selected manual test definition.
Planned Time: Estimated time for completion of the test in [hh:mm:ss].
Used Time: This field tracks elapsed time (in [hh:mm:ss]) since the start of the test execution. This field can be manually edited (the timer will stop during editing). After editing this field the timer will continue tracking time from the manually adjusted time.
Test Definition Path: File path where this test definition resides in Test Manager's Test Plan Tree.

Attachments
The Attachments tab lists any attachments related to the selected manual test definition. This tab is also available on the Execute Test dialog. When you have selected a test definition in the Test Definitions window, you have the option of supplementing the list of displayed attachments by selecting an Include attachments of value. Select Test Container/Folders to include all attachments from the selected test definition's test container or folder. Or select Test Steps to include attachments from the test steps of the test definition. The Attachments tab includes the following properties for each attachment:
Name: Name of the attachment.
Type: Attachment file type.
Description: Description that has been created for the attachment (if any).
Source: File path where this attachment's test definition resides in Test Manager's Test Plan Tree.
Image Preview: If the attachment is an image, you can use the Image Preview controls to view the attachment. Right-click the image, or click the buttons to the right of the window, to access the following commands: Show Actual Size, Scale to Fit, and Scale to Fit Keep Aspect Ratio. Click Open as Detached Window to open Image Preview in a separate window.

132

Result Files
The Result Files tab lists any result files that are related to the selected manual test definition. This tab is also available on the Execute Test dialog. The Result Files tab includes the following properties for each result file:
Name: Name of the result file.
Source: File path where this result file's test definition resides in Test Manager's Test Plan Tree.
Add File: Click to browse to and select a new result file for upload to this test definition.
Paste Image: Click to paste an image from your computer's clipboard and attach the image to this test definition.
Remove: Click to remove the selected result file attachment from this test definition.
Image Preview: If the result file is an image, you can use the Image Preview controls to view the result file. Right-click the image, or use the buttons to the right of the window, to access the following commands: Show Actual Size, Scale to Fit, and Scale to Fit Keep Aspect Ratio. Click Open as Detached Window to open Image Preview in a separate window.

Issues
The Issues tab lists any issues related to the selected manual test definition. This tab is also available on the Execute Test dialog. The Issues tab includes the following properties of each issue:
Issue ID: ID that has been assigned to this issue.
Synopsis: Synopsis that has been written for this issue.
Status: Status of the issue.
External ID: Indicates if the issue is tracked by an external issue tracking system. If this issue is tracked by an external issue tracking system, and that issue has been assigned an ID, you can click the external ID number in this field to link directly to the issue in the external issue tracking system.
Created On: When the issue was created.
Created By: User who created the issue.

Outline
Shows the content tree of the selected execution package or the location of the selected test definition in the execution package.

Description
Shows the description of the selected execution package or test definition.

Status Bar
The status bar shows the current number and status of the execution packages and test definitions in the currently active view. The following button is available in the status bar:
Online/Offline: Click to switch the Manual Testing Client's mode from online to offline and back.

133

Related Concepts Manual Testing Client Test Definition Parameters Test Definitions in the Manual Testing Client Related Procedures Using the Manual Testing Client Editing Test Definitions Within the Manual Testing Client Adding an Internal Issue with the Manual Testing Client Related Reference Execute Test Dialog Box

134

Manual Testing Client


Test Manager's Manual Testing Client enables testers to manage their tests, edit test definitions, and track results without the need for an Internet connection. The Manual Testing Client offers the full manual-test execution functionality through an Eclipse-based client tool. The Manual Testing Client is the recommended tool for executing manual tests with Test Manager.
Note: Test Manager's Manual Testing Client supports running test executions with code analysis information. If a test execution has already been enabled to gather code analysis information within Test Manager, the settings are automatically available in the Manual Testing Client. Code analysis can also be enabled for execution definitions from within the Manual Testing Client.

Related Concepts Manual Test Definitions Test Definition Execution Execution Definitions Tour of the Manual Testing Client UI Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Enabling Code Analysis for SilkCentral Test Manager Using the Manual Testing Client Working with Manual Tests Executing Test Definitions Related Reference Current Run Page

135

SilkTest Tests
This section explains how to execute test definitions in SilkTest.
In This Section
SilkTest Logs
RMS log files are used to log data for each test case as test runs progress.
SilkTest Time-out Settings
Information about setting SilkTest time-out settings.
Automated Execution of Data-Driven SilkTest Testcases
Execution mode options for data-driven SilkTest testcases.
Automated Execution of SilkTest Test Definitions
Information about automatic execution of SilkTest tests.
Specifying Agent Under Test (AUT)
When a SilkTest agent cannot run on the same machine as the Test Manager execution server, the hostname and port should be specified.

136

SilkTest Logs
SilkTest's RMS log file is used to log data for each test case as test runs progress. Three types of data records are written to this file: status, memory, and user records. By monitoring this file, the RMS Remote Agent has a means of determining the progress of each test run. You can write your own comments into the user records of the log file by executing the PrintToRMSLog 4Test function.
Examples:
PrintToRMSLog ("Error", "An intended error")
PrintToRMSLog ("Info", "testcase sleep1 started")
PrintToRMSLog ("Warning", "TestCase 1 started a second time")
Definition of the user function in rms.inc: PrintToRMSLog (STRING sMessageType, STRING sUserMessage) writes to the log file in the following format:
U|{sTestCaseName}|{sScriptName}|{sArgStr}|{sUserMessage}|{sMessageType}
Related Concepts SilkTest Tests Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Executing Test Definitions Related Reference Execution Unit Interface

137

SilkTest Time-out Settings


If you have SilkTest test cases that require more than one hour to complete, you must adjust Test Manager's time-out settings. Otherwise, Test Manager assumes that something has gone wrong in the execution and terminates SilkTest. For details about setting the SilkTest time-out, see the SilkCentral Administration Module documentation.
Related Concepts SilkTest Tests Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Executing Test Definitions Related Reference Execution Unit Interface

138

Automated Execution of Data-Driven SilkTest Testcases


If the data driven check box is checked in the SilkTest test properties, each SilkTest test will be repeated once for each data row in the external data source. By default, plan-based execution mode is used for data-driven tests. This means that the results of all data rows will be listed under a single node in the result file (.res). If the execution mode is switched to script-based data driven in SccExecServerBootConf.xml, a result node will be created in the result file (.res) for each data row.
Related Concepts SilkTest Tests SilkTest Test Definitions Data-Driven Tests Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Running Automated Tests Creating Data-Driven Execution Definitions Executing Test Definitions Related Reference Execution Unit Interface

139

Automated Execution of SilkTest Test Definitions


All test definitions within an execution definition use the same SilkTest instance for tests. The SilkTest GUI is opened once with the first SilkTest test definition execution and is closed automatically after the last SilkTest test definition execution. Each SilkTest test definition execution produces its own results. If for any reason the SilkTest GUI closes during a test, it reopens automatically with the next SilkTest test definition execution.
Related Concepts SilkTest Tests Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Running Automated Tests Executing Test Definitions Related Reference Execution Unit Interface

140

Specifying Agent Under Test (AUT)


When a SilkTest agent cannot run on the same machine as the Test Manager execution server (for example, when tests are run on platforms other than Windows), the hostname and port may be specified by the SilkTest AUT Hostname setting in the Deployment tab of an execution definition. If the setting has not been defined, SilkTest default values (for example, from partner.ini) are used. The syntax for the AUT setting is hostname:port. The agent must be started manually prior to test execution and configured to listen at the specified port. By default, the TCP/IP protocol is used for communication between SilkTest instances and SilkTest agents. Ensure that both programs have been configured to use the same protocol.
Note: Be careful when you have multiple execution servers assigned to an execution definition, as SilkTest agents can only work with one SilkTest instance at a time.
Related Concepts SilkTest Tests Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Executing Test Definitions Related Reference Execution Unit Interface

141

Issue Management
The Issues unit helps you track the issues that are associated with the selected project. You can work with a detailed tabular listing of statistics (Document View) or a chart view (Issues View). Issues from both internally and externally configured issue databases are tracked. Statistics can be reported individually or cumulatively across tracking systems. Out-of-the-box support is offered for the following issue-tracking systems: SilkCentral Issue Manager, Borland StarTeam, and IBM Rational ClearQuest. See the SilkCentral Administration Module documentation for details on setting up these systems. Other external issue-tracking systems can be integrated through Test Manager's Java API and Web Service interface. Refer to the Test Manager API Help for details.
Note: Issues are also tracked at the test-definition and test-container level in the Test Plan unit. New issues can be entered and associated with test definitions in the Test Plan unit's Issues tab. New issues can also be entered from the Activities tab (Projects unit).

Document View
Test Manager's Document View offers an overview of the states of all project-related issues in the form of an issue-state statistics table. The Issues tree displays all issue-tracking systems and associated Test Manager products that have been configured for those systems. The internal tracking system that has been configured for Test Manager is called Internal.
Note: Products are configured through Test Manager's Administration unit (Administration/Configuration/Products). See the SilkCentral Administration Module Help for information on configuring products.
Current status statistics for the currently selected tree node are shown in the table on the right. The Date column shows the date/time of recent updates. Each row in the table shows the number of issues that have each column's respective status. For the project node and issue-tracking system nodes, statistics are accumulated values of the respective child nodes in the tree. Statistics are calculated by fetching information from the issue-tracking systems. This function is performed periodically by the SilkCentral application server. By default, this occurs once each hour. The interval can be customized in TMAppServerHomeConf.xml by setting the minutes value in IssueStateUpdate/UpdateInterval. The application server must be restarted to activate changes and initiate the countdown for the first run.
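The following fragment is only a rough sketch of what this setting might look like; the exact structure of TMAppServerHomeConf.xml is not reproduced here, and the element nesting is assumed from the IssueStateUpdate/UpdateInterval path named above. The value is given in minutes, so 60 corresponds to the default hourly update:

<IssueStateUpdate>
    <UpdateInterval>60</UpdateInterval>
</IssueStateUpdate>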

Issues View
Issues View provides historical information for issues in a chart view. The view reflects the status values that were retrieved from the tracking systems (both external and the internal systems) each day. If the product node of the Issues tree is selected, all statistics for all issue-tracking systems and all products will be retrieved. When a product is selected in the Issues tree, statistics for only that product are retrieved. When a specific issue-tracking system is selected, statistics for only that system are displayed. Issues View includes:

Calendar - Enables you to define the time frame across which statistics are to be calculated. Chart - Tracks issue status counts over the specified time frame.

142

Table - Shows the values reflected by the chart (for the past five days of the selected time frame only).
Note: The Issues tree displays only those external issue-tracking systems and products that have at least one issue assigned to them. The internal tracking system is always displayed. Related Concepts SilkCentral Issue Manager Test Definition Execution Upload Manager Related Procedures Managing Test Executions - Quick Start Task Tracking Issues Working with Issues Executing Test Definitions Related Reference Issues Unit Interface Execution Unit Interface Calendar Tool

143

Project Management
This section explains how to manage projects in Test Manager. The Projects unit offers a high-level test-manager's view of all projects in your Test Manager installation. The Projects unit enables you to move between projects, see high-level project status details, and view current execution statistics.
In This Section
Build Information
Build information files contain version and build information that is used for execution runs.
Build Information Updates
Whenever a new build becomes available for testing on an execution server, build information must be updated.

144

Build Information
Build information files, which contain version and build information used for execution runs, are typically stored and searched for on the execution server that is executing a run. When a build information file is not found there, the file is searched for on the application server. This behavior is beneficial when you have several execution servers and need to use a single build information file across all execution servers. You only need to maintain a single build information file on the application server. Similarly, when no execution server has been assigned to a test definition (and also for manual tests), build information files are searched for on the application server. Test Manager is able to match up test results with build information and display test results for specific build numbers.
Related Concepts Build Information Updates Successful Test Management Related Procedures Managing a Successful Test Related Reference Projects Unit Interface

145

Build Information Updates


Build information files must be created and configured manually. Whenever a new build becomes available for testing on an execution server, update build information to reflect the new build number. This can be done in one of two ways:

Manually, by editing the file(s) each time a new build is installed.
Automatically, if you are using an automated build update process to update the build information file (for example, through VB Script).

Related Concepts Build Information Successful Test Management Related Procedures Managing a Successful Test Related Reference Projects Unit Interface

146

Report Generation
This section explains how to generate and view SilkCentral Test Manager reports.
In This Section
New Report Creation
This section explains how to create new reports with SilkCentral Test Manager.
Context-Sensitive Reports
The Requirements, Test Plan, and Execution units offer dynamically-generated lists of reports that are specific to each unit.
Project Overview Report
The Project Overview Report contains a high-level overview of the status of the selected project.
Test Manager 8.0 Reports
Any reports created for a Test Manager 8.0 installation will appear in the Reports unit.
Requirements Reports
This section explains the requirements-related reports.
Test Plan Reports
This section explains the test-plan reports that ship with SilkCentral Test Manager.
Execution Reports
This section explains the execution reports that ship with SilkCentral Test Manager.
Code Coverage Reports
This section explains the code coverage reports that ship with SilkCentral Test Manager.
Performance Trend Reports
This section explains the performance trend reports that ship with SilkCentral Test Manager.
Issues Per Component Report
Test Manager offers one issues-related report.
Code-Change Impact Reports
Test Manager's code-change impact reports enable you to perform testing-impact analysis, effort analysis, and risk analysis.

147

New Report Creation


This section explains how to create new reports with SilkCentral Test Manager, download report templates, edit report parameters, and create new reports based on pre-installed templates. It also includes descriptions of all default report types that come pre-installed with Test Manager.
In This Section
New Reports
Creating new reports with Test Manager.
SQL Functions for Custom Reports
This table lists all available function placeholders.

148

New Reports
This topic explains how to create new reports with Test Manager, edit report parameters, and create new reports based on pre-installed templates. Test Manager offers reports that quickly and easily transform data into intuitive charts and graphs. Pre-installed reports are available for Test Manager's Requirements, Test Plan, and Issues units. Reports are created using either BIRT RCP Designer, an open-source, Eclipse-based report tool, or Microsoft Excel report templates. SilkCentral Test Manager is tightly integrated with BIRT RCP Designer to make it easy for you to generate reports on test management data. Test Manager's reporting functionality is highly customizable. Numerous pre-installed reports and report templates provide out-of-the-box options for a wide range of reporting needs. Simple GUI-based tools allow you to edit Test Manager's pre-installed reports and create reports of your own. For users with SQL knowledge, there is virtually no limit to how data can be queried and presented in custom reports.
Note: For information about editing report templates and creating custom report templates using BIRT RCP Designer and MS Excel, see the SilkCentral Administration Module Help.
Tip: If a blank report is generated, the cause may be that there is no data in the project you selected, or you may not be connected to the appropriate SilkCentral Test Manager database.
Tip: Reports are not available offline unless your SilkCentral Test Manager database is accessible locally.

Sample Report
Below is the code of a pre-installed report called 'All Requirements'. This report has not undergone editing using Test Manager's GUI-based tools or SQL. By default, this report displays all properties of all requirements in the selected project, except those requirements that have been identified as obsolete. Obsolete requirements are filtered out by the report's reqProp_Obsolete_0 parameter.

SELECT r.ReqID, r.ReqParentID, r.PositionNumber, r.ProjectID, r.ProjectName, r.ReqName,
       r.Risk, r.Priority, r.ReqDescription, r.ReqCreator, r.ReqCreated, r.ReqReviewed,
       r.ReqCoverageStatus, r.ReqRevision, r.MarkedAsObsolete, r.Obsolete, r.TreeOrder
FROM RTM_V_Requirements r
WHERE r.ReqID IN
      (SELECT DISTINCT ReqTreeNodeID_pk as id
       FROM TM_RequirementTreeNodes rtn WITH (NOLOCK)
       WHERE rtn.ProjectID_fk = 98
         AND rtn.MarkedForDeletion=${reqProp_Obsolete_0|0}
         AND ParentTreeNodeID_fk IS NOT NULL)

Downloading Report Templates


Test Manager report templates render report data into formats that meet your specific needs. Templates can take the form of MS Excel spreadsheets, BIRT RCP Designer templates, XML, or CSV files.

149

Enabling Links to Data in the Data Tab


You can access the requirements, test definitions, or execution definitions that you query for directly from the results list in the Data tab. To do so, your query must include the column ProjectID and the respective ID of the element that you want to link to:

RequID: Query for this column to enable a link to requirements on the Data tab of a report.
TestDefID: Query for this column to enable a link to test definitions on the Data tab of a report.
ExecDefID: Query for this column to enable a link to execution definitions on the Data tab of a report.
If the query's result includes both the ProjectID and either of RequID, TestDefID, or ExecDefID, using exactly these terms as column names, the Data tab will display the values in the element ID's column as a link. If you click such a link, Test Manager will switch to that element in the tree.
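For example, the following query is a minimal sketch that reuses the RTM_V_Requirements view and columns from the 'All Requirements' sample shown earlier in this section; the only hard requirement is that the columns are returned under the exact names ProjectID and RequID. With it, each requirement ID in the Data tab becomes a link to the corresponding requirement:

SELECT r.ProjectID, r.ReqID AS RequID, r.ReqName
FROM RTM_V_Requirements r
WHERE r.ProjectID = ${$PROJECTID}

The ${$PROJECTID} function placeholder used here is described in the SQL Functions for Custom Reports topic.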

Bookmarking Reports
The bookmark button in the lower-right corner of the workflow bar bookmarks the currently displayed report, including the parameters that you have set in the Parameters tab. You can send bookmark URLs to other Test Manager users, allowing them to view reports with a single click. The bookmark URL contains the parameters, prefixed with rp_. Date values are represented as the corresponding Long values in UTC in the URL.
Related Concepts Report Generation Related Procedures Analyzing Test Results - Quick Start Task Creating New Reports Customizing Reports with BIRT Generating Reports Managing Reports Related Reference Reports Unit Interface

150

SQL Functions for Custom Reports


To assist in writing advanced queries, placeholders are available for each function. Function placeholders are replaced with SQL code upon execution. Functions are used like parameters, but their names have a $ (dollar symbol) as a prefix. Unlike parameters, placeholders are defined report elements that cannot be customized per execution. The following table lists all available function placeholders:
$TODAY: Gives the current system date (on the database server). You can also write $TODAY-1 (for yesterday) or $TODAY-7 (for a week ago). Example: CreatedAt > ${$TODAY}
$DATE(column): Returns the date (does not include the time).
$DATE('string'): Converts the given string to a database date. Example: CreatedAt > ${$DATE('01/10/2005')}
$DAYS[p1;p2]: Calculates the difference in days between the two given parameters. The two parameters can be a column within the table/view or $TODAY. Example: ${$DAYS[CreatedAt;$TODAY]} > 7 (returns the rows created within the last week)
$WEEK(param): Returns the week-number of the given parameter, which can be $TODAY or a column.
$MONTH(param): Returns the month of the year as a number of the given parameter, which can be $TODAY or a column.
$YEAR(param): Returns the year as a number of the given parameter, which can be $TODAY or a column.
$USERID: The ID of the currently logged in user.
$USERNAME: The name of the currently logged in user.
$PROJECTID: The ID of the currently selected project.
$PROJECTNAME: The name of the currently selected project.
$REPORTNAME: The name of the currently selected report.
$REPORTID: The ID of the currently selected report.
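As a simple illustration of how these placeholders combine (a sketch only, assuming the RTM_V_Requirements view and its ReqCreated and ProjectID columns shown in the sample reports of this section), the following query returns the requirements of the currently selected project that were created within the last week:

SELECT r.ReqID, r.ReqName, r.ReqCreated
FROM RTM_V_Requirements r
WHERE r.ProjectID = ${$PROJECTID}
  AND r.ReqCreated > ${$TODAY-7}
ORDER BY r.ReqCreated DESC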

Sample Custom Report
Below is the code of the pre-installed Requirement with Child Requirements report. With this report, a selected requirement is shown with its requirement ID. Full details regarding the requirement's child requirements are displayed. Although not a custom report, this report is a helpful example because it makes use of the $PROJECTID function. It also includes two parameters, reqID (requirement ID) and reqProp_Obsolete_0 (show obsolete requirements).

SELECT r.ReqID, r.ReqCreated, r.ReqName, r.TreeOrder
FROM RTM_V_Requirements r
INNER JOIN TM_ReqTreePaths rtp ON (rtp.ReqNodeID_pk_fk = r.ReqID)
WHERE rtp.ParentNodeID_pk_fk=${reqID|22322|Requirement ID}
  AND r.ProjectID = ${$PROJECTID}
  AND r.MarkedAsObsolete=${reqProp_Obsolete_0|0|Show obsolete Requirements}
ORDER BY r.TreeOrder ASC

151

Related Concepts Report Generation Related Procedures Creating New Reports Creating Reports Generating Reports Managing Reports

152

Context-Sensitive Reports
The Requirements, Test Plan, and Execution units offer dynamically-generated lists of reports that are specific to each unit. Context-sensitive report lists are helpful because they offer report types that relate directly to your current activities.

Requirements unit: Context-sensitive report lists in the Requirements tree offer all reports that take requirement ID as an input parameter.
Test Plan unit: Context-sensitive report lists in the Test Plan tree offer all reports that take test-definition ID as an input parameter.
Execution unit: Context-sensitive report lists in the Execution Definition tree offer all reports that take execution-definition ID as an input parameter. On the execution-definition Runs tab, context-sensitive report lists offer all reports that have the following configuration:
Result category = Execution Definition
Selection criteria = Execution Definition Run
Property = ID

When you select a report from a context-sensitive report list, you are taken directly to that report's default tab in the Reports unit. This default destination-tab behavior can be configured using each report's Edit Report dialog box. There are two types of reports that appear in the context-sensitive report lists: reports that you have already accessed and reports that you have not yet accessed. Reports that you have accessed previously appear above a line separator in the menu. These reports are listed chronologically with the most recently viewed report at the top of the list. Other default reports that are available, but have not yet been accessed, appear beneath the line separator. In addition to the default-configured context-sensitive reports, you can configure new and existing reports to be included in each unit's context-sensitive report list. Context sensitivity is added to reports on a per-user, per-report basis only.
Related Concepts Report Generation Related Procedures Managing Reports Accessing Context-Sensitive Reports Enabling Context-Sensitive Reports

153

Project Overview Report


Test Manager's Project Overview Report contains a high-level overview of the status of the selected project. The Project Overview Report is accessed through the Projects unit (Overview tab).
Related Concepts Project Management Report Generation Related Procedures Analyzing Test Results - Quick Start Task Creating New Reports Customizing Reports with BIRT Generating Reports Managing Reports Related Reference Overview tab Reports Unit Interface

154

Test Manager 8.0 Reports


When updating from Test Manager 8.0 to the current version of Test Manager, the reports that existed in the Test Manager 8.0 installation will appear in the Reports unit in a folder called Reports < TM 8.2. Related Concepts Report Generation Related Procedures Analyzing Test Results - Quick Start Task Customizing Reports with BIRT Generating Reports Managing Reports Related Reference Reports Unit Interface

155

Requirements Reports
This section explains the requirements-related reports that ship with SilkCentral Test Manager. Requirements reports detail the status of functional requirements (for example, compatibility requirements, GUI requirements, feature requirements) that must be met during development. Requirements may also relate to product management objectives such as reliability, scalability, and performance. Test Manager's requirement-management reports help managers determine if adequate test coverage has been established to verify that system requirements are met during development. When a report references a requirement that includes HTML-formatted content, that content is rendered in the report. The following reports come pre-installed with Test Manager.
In This Section
Status Reports
Here are the status reports that are available for Test Manager's Requirements unit.
Progress Reports
Here are the progress reports that are available for Test Manager's Requirements unit.
Document Reports
Here are the document reports that are available for Test Manager's Requirements unit.
All Related Issues Report
Provides a detailed list of all issues related to the assigned test definitions for a requirement.

156

Status Reports
Here are the status reports that are available for Test Manager's Requirements unit.

Requirements Status Overview


Represents a grouped summary of all requirements by current requirement coverage. Coverage is expressed by the statuses Passed, Failed, Not Executed, and Not Covered.

Top-Level Requirement Coverage


Represents a listing of all top-level requirements. For each requirement the number of covered and not-covered (by test definitions) child requirements is displayed.

Status of Requirements with Priority Higher than 'X'


Represents a summary of all requirements by current requirement coverage. The returned group of requirements is restricted by the Priority parameter, which specifies the lowest requirement priority that is considered in the data. Related Concepts Requirements Management Report Generation Related Procedures Analyzing Test Results - Quick Start Task Creating New Reports Customizing Reports with BIRT Generating Reports Managing Reports Related Reference Reports Unit Interface

157

Progress Reports
Here are the progress reports that are available for Test Manager's Requirements unit.

Requirements Coverage Across Builds 'X' and 'Y'


Represents a trend in requirements coverage resulting from viewing requirements coverage in context with the builds. The user must specify a build range consisting of a start- and an end-build.

Requirements Coverage Over the Past 'X' Days


Represents a trend in requirements coverage by considering overall requirements coverage for a specific number of days 'X'.

Specific Requirements Coverage Over the Past 'X' Days


Represents a trend in requirements coverage by considering specific requirements for a specific number of days 'X'. Related Concepts Requirements Management Report Generation Related Procedures Analyzing Test Results - Quick Start Task Creating New Reports Customizing Reports with BIRT Generating Reports Managing Reports Related Reference Reports Unit Interface

158

Document Reports
Here are the document reports that are available for Test Manager's Requirements unit.

All Requirements
All requirements are represented with full requirement information.

All Requirements with History


All requirements are represented with full requirement information and history.

Requirement with Child Requirements


The selected requirement is shown with its requirement ID. Full details regarding the requirement's child requirements are displayed.
Related Concepts Requirements Management Report Generation Related Procedures Analyzing Test Results - Quick Start Task Creating New Reports Customizing Reports with BIRT Generating Reports Managing Reports Related Reference Reports Unit Interface

159

All Related Issues Report


Provides a detailed list of all issues related to the assigned test definitions for a requirement, and explains the relationship between requirements, the assigned test definitions, and issues that have occurred.

Input Parameters
The input parameter for an all related issues report is the identifier of the requirement.

Overview
The All Related Issues report is divided into the following sections:

General Report Information
Requirement Information
Related Issues

General Report Information


This section provides the following general information about the report:
Project Name: Name of the active project.
Report Description: Description of the report.
Report Executed By: User who executed the report.

Requirement Information
This section provides the following information about the requirement:
ID: Identifier of the requirement.
Name: Name of the requirement.
Description: Description of the requirement.
Test Coverage: Status of all test definitions that have been assigned to the requirement.
Nr. of Issues: Amount of issues related to the requirement or sub-requirements of the requirement.

Related Issues
This table shows all issues related to the requirement or sub-requirements of the requirement. The detailed information provided for each issue is:
ID: Identifier of the issue. If an identifier is provided by the issue tracking system, this external identifier is used. The identifier is clickable if an external link is defined for the issue.
Synopsis: Meaningful short-description of the issue.
Status: Current status of the issue. If the status is provided by the issue tracking system, this external status is used.
Assigned by: Person who assigned the issue to the test definition.
Test ID: Identifier of the test definition in which the issue was discovered.

160

Test Definition: Name of the test definition in which the issue was discovered.
Related Concepts Run Comparison Reports Execution Reports Test Definition Run Comparison Report Related Procedures Generating Reports

161

Test Plan Reports


This section explains the test-plan reports that ship with SilkCentral Test Manager. Test Plan reports give you an overview of the progress of your test definitions and the status of defects over a period of time or over a range of builds. The following reports come pre-installed with Test Manager.
In This Section
Status Reports
Here are the status reports that are available for Test Manager's Test Plan unit.
Progress Reports
Here are the progress reports that are available for Test Manager's Test Plan unit.
Manual Test Reports
Manual-test reports that are available for Test Manager's Test Plan unit.

162

Status Reports
Here are the status reports that are available for Test Manager's Test Plan unit.

Test Definition Status Overview


Represents a status overview of all test definitions, structured by the statuses Passed, Failed, Not Executed, and Not Scheduled.

Test Definition Status Overview (per test container)


Represents a status overview of all test definitions contained in a specific test container, structured by the statuses Passed, Failed, Not Executed, and Not Scheduled.

Test Definitions per Component


Represents an overview of coverage of components by test definition; makes it easier to see where testing activity is needed.

Passed Test Definitions (per test container)


Represents a success rate for each test container by listing the number of passed test definitions.

Implemented Test Definitions (per component)


Represents an overview of coverage of components by test definitions that have the Implemented attribute set to Yes.

Failed Test Definitions (per component)


Represents an overview of failed test definitions per component; makes it easier to identify the most critical components in the environment. Related Concepts Test Plan Management Report Generation Related Procedures Analyzing Test Results - Quick Start Task Creating New Reports Customizing Reports with BIRT Generating Reports Managing Reports Related Reference Reports Unit Interface

163

Progress Reports
Here are the progress reports that are available for Test Manager's Test Plan unit.

Test Definition Progress Across Builds 'X' and 'Y'


Represents a trend in test definition progress resulting from viewing test definition statuses in context with builds. The user must specify a build range consisting of a start- and an end-build.

Test Definition Progress this Month


Represents a trend in test definition progress resulting from viewing test definition statuses for the current month.

Specific Test Plan Node Progress Over the Past 'X' Days
Represents a trend in test definition progress by considering a specific test plan node over the past 'X' number of days.

Test Definitions Created in the Past 'X' Days (per component)


Represents a listing of new test definitions over the past 'X' number of days per component. Assists in identifying components that lack testing activity.

Test Definition Progress Over the Past 'X' Days


Represents a trend in test definition progress by considering test definition statuses overall for the last 'X' number of days.

Percentage Testing Success Over the Past 'X' Days (per component)
Represents a percentage listing of successful test definitions over the last 'X' number of days per component; assists in identifying the components in the environment that are most critical. Related Concepts Test Plan Management Report Generation Related Procedures Analyzing Test Results - Quick Start Task Creating New Reports Customizing Reports with BIRT Generating Reports Managing Reports Related Reference Reports Unit Interface

164

Manual Test Reports


The following manual-test reports are available for Test Manager's Test Plan unit:

Planned vs. Actual Execution Time of Manual Tests (Summary)


Represents an overview of the deviation between planned and actual time for execution of manual tests, viewed on a daily basis.

Historic Planned vs. Actual Execution Time (per user)


Represents an overview of planned and actually required execution time for completed manual test definitions per user over a specific period of time.

Planned vs. Actual Execution Time (status per user)


Represents progress in terms of planned vs. actual hours of currently pending manual test definitions per user. Manual test definitions are only considered if test results have been entered by the user and are assigned to the user who enters the results.

Manual Test Definition Result Document


An easily printable manual test case result report for a single test definition.

Manual Test Definition Document


An easily printable manual test case report for a single test definition. Related Concepts Manual Test Definitions Report Generation Related Procedures Analyzing Test Results - Quick Start Task Creating New Reports Customizing Reports with BIRT Generating Reports Managing Reports Related Reference Reports Unit Interface

165

Execution Reports
This section explains the execution reports that ship with SilkCentral Test Manager. To ease the assessment of results, execution reports give you a detailed overview of the progress of your test executions and the status of defects, over a period of time, or over a range of builds. The following reports come pre-installed with Test Manager.
In This Section
Run Comparison Reports
Describes the run-comparison reports that are available in Test Manager.
Execution Definition Run Comparison Reports
Compare two runs of an execution definition.
Test Definition Run Comparison Report
Compares two runs of a test definition.
Execution Definition Run Errors Report
Provides a detailed list of all test definitions that did not pass an execution definition run and the reason they did not pass.

166

Run Comparison Reports


The following run-comparison reports are available in Test Manager:

Execution-definition run-comparison reports
Test-definition run-comparison reports


Note: These reports are not suitable for the comparison of manual tests to automated tests. When the name of the report includes (Last Two Runs), you can compare only the last two runs of the execution definition or test definition.

Execution Definition Run Comparison Reports


Execution-definition run-comparison reports present an overview of the comparison between two runs of an execution definition.

Test Definition Run Comparison Reports


Test-definition run-comparison reports present an overview of the comparison between two runs of a test definition. Related Concepts Execution Reports Execution Definition Run Comparison Reports Test Definition Run Comparison Report Related Procedures Generating Reports

167

Execution Definition Run Comparison Reports


The following types of reports compare two runs of an execution definition:
Execution Definition Run Comparison Report: The default execution-definition run-comparison report that compares two runs of the execution definition.
Execution Definition Run Comparison Report Failed in Newer Run: Compares only the failed tests of two execution-definition runs.
Execution Definition Run Comparison Report Changed Status: Compares only those tests of two execution-definition runs that changed their statuses.

Overview
The execution-definition run-comparison report provides the following details:

Changes to the status of the execution definition
Number of errors
Number of warnings
Context in which the execution definition was executed
Execution duration of the assigned tests
Note: When the status of an assigned test changes to Failed between compared runs, the test is marked red. When the status of an assigned test changes to Passed between compared runs, the test is marked green.
The execution-definition run-comparison report includes the following sections:
General Report Information
Execution Definition Information
Execution Definition Run Comparison
Test Definition Run Comparison

General Report Information


This section provides the following general information about the report:
Project Name: Name of the active project.
Report Description: Description of the report.
Report Executed By: User who executed the report.

Execution Definition Information


This section provides the following information about the execution definition:
ID: Identifier of the execution definition.
Name: Name of the execution definition.

168

Description: Description of the execution definition.
Product: Name of the product specified for the run.

Execution Definition Run Comparison


This section identifies the following differences between the two runs:
Execution Timestamp: Execution time of each run.
Duration: Duration of each run.
Run ID: ID of each execution definition run.
Version: Version of the product specified for the run.
Build: Build of the product specified for the run.
Status: Status of each execution-definition run.

Test Definition Run Comparison


This section provides the following details about the test definitions assigned to each execution definition run:
ID: ID of each test definition.
Name: Name of each test definition.
Status: Status of each test definition in each execution-definition run.
Duration: Duration of each test definition in each execution-definition run.
Errors: Number of errors of each test definition in each execution-definition run.
Warnings: Number of warnings of each test definition in each execution-definition run.

Related Concepts Run Comparison Reports Execution Reports Test Definition Run Comparison Report Related Procedures Generating Reports

169

Test Definition Run Comparison Report


The test-definition run-comparison report compares two runs of a test definition. This report provides the following information:

Changes to the status of the test definition
Number of errors
Number of warnings
Context in which the test definition was executed
Execution duration of the assigned tests
Attributes and properties of the test definition
Parameters of the test definition
Success conditions for the test definition
The test-definition run-comparison report is divided into the following sections.

General Report Information


This section provides the following general information about the report:
Project Name: Name of the active project.
Report Description: Description of the report.
Report Executed By: User who executed the report.

Test Definition Information


This section provides the following information about the test definition:
ID: Identifier of the test definition.
Name: Name of the test definition.
Description: Description of the test definition.

Execution Information
This section provides the following information about each execution:
Execution Definition ID: ID of each execution definition.
Execution Definition Name: Name of each execution definition.
Run ID: ID of each execution definition run.
Product: Name of the product.
Version: Version of the product.
Build: Build of the product.

Test Definition Run Comparison


This section identifies the following differences between the two runs:
Status: Status of each run.
Execution Timestamp: Timestamp of each run.
Duration: Duration of each run.
Errors: Number of errors in each test definition run.
Warnings: Number of warnings in each test definition run.
Previous Status: Status of each run previous to the last manual change.
Changed by: User who performed the last manual change to the status.
Change Comment: Describes the reason of the manual status change.

Attributes and Properties


This section identifies the attributes and properties of the two runs of the test definition at execution time.

Parameters
This section lists the parameters of the two runs of the test definition at execution time.

Success Conditions
This section lists the conditions at execution time for each of the two runs to be considered successful. If a condition is not satisfied, the test definition run is considered unsuccessful. Satisfied conditions are marked green, while unsatisfied conditions are marked red. Related Concepts Run Comparison Reports Execution Reports Execution Definition Run Comparison Reports Related Procedures Generating Reports

171

Execution Definition Run Errors Report


Provides a detailed list of all test definitions that did not pass an execution-definition run and the reason they did not pass. All errors that occurred during the execution-definition run are listed in this report. The user can quickly assess results and easily identify any unwanted effects in the execution-definition run.

Input Parameters
The input parameter for an Execution Definition Run Errors report is the identifier of the execution-definition run.

General Report Information


This section provides the following general information about the report:
Project Name: Name of the active project.
Report Description: Description of the report.
Report Executed By: User who executed the report.

Execution Definition Information


This section provides the following information about the execution definition:
Execution Definition ID: Identifier of the execution definition.
Execution Definition Name: Name of the execution definition.
Run ID: Identifier of the execution-definition run.
Product: Name of the product specified for the execution-definition run.
Version: Version of the product specified for the execution-definition run.
Build: Build of the product specified for the execution-definition run.
Execution Server: Execution server where the execution definition was run.
Keywords: Keywords assigned to the execution-definition run.
Execution Timestamp: Time and date of the execution-definition run.
Duration: Duration of the execution-definition run.
Status: Status of all test definitions assigned to the execution definition.

Test Definition Runs


This section provides the following information about each test definition run that did not pass:
ID: Identifier of the test definition.
Name: Name of the test definition.
Duration: Duration of the test definition run.
Errors: Amount of errors that occurred during the test definition run.
Warnings: Amount of warnings that occurred during the test definition run.
Messages: If there are messages available, the content of the Messages tab of the Test Definition Run Dialog is shown here.

172

Related Concepts Execution Reports Related Procedures Generating Reports

173

Code Coverage Reports


This section explains the code coverage reports that ship with SilkCentral Test Manager. Code coverage reports offer a detailed overview of your product's code coverage over a period of time or range of builds. The following reports come pre-installed with Test Manager.
In This Section
Code Coverage Trend Report
Shows the improvement trend of code coverage for methods, classes, and packages for a product over a selected range of builds.
Method Coverage Comparison Report
Compares method coverage for all included packages across two product builds.

174

Code Coverage Trend Report


Shows the improvement trend of code coverage for methods, classes, and packages for a product over a selected range of builds.

Input Parameters
The input parameters for a code coverage trend report are:
product_ProductVersion: Version of the selected product.
BuildFrom: First build in the range of examined builds.
BuildTo: Last build in the range of examined builds.

General Report Information


Report Description: Description of the report.
Report Executed By: User who executed the report.
Product Information: Name, version, and examined build range of the selected product.

Code Coverage Trend Graph


Shows the overall percentage of code coverage for the selected product over the selected range of builds. Code coverage for specific packages, classes, and methods is displayed individually.

Code Coverage Trend Details


Displays the information from the Code Coverage Trend Graph in a tabular format.
Related Concepts
Code Coverage Reports
Related Procedures
Generating Reports


Method Coverage Comparison Report


Compares method coverage for all included packages across two product builds.

Input Parameters
The input parameters for a method coverage comparison report are:
Build 1: Number of the first build that is to be compared.
Build 2: Number of the second build that is to be compared.
Product: The examined product.
Threshold: The minimum amount of change that results in a package appearing in the report. Packages with a smaller percentage of change are not shown in the report. The threshold range is from 0 to 100 percent. For example, with a threshold of 5, only packages whose method coverage changed by at least 5 percentage points between the two builds are listed.

General Report Information


Project Name: Name of the project.
Report Description: Description of the report.
Report Executed By: User who executed the report.
Product: Name of the selected product.
Build 1: Number of the first build to be compared.
Build 2: Number of the second build to be compared.

Method Coverage Information


The method coverage table shows the following information for all packages whose change in method-coverage percentage exceeds the threshold:
Package Name: Name of the package.
Statements: Number of statements that are included in the package.
% Method Coverage: Percentage of method coverage in the second build.
% Difference: Difference in the code coverage percentage from the first build to the second build. The difference is negative when code coverage drops.

Related Concepts Code Coverage Reports Related Procedures Generating Reports


Performance Trend Reports


This section explains the performance trend reports that ship with SilkCentral Test Manager. Performance trend reports show the evolution of the application under test's performance over a specified period of time. The input data for the performance reports is provided by SilkPerformer load tests. The following reports come pre-installed with Test Manager.
In This Section
Average Page-Time Trend Report
Shows the page times per page for all tests executed for the specified test definition within the specified time range.
Average Transaction Busy-Time Trend Report
Shows the transaction busy time per transaction for all tests executed for the specified test definition within the specified time range.
Custom Measure Trend Report
Shows the average, minimum, and maximum values of the defined measure or measures for all tests executed for the specified test definition within the specified time range.
Overall Page-Time Trend Report
Shows overall page times, aggregated over all user types, for all tests executed for the specified test definition within the specified time range.
Overall Transaction Busy-Time Trend Report
Shows overall transaction busy-time, aggregated over all user types, for all tests executed for the specified test definition within the specified time range.


Average Page-Time Trend Report


Shows the page times per page for all tests executed for the specified test definition within the specified time range. The performance trend of the page times for the tested pages is shown in a graph.

Input Parameters
The input parameters for an Average Page-Time Trend report are:
Date From (DD-MON-YYYY): Starting date for the time range. For example: 06-DEC-2008.
Date To (DD-MON-YYYY): End date for the time range. For example: 16-JAN-2009.
Exclude Runs with more than <nnn> Errors: Runs that generate more errors than specified here are not included in the report. Use this setting to prevent outliers from skewing the trend curve.
Maximum Value for y-Axis: Limits the y-axis of the graph to the specified value. Page times that exceed this value are cut off at the top. This setting is useful to prevent the flattening of lines caused by outliers.
Measure Filter: Limits the shown measures to those whose names include the specified string. This field is required. To display all available measures, set the measure filter to "%". For example, to show only measures that include the word "unit" anywhere in their names, set the measure filter to "%unit%".
Test Definition ID: Identifier of the test definition for which you want to view the report.

General Report Information


Lists overview information like the name of the current project, the report description, and the user who executed the report.

Test Definition Information


Lists general information about the test definition.

SilkPerformer Project Information


Lists general information about the SilkPerformer project that is used to perform the load test.

Page Time Trend Information


The trend charts show the page time trend over the selected time range for all filtered measures. The minimum, maximum, and average page time curves are shown in each chart. The displayed values in each chart are cut at the selected maximum y-axis value.
Related Concepts
Performance Trend Reports
Average Transaction Busy-Time Trend Report
Overall Page-Time Trend Report
Overall Transaction Busy-Time Trend Report
Custom Measure Trend Report


Average Transaction Busy-Time Trend Report


Shows the transaction busy time per transaction for all tests executed for the specified test definition within the specified time range. The performance trends of the transaction busy-times for the tested transactions are displayed in trend charts.

Input Parameters
The input parameters for an Average Transaction Busy-Time Trend report are:
Date From (DD-MON-YYYY): Starting date for the time range. For example: 06-DEC-2008.
Date To (DD-MON-YYYY): End date for the time range. For example: 16-JAN-2009.
Exclude Runs with more than <nnn> Errors: Runs that generate more errors than specified here are not included in the report. Use this setting to prevent outliers from skewing the trend curve.
Maximum Value for y-Axis: Limits the y-axis of the graph to the specified value. Transaction busy-times that exceed this value are cut off at the top. This setting is useful to prevent the flattening of lines caused by outliers.
Test Definition ID: Identifier of the test definition for which you want to view the report.
Transaction Filter: Limits the shown transactions to those whose names include the specified string. This field is required. To display all available transactions, set the transaction filter to "%". For example, to show only transactions that include the word "unit" anywhere in their names, set the transaction filter to "%unit%".

General Report Information


Lists overview information like the name of the current project, the report description, and the user who executed the report.

Test Definition Information


Lists general information about the test definition.

SilkPerformer Project Information


Lists general information about the SilkPerformer project that is used to perform the load test.

Transaction Busy-Time Trend Information


The trend charts show the transaction busy-time trend over the selected time range for all filtered transactions. The minimum, maximum, and average transaction busy-time curves are shown in each chart. The displayed values in each chart are cut at the selected maximum y-axis value.


Related Concepts Performance Trend Reports Average Page-Time Trend Report Overall Page-Time Trend Report Overall Transaction Busy-Time Trend Report Custom Measure Trend Report


Custom Measure Trend Report


Shows the average, minimum, and maximum values of the defined measure or measures for all tests executed for the specified test definition within the specified time range. The performance trend of the values for each tested measure is shown in a graph.

Input Parameters
The input parameters for a Custom Measure Trend report are:
Date From (DD-MON-YYYY): Starting date for the time range. For example: 06-DEC-2008.
Date To (DD-MON-YYYY): End date for the time range. For example: 16-JAN-2009.
Exclude Runs with more than <nnn> Errors: Runs that generate more errors than specified here are not included in the report. Use this setting to prevent outliers from skewing the trend curve.
Maximum Value for y-Axis: Limits the y-axis of the graph to the specified value. Measures that exceed this value are cut off at the top. This setting is useful to prevent the flattening of lines caused by outliers.
Measure Name: Name of the custom measure for which you want to view the report. For example: CreateTestDefinition.
Measure Type: Type of the custom measure. For example: Transaction (BusyTime)[s].
Test Definition ID: Identifier of the test definition for which you want to view the report.

General Report Information


Lists overview information like the name of the current project, the report description, and the user who executed the report.

Test Definition Information


Lists general information about the test definition.

SilkPerformer Project Information


Lists general information about the SilkPerformer project that is used to perform the load test.

Custom Measure Trend Information


The trend chart shows the performance trend over the selected time range for the selected measure. The minimum, maximum, and average measure curves are shown in the chart. The displayed values in the chart are cut at the selected maximum y-axis value.


Related Concepts Performance Trend Reports Average Page-Time Trend Report Overall Transaction Busy-Time Trend Report Overall Page-Time Trend Report Related Reference Average Transaction Busy-Time Trend Report


Overall Page-Time Trend Report


Shows overall page times, aggregated over all user types, for all tests executed for the specified test definition within the specified time range. The performance trend of the page times for the tested page is shown in a graph.

Input Parameters
The input parameters for an Overall Page-Time Trend report are:
Date From (DD-MON-YYYY): Starting date for the time range. For example: 06-DEC-2008.
Date To (DD-MON-YYYY): End date for the time range. For example: 16-JAN-2009.
Exclude Runs with more than <nnn> Errors: Runs that generate more errors than specified here are not included in the report. Use this setting to prevent outliers from skewing the trend curve.
Maximum Value for y-Axis: Limits the y-axis of the graph to the specified value. Page times that exceed this value are cut off at the top. This setting is useful to prevent the flattening of lines caused by outliers.
Test Definition ID: Identifier of the test definition for which you want to view the report.

General Report Information


Lists overview information like the name of the current project, the report description, and the user who executed the report.

Test Definition Information


Lists general information about the test definition.

SilkPerformer Project Information


Lists general information about the SilkPerformer project that is used to perform the load test.

Overall Page-Time Trend Information


The trend chart shows the overall page-time trend over the selected time range for all pages. The minimum, maximum, and average overall page-time curves are shown in the chart. The displayed values in the chart are cut at the selected maximum y-axis value.
Related Concepts
Performance Trend Reports
Average Page-Time Trend Report
Overall Transaction Busy-Time Trend Report
Custom Measure Trend Report
Related Reference
Average Transaction Busy-Time Trend Report


Overall Transaction Busy-Time Trend Report


Shows overall transaction busy-time, aggregated over all user types, for all tests executed for the specified test definition within the specified time range. The performance trend of the transaction busy-times for the tested transaction is displayed in a trend chart.

Input Parameters
The input parameters for an Overall Transaction Busy-Time Trend report are:
Date From (DD-MON-YYYY): Starting date for the time range. For example: 06-DEC-2008.
Date To (DD-MON-YYYY): End date for the time range. For example: 16-JAN-2009.
Exclude Runs with more than <nnn> Errors: Runs that generate more errors than specified here are not included in the report. Use this setting to prevent outliers from skewing the trend curve.
Maximum Value for y-Axis: Limits the y-axis of the graph to the specified value. Transaction busy-times that exceed this value are cut off at the top. This setting is useful to prevent the flattening of lines caused by outliers.
Test Definition ID: Identifier of the test definition for which you want to view the report.

General Report Information


Lists overview information like the name of the current project, the report description, and the user who executed the report.

Test Definition Information


Lists general information about the test definition.

SilkPerformer Project Information


Lists general information about the SilkPerformer project that is used to perform the load test.

Overall Transaction Busy-Time Trend Information


The trend chart shows the overall transaction busy-time trend over the selected time range for all transactions. The minimum, maximum, and average transaction busy-time curves are shown in the chart. The displayed values in the chart are cut at the selected maximum y-axis value.
Related Concepts
Performance Trend Reports
Average Page-Time Trend Report
Overall Page-Time Trend Report
Custom Measure Trend Report
Related Reference
Average Transaction Busy-Time Trend Report


Issues Per Component Report


Test Manager offers one issues-related report.
Issues Per Component
This report offers an overview of all issues related to each component. In addition to offering basic issue tracking, this report assists in monitoring the overall issue trend for each component.
Related Concepts
Issue Management
Report Generation
Related Procedures
Analyzing Test Results - Quick Start Task
Creating New Reports
Customizing Reports with BIRT
Generating Reports
Managing Reports
Related Reference
Reports Unit Interface


Code-Change Impact Reports


Test Manager's code-change impact reports enable you to perform testing-impact analysis, effort analysis, and risk analysis. With Test Manager you can select classes of interest and, by applying report templates, generate reports that help you determine the test impact of changing the selected classes. For the selected classes, you can choose from report templates that analyze the test impact of proposed code changes.

Code-Change Impact Report for Test Definitions


This report provides the following information (in columns) per affected test definition:

Unique key: Test definition + Execution definition
Project name
Test name
Test plan hierarchy
Execution definition
Test type
Duration of test
Status of test (passed, failed, not executed), cumulative across all runs of the build range
Last build executed
# Times executed for this version
# Times passed for this version
# Times failed for this version
Coverage index: Methods covered by the test for the specified classes / total methods of the specified classes
Time stamp
Test creator
Test executor (manual tester or execution server)

Code-Change Impact Report for Execution Definitions


This report is valuable as it identifies the execution definitions that need to be re-run following code changes. This report provides the following information (in columns) per affected execution definition:

Project name
Execution definition name
# Manual tests
# Automated tests
# Manual tests in coverage path
# Automated tests in coverage path
Duration of manual tests
Duration of automated tests
Duration of manual tests in coverage path
Duration of automated tests in coverage path

Use Cases for Reports


Following are some typical code-change impact issues that can be addressed with Test Manager's reports:
Testing impact analysis: You want to know which tests you should run as a result of a specific change to the code.
1 Select a particular class.
2 Select and execute the Code Change Impact - Test Definitions report.
3 Observe the list of tests that cover the classes that were touched in this version.
Effort analysis: You want to know how many hours of automated and manual testing will be required to properly cover a particular set of changes to the code.
1 Select a particular class.
2 Select and execute the Code Change Impact - Execution Definitions report.
3 Observe the required time (cost) for automated and manual tests.
Related Concepts Code Coverage Analysis Report Generation Related Procedures Analyzing Test Results - Quick Start Task Creating New Reports Customizing Reports with BIRT Generating Reports Managing Reports Related Reference Reports Unit Interface


Code Coverage Analysis


This section explains how to analyze code coverage with SilkCentral Test Manager. SilkCentral Test Manager's Code Analysis unit offers code-coverage data for Java applications under test (AUTs), packages, classes, methods, and statements, enabling you to perform test-impact analysis, which determines the tests that should be run in response to specific code changes, and effort analysis, which determines how many hours of automated testing and manual testing are required to adequately cover specific code changes. Each of these code-analysis tasks can be addressed by running pre-configured reports.
In This Section
Test Manager Code Analysis
Test Manager's innovative approach to code coverage draws on the relationship between specific tests and the code they test.
Enabling Code Analysis for SilkCentral Test Manager
Known limitations and prerequisites that must be met to enable code analysis with Test Manager.
Latest Builds and Build Versions
When you select a product in the navigation tree, the list of packages and classes with coverage information for the latest covered version is displayed automatically.
Results Compilation
Once an execution definition's test executions are complete, you can view its results.
Code Analysis and the Manual Testing Client
Test Manager's Manual Testing Client supports running test executions with code analysis information.


Test Manager Code Analysis


The goal of conventional code coverage is to deliver information about what code is covered by tests. This approach is typically used to gain code-coverage information for unit tests. Test Manager's code-analysis functionality goes well beyond this, delivering data for unit, functional, and load tests (both automated and manual) in managed environments. Code coverage measurements are utilized to track test progress and guide test planning. Test Manager's innovative approach to code coverage draws on the relationship between specific tests and the code they test. This approach enables you to perform impact/dependency analysis of code changes from the testing perspective. By helping you to identify the test runs that are most relevant to a specific code change, it also enables you to better optimize your testing. Test Manager's code-coverage functionality is provided in the Code Analysis unit (which is accessible by clicking Code Analysis on the workflow bar). The Code Analysis unit features a navigation tree that lists all products that have been created for the selected project. You can drill down into products to select specific versions, and at the deepest level, specific builds.
Related Concepts
Code Coverage Analysis
Related Procedures
Analyzing Test Results - Quick Start Task
Analyzing Code Coverage
Related Reference
Code Analysis Unit Interface


Enabling Code Analysis for SilkCentral Test Manager


These are the prerequisites that must be met and the known limitations that apply when enabling code analysis with Test Manager.

Prerequisites and Known Limitations


Following are the prerequisites that must be met and the known limitations that apply when enabling code analysis with SilkCentral Test Manager.
White spaces are prohibited: Do not add or remove whitespace when modifying batch files or command-line syntax; otherwise Java errors occur. For example, unnecessary whitespace within the command line results in an error on initiation as follows:
Exception in thread "main" java.lang.NoClassDefFoundError: .. Directory not empty
You must start the application under test before you start SilkCentral Test Manager: The application under test must be started before code coverage execution is kicked off. The AUT must be started independently of SilkCentral Test Manager so that code coverage can hook into it.
Java prerequisites: To find out which classes and methods of an application under test (AUT) have been invoked within jar files, the file sctmcc.dll must be located in the directory from which the AUT will be executed. This file can be downloaded directly from your Test Manager GUI by selecting Help > Tools > Code Analysis Instrumentation Library. Test Manager's code analysis works with Java versions 1.4, 1.5, and 1.6. However, there is a difference in how the DLL is loaded when starting up the virtual machine (VM). When starting the application under test from the command line, the following arguments must be passed to the VM in order to load sctmcc.dll:
Java 1.4: -Xrunsctmcc:port=19129,<options>
Java 1.5 and 1.6: -agentlib:sctmcc=<options>
See the following sections of this topic for more information about finding out which classes and methods of an application have been invoked within jar files.

Java Code Analysis Options


The following options can be used for all supported Java versions:

port=19129: Port of the code coverage service.
coveragetype="line": Possible values are "line" or "method". "line" must be used for getting code coverage information with Test Manager.
coveragepath={"library1.jar";"library2.jar"}: Jar libraries to monitor for code coverage information.
name="ServerName": Name of the monitored application.


Java Code Analysis Examples


Java 1.5 and 1.6:

"<java_home_directory>\bin\java" agentlib:sctmcc=port=19129,coveragetype="line",coveragepath={"C:\dev\deploy\lib \library1.jar";"C:\dev\deploy\lib\library2.jar"},name="myAUTserver" com.caucho.server.http.HttpServer

Java 1.4:

C:\Java\j2sdk1.4.2_06\bin\java.exe -Xrunsctmcc:port=19129,coveragetype="line",coveragepath={"C:\Program Files\Borland\SilkTest\JavaEx\JFC\swingall.jar";"C:\Program Files\Borland\SilkTest\JavaEx\JFC\Swing11TestApp.jar"},name="Test Application" -Dsilktest.tafont=arialuni.ttf -cp .;%FontDir% ta

You must configure Test Manager to gather code coverage data from an application under test. This can be done for any number of execution definitions listed on Test Manager's Deployment tab, in the Execution unit.
Related Concepts
Code Coverage Analysis
Related Procedures
Analyzing Test Results - Quick Start Task
Analyzing Code Coverage
Related Reference
Code Analysis Unit Interface


Latest Builds and Build Versions


When you select a product in the navigation tree, the list of packages and classes with coverage information for the latest covered version (which implies the latest covered build for the version) is displayed automatically. When you select a product version in the navigation tree, coverage information for the latest covered build of the version is displayed automatically.
Note: Code analysis across a range of builds is not currently supported by Test Manager.
Related Concepts
Code Coverage Analysis
Related Procedures
Analyzing Test Results - Quick Start Task
Analyzing Code Coverage
Related Reference
Code Analysis Unit Interface


Results Compilation
Once an execution definition's test executions are complete, you can view its results. You will notice that there is a new result file for the execution definition called FullCoverageInfo.xml and an additional CodeCoverageInfo.xml file for each test definition result. Test Manager uses these result files to aggregate and calculate all code analysis data.
Note: Aggregated data is not immediately available and calculations may take time to compile.
Related Concepts
Code Coverage Analysis
Related Procedures
Analyzing Test Results - Quick Start Task
Analyzing Code Coverage
Related Reference
Code Analysis Unit Interface


Code Analysis and the Manual Testing Client


Test Manager's Manual Testing Client supports running test executions with code analysis information. If a test execution has already been enabled to gather code analysis information within Test Manager, the settings are automatically available in the Manual Testing Client. If you want to enable code analysis for an execution definition from within the Manual Testing Client, however, refer to the related procedure below.
Related Concepts
Code Coverage Analysis
Related Procedures
Analyzing Test Results - Quick Start Task
Analyzing Code Coverage
Enabling Code Analysis Within the Manual Testing Client
Related Reference
Code Analysis Unit Interface


Procedures
This section explains all of the procedures associated with using Test Manager.
In This Section
Quick Start Tasks
Quick Start Tasks are high-level overviews of the main tasks that you will likely need to perform with SilkCentral Test Manager.
Managing a Successful Test
This section explains all of the procedures that you need to know to manage tests with Test Manager.


Quick Start Tasks


Quick Start Tasks are high-level overviews of the main tasks that you will likely need to perform with SilkCentral Test Manager. These procedures can serve as tutorials in guiding you step-by-step through best practice use of Test Manager's core functionality.
In This Section
Analyzing Test Results - Quick Start Task
How to analyze test results in SilkCentral Test Manager.
Configuring Projects - Quick Start Task
How to configure projects in SilkCentral Test Manager.
Managing Requirements - Quick Start Task
How to manage requirements in SilkCentral Test Manager.
Managing Test Executions - Quick Start Task
How to manage test executions in SilkCentral Test Manager.
Managing Test Plans - Quick Start Task
How to manage test plans in SilkCentral Test Manager.


Analyzing Test Results - Quick Start Task


This quick start task details the tasks that are involved in analyzing test results in SilkCentral Test Manager.

To analyze test results:


1 Create a new report. Creating New Reports
2 Edit your report's properties. Editing Report Properties
3 Edit your report's parameters. Editing Report Parameters
4 Optionally, write advanced SQL queries for your report. Writing Advanced Queries with SQL
5 Optionally, customize a report template to meet your needs. Customizing BIRT Report Templates
6 Add subreports to your report. Adding Subreports
7 Generate your report for viewing. Viewing Reports
8 Generate a chart for viewing. Displaying Charts
9 Optionally, generate a code-change impact report. Generating Code-Change Impact Reports

Related Concepts New Reports Requirements Reports Test Plan Reports Report Generation Code Coverage Analysis Execution Reports Related Procedures Creating Reports Generating Reports Managing Reports


Creating New Reports


To create a new report:
1 Click Reports on the workflow bar.
2 In the Reports directory tree, select the folder in which you want the new report to appear. This determines where the report will be stored in the directory tree.
3 Click New Child Report on the toolbar.
4 On the Create New Report dialog box, enter the name of the new report. This is the name that will appear in the Reports tree.
5 Check the Share this report with other users check box if you want to make this report available to other users.
6 In the Timeout [s] field, specify the maximum time period in seconds that Test Manager should wait for SQL queries to complete.
7 From the Default tab list box, select the tab that you want to be directed to when you select this report from one of the context-sensitive report lists.
8 Select the corresponding result type from the Result Category list box. This setting specifies the database table and view that is to be filtered for the report. The following result types are available:

Requirement - Returns requirements available in the requirements module that meet the query criteria.
Test Definition - Returns test definitions available in the Test Plan module that meet the query criteria.
Test Definition Execution - Returns executed test definition results from the Executions module that meet the query criteria.
Execution Definition - Returns execution definitions from the execution module.
Issue - Returns issues (including imported issues).
Requirement Progress Builds - Contains information on requirements progress per build so that you can see how requirements develop across builds.
Requirement Progress Days - The same as Requirement Progress Builds, but shows development on a daily basis.
Test Definition Progress Builds - Shows how test definitions develop across builds.
Test Definition Progress Days - Same as above, but shows development on a daily basis.
Each result type offers a set of selection criteria. Based on the Result Type you have selected, specify an appropriate Selection Criteria for your report. These criteria typically group properties based on a view or some other intuitive grouping (for example, custom properties).
9 From the Property list box, select the property that is to be filtered on. For some selection criteria, properties are dynamic.
10 Select an Operator for the query. The available operators depend on the property. Example operators are =, not, like, not like. Strings are always compared lowercase. Allowed wildcards for strings are * and ? (where * matches any characters and ? matches exactly one character).
11 Select or specify the Value that the query is to be filtered on. For date-based properties, the Value field is replaced with a calendar tool that you can use to select a specific date.
12 (optional) To add an additional query string to this report, click More. An existing query string can be deleted by clicking the string's Delete button. When multiple query strings are defined, AND and OR radio buttons appear next to the More button. Use these option buttons to define if the queries should be considered cumulatively (AND), or if only one query string's criteria needs to be met (OR).
13 Click Next to configure report columns on the New Report dialog box.

To create columns:
1 Click Add Columns to display the Add Columns dialog box. All available report columns are listed.
2 Select the columns that you want to have included in the report and click OK (multiple columns can be selected by holding down the CTRL key).
Note: For test-planning reports, the list of available column names is enhanced with the column names from the LQM_v_tests table. See the SilkCentral database documentation for full details.
3 The selected columns appear in tabular format on the New Report dialog box. From here you can configure how each report column is to be displayed. For each column, specify a sort direction (ascending, descending, or unsorted) using the up/down arrows in the Sorting column. When a column is selected for sorting, a list box is displayed in the Sort Order column that allows you to more easily edit the column-sort order. Set these numbers as required.
4 Give each column an Alias. This is the name by which each column will be labeled in the generated report.
5 With grouping, you can take advantage of SQL aggregation features (for example, selecting a number of elements or querying a total sum of values). Check the Group by check box on the column selection dialog to specify that SQL group by functions are to be applied. Columns that are not selected for SQL group by functions are set to aggregation by default (meaning, a single aggregate value will be calculated). From the Aggregation list box, select the appropriate aggregation type (Count, Sum, Average, Minimum, or Maximum).
6 The Actions column enables you to move column listings up and down in the view. The Move Up and Move Down functions do not affect the outcome of the report.
Note: Any report column can be deleted by clicking the column's Delete button.
7 Click Finish to complete your new report.
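The Group by and Aggregation settings correspond to standard SQL grouping. Purely as an illustration (not the exact SQL that Test Manager generates), a grouped report query might look like the following sketch. LQM_v_tests is the view mentioned in the note above; the column names Status and TestDefID are assumptions and must be checked against the SilkCentral database documentation before use.

-- Illustrative sketch only: count test definitions per status.
-- Column names (Status, TestDefID) are assumed; verify them against the schema.
SELECT Status, COUNT(TestDefID) AS NumberOfTests
FROM LQM_v_tests
GROUP BY Status
ORDER BY NumberOfTests DESC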

Related Concepts New Reports Report Generation Related Procedures Analyzing Test Results - Quick Start Task Creating Reports Customizing Reports with BIRT Generating Reports Using Context-Sensitive Reports Managing Reports Related Reference Reports Unit Interface


Editing Report Properties


To edit the properties of a report
1 Click Reports on the workflow bar.
2 Select the report in the Reports tree.
3 On the Properties tab, click Edit. The Edit Report dialog box displays.
4 Modify the Name and Description of the report as required.
5 Ensure that the Share this report with other users check box is checked if you intend to have this report shared with other users.
6 From the Default tab list box, select the tab that you want to be directed to when you select this report from one of the context-sensitive report lists.
7 Specify one of the following options to indicate how the report can be edited:
Simple report: You can modify the Selection criteria, thus changing the results of the selected report, or you can click Advanced Query to modify the SQL query code.
Advanced report: If you have familiarity with SQL, you may edit the query code in the Report data query field. To assist you in editing SQL queries, a list box of function placeholders (for example, variables) is available. To insert one of the available pre-defined functions, select the corresponding placeholder from the Insert placeholder list box.
Note: If you manually edit the SQL code for the query, upon finishing, click Check SQL to confirm your work.
8 Click Finish to save your changes.

Related Concepts Report Generation Related Procedures Creating New Reports Customizing BIRT Report Templates Creating Reports Generating Reports Managing Reports Related Reference Report Properties tab


Editing Report Parameters


To edit report parameters
1 Click Reports on the workflow bar.
2 Select a report in the Reports tree.
3 Click the Parameters tab. If the report has parameters defined for it, the parameters will be listed there.
4 Click Edit Parameters. The Edit Parameters dialog box displays.
5 Edit the Label or Value of the listed parameters as required.
6 From the Usage field, select the usage type of the parameter (constant value, start time, end time).
7 Click OK to save your changes.

Related Concepts Report Generation Related Procedures Creating New Reports Creating Reports Generating Reports Managing Reports Related Reference Report Parameters tab


Writing Advanced Queries with SQL


Advanced reports can be created through manual SQL coding. Virtually any reporting option is available if you know the database schema. Clicking Advanced hides the query string list boxes explained in the section above and opens a Report data query field in which you can insert existing code or write new SQL code. One approach is to begin query-string construction using the list boxes as outlined above (if the report criteria are valid, the equivalent SQL statement will be generated and displayed), and then move to advanced mode for further modifications.
Note: You cannot move from advanced mode back to simple mode.
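As a rough sketch of what an advanced query might look like: the view name LQM_v_tests is taken from the Creating New Reports topic, but the column names and the literal project ID below are assumptions that must be checked against the SilkCentral database documentation and your own schema before use.

-- Illustrative sketch only; column names and the project ID are assumed.
SELECT TestDefID, TestName, Status
FROM LQM_v_tests
WHERE ProjectID = 1 AND Status = 'Failed'
ORDER BY TestName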

To write an advanced query directly in SQL


1 Click Reports on the workflow bar.
2 In the Reports directory tree, select the folder in which you want the new report to appear (Requirements, Test Plan, Issues, etc.). This determines where the report will be stored in the directory tree.
3 Click Create New Report on the toolbar.
4 On the Create New Report dialog box, enter the name of the new report. This is the name that will appear in the Reports tree.
5 Check the Share this report with other users check box if you want to make this report available to other users.
6 Enter a description of the report in the Description field.
7 Click Advanced to open the Report data query field. Insert previously written code as necessary, or write new code directly in the field. To assist you in writing SQL queries, a list box of Test Manager function placeholders is available. See the following section for details regarding available placeholders. To insert one of the available pre-defined functions, select the corresponding placeholder from the Insert placeholder list box.
Note: If you manually edit SQL code for the query, click Check SQL to confirm your work.
8 Once you have completed editing the report's properties, click Finish to save your settings.

Related Concepts New Reports Report Generation Related Procedures Analyzing Test Results - Quick Start Task Creating Reports Customizing Reports with BIRT Generating Reports Using Context-Sensitive Reports Managing Reports Related Reference Reports Unit Interface


Customizing BIRT Report Templates


With BIRT RCP Designer, you can customize Test Manager's pre-installed report templates and create custom report templates; see the SilkCentral Administration Module documentation and the BIRT RCP Designer help for details. Modified report templates can be uploaded using the Upload link on the Report tab.

To download an existing template for editing:


1 Select a report that utilizes the BIRT Report Template from Test Manager > Reports in the menu tree.
2 Select the Properties tab.
3 Click Download BIRT Report Template. You receive the report data as a generic BIRT report template (empty). The datasource is already configured.
4 Once you have saved the template to your local system, modify it as required.
5 Once complete, upload it using the Upload link on the Report tab. For detailed information on configuring BIRT report templates, please refer to the SilkCentral Administration Module Help.

Related Concepts New Reports Report Generation Related Procedures Analyzing Test Results - Quick Start Task Creating Reports Customizing Reports with BIRT Generating Reports Managing Reports Related Reference Report Properties tab


Adding Subreports
To aggregate the results from multiple reports into the currently selected report, you can add subreports. When adding a report as a subreport, the result columns and rows of the subreport are concatenated to the results of the selected report.

To add a report as a subreport


1 Click Reports on the workflow bar.
2 Select a report in the Reports tree.
3 On the Properties tab, click Add Subreport. The Add Subreport dialog box displays.
4 Select the subreport you want to have appended to the current report by selecting it from the Reports tree-list.
5 Click OK to complete the addition of the subreport. Subreports appear on the associated report's Properties tab in a section called Subreports.

Related Concepts Report Generation Related Procedures Creating New Reports Customizing BIRT Report Templates Creating Reports Generating Reports Managing Reports Related Reference Report tab


Viewing Reports
Because each template expects a certain data format to produce a useful graph, not all templates can be applied to all report queries. You will receive an error message if you attempt to generate a report through an incompatible report template. For example, selecting the Four Values Per Row As Horizontal Bar template to display the Requirements Status Overview report works because this particular Microsoft Excel template requires exactly the four values (failed, passed, not executed, and not covered) that the report query delivers.

To generate a report
1 Click Reports on the workflow bar.
2 In the Reports tree, select the report you want to generate.
3 Select the Report tab.
4 Click the Select Report Template icon.
5 From the Select Report Template dialog box, select the template you wish to use.
6 Click OK to display the report.
7 (optional) If necessary, select an alternate view magnification for the report from the list box. 100% is the default magnification. Other options are 50%, 75%, 150%, and 200%.

Related Concepts Report Generation Related Procedures Analyzing Test Results - Quick Start Task Context-Sensitive Reports Generating Reports Managing Reports Related Reference Report tab


Displaying Charts
To display a chart
1 Click Reports on the workflow bar.
2 Select a report in the Reports tree for which you want to view a chart.
3 Select the Chart tab to display the default chart.
4 To select a required chart type, click the Select Chart Type icon.
5 On the Select Chart Type dialog box, select a chart type.
6 Select the view properties that you want to apply to the chart (3D view, Show horizontal grid lines, Show vertical grid lines, and Show legend).
7 Specify how these chart options are to be saved:
Select For current user only to have these chart settings override the report's standard settings whenever the current user views this chart.
Select As report standard to have these chart settings presented to all users who don't have overriding user settings defined. This setting does not affect individual user settings.
8 Click OK to display the new chart type.
Note: The chart configurations you define here become the defaults for this report. When standard charts and graphs are not able to deliver the specific data that you require, or when they cannot display data in a required format, you can customize the appearance of queried data using the Test Manager reporting functionality.
Note: To open the current chart in a separate browser window, click the Open in new window icon at the top of the Chart tab.

Related Concepts Report Generation Related Procedures Customizing BIRT Report Templates Displaying Charts Creating Reports Generating Reports Managing Reports Related Reference Report Chart tab


Generating Code-Change Impact Reports


To generate a code-change impact report
1 Click Projects on the workflow bar.
2 Select the project for which you want to analyze code-coverage data.
3 Click Code Analysis on the workflow bar.
4 Click Create Code Change Impact Report on the main toolbar. The Select Classes for Report dialog box displays. Select a Product and Version if you want to change the pre-selected values.
5 In the Filter field, enter criteria to filter the packages. For example, entering the string "published" will only list packages that contain the string "published" in their names.
6 Select a package from the Packages pick list. You can select multiple packages by holding down the CTRL key while clicking listed packages. The classes that are available in the selected package appear in the Classes pick list.
7 Select a class file that you want to have included as a source in your report. You can select multiple classes by holding down the CTRL key while clicking listed classes.
8 Click Add to add the class file(s) to the Selected classes pick list. You can remove classes from the Selected classes pick list by selecting entries and clicking Remove. Click Remove All to remove all selected classes from the Selected classes pick list.
9 Repeat the preceding steps (from selecting a package in the Packages pick list through clicking Add) until you have added all required classes to the Selected classes list.
10 Select a report from the Select report list box.

Related Concepts Code-Change Impact Reports Report Generation Test Manager Code Analysis Code Coverage Analysis Related Procedures Analyzing Test Results - Quick Start Task Enabling Code Analysis for Execution Definitions Analyzing Code Coverage Related Reference Code Analysis Unit Interface


Configuring Projects - Quick Start Task


This quick start task details the tasks that are involved in configuring projects in SilkCentral Test Manager.
Note: Only your system administrator can create a new project for your use.

To configure project settings:


1 Configure settings for your project. Configuring Project Settings
2 Create custom attributes. Creating Custom Attributes
3 Create global filters. Creating Global Filters
4 Configure change notification. Enabling Change Notification
5 Create custom step properties. Creating Custom Step Properties

Related Concepts Settings Configuration Related Procedures Configuring Test Manager Settings


Configuring Project Settings


To customize project settings:
1 Navigate to Test Manager > Settings.
Note: If you have not already selected a project, a warning message will appear, asking you to select a project. Select the project for which you want to define global settings.
2 Select the Project Settings tab to view the current settings. The Project Settings page displays the current project settings.
3 Click Edit to modify the current project settings. The Edit Project Settings dialog box displays. You can specify the following information:
Build Information File Name: Build information files contain project information, including build number, build log location, error log location, and build location. Enter the name of your project's build information file in this field. All test executions will read the build information from this specified file.
Project Release Date: Enter your project's planned release date in the format MM/DD/YYYY.
File Extensions to ignore in Results: Specify result file types or other file types that should not be saved as results for test executions.
Note: File extensions must be separated by commas (for example, xlg, *_, res). Changes made in the Build Information File Name and File Extensions to ignore in Results fields will not affect scheduled test definitions. To redistribute tasks to execution servers, you must reschedule test definitions, or disconnect from and reconnect to the database.
4 Click Save to save your project settings.

Related Concepts Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Related Reference Settings Unit Interface


Creating Custom Attributes


To create a custom attribute
1 Navigate to Test Manager > Settings.
Note: If you have not already selected a project, a warning message displays, asking you to select a project. Select the project for which you are defining custom attributes.
2 Select the Attributes tab to view the list of current attributes.
3 Click New Attribute. The New Attribute dialog box displays.
4 Enter a Name for the new attribute. This name will be displayed in list boxes when the attribute becomes available for use.
5 Enter a Description of the new attribute.
6 Select the Attribute type.
7 Depending on the attribute type you have selected, you can now continue as follows:
If you have selected the attribute type Edit, you can now click OK to save the new custom attribute, or click Cancel to abort the operation.
If you have selected the attribute type Normal or Set, you can define values. To define a new value, click New Value. Enter the value into the Value field on the New Value dialog box and click OK. The new value is then listed in the Value table, where you can edit it by clicking the name of the value, or delete it by clicking the Delete icon.
Click OK to save the new attribute, or click Cancel to abort the operation. You will be returned to the Attributes list; the new attribute is now listed.

Related Concepts Attributes Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Related Reference Attributes tab


Creating Global Filters


To create a global filter
1 Navigate to Test Manager > Settings.
Note: If you have not already selected a project, a warning message will appear, asking you to select a project. Select the project for which you are defining global settings.
2 Select the Filters tab to view the list of available filters.
3 Click New Filter. The New Filter dialog box displays.
4 Enter a Name for the new filter. This name will be displayed in list boxes when the filter becomes available.
5 Select a Category for the new filter from the list box to make the filter available in a specific Test Manager unit:
Requirement Filter: The filter will be available in the Requirements Management unit.
Test Definition Filter: The filter will be available in the Test Plan Management unit.
Execution: The filter will be available in the Test Execution Management unit.
6 Enter a Description of the new filter.
7 Select a category of filter criteria (Selection criteria). The available categories depend on the general filter category you have selected. You can also combine filters by selecting Nested Test Definition Filter or Nested Requirements Filter. Selecting one of these categories allows you to include an existing test definition filter (for example, an existing requirements filter) in your new filter.
8 Select a Property, Operator, and Value for the new filter from the respective list boxes.
Property: Available properties depend on the filter category that you have selected in the previous step. The property defines what you are filtering on. If you have selected an attribute category, the property list includes custom attributes to query against.
Operator: Specifies the filter operator. The operator depends on the property type you have selected. For example, if you have selected a property that is based on a string field type, the available operators are = (equals the defined value), not (differs from the defined value), contains (contains the defined value somewhere in the string), and not contains (does not contain the defined value in the string).
Value: Enter the value that you want to filter on. Depending on the property type that you have selected, values will either be strings that you can enter into the text box, or a selection of predefined values that you can select from the list box.
9 Click More if you want to add more than one filter category to the new filter. Repeat this procedure to define new categories. If you define more than one filter category, you must define whether the categories need to be fulfilled in addition to the existing categories (AND relationship), or if the filter returns true when any of the filter categories are fulfilled (OR relationship). Select either AND or OR to define the filter category relationship. You can't define nested AND, OR relationships. To remove filter categories, click Fewer. This removes the last filter category.
10 When you are done, click OK to save the new filter, or click Cancel to abort the operation.


Related Concepts Global Filters Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Creating Filters Related Reference Filters tab


Enabling Change Notification


To enable change notification
1 Navigate to Test Manager > Settings.
2 Select the Notifications tab.
3 Click Configure Notification to open the Configure Change Notification dialog box.
4 If you want to be notified by email when changes are made to requirements in the currently selected project, check the Changes on Requirements check box.
5 If you want to be notified by email when changes are made to test plans within the currently selected project, check the Changes on Test Plan check box.
6 Click OK to save the notification settings, or click Cancel to abort the operation without saving changes.
You will be notified by email about the changes you have activated.

Related Concepts Change Notification Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Related Reference Notifications Page


Creating Custom Step Properties


To create a new custom step property
1 Navigate to Test Manager > Settings.
2 Select the Step Properties tab.
3 Click New Property to display the New Custom Step Property dialog box.
4 Enter a name for the new property in the Name field.
Note: Custom step property fields are always declared as type string.
5 Click OK to make your custom property available to all manual test steps in the selected Test Manager project.

Related Concepts Settings Configuration Related Procedures Configuring Projects - Quick Start Task Custom Step Properties Configuring Test Manager Settings Related Reference Step Properties Page


Managing Requirements - Quick Start Task


This quick start task details the tasks that are involved in managing requirements in SilkCentral Test Manager.

To manage requirements:
1

If you are using an external requirements-management tool, configure integration for the tool. Note: See related procedures below for information on configuring integration with your requirements-management tool.

Create your requirements. Creating Requirements If you have integrated a requirements-management tool, configure the requirement type of your requirements. Configuring Requirement Types Attach files to your requirements. Attaching a File to a Requirement Create custom filters for your requirements. Creating Filters Create advenced custom filters for your requirements. Creating Advanced Filters Generate a test plan from your requirements. Generating Test Plans from Requirements View

Related Concepts Requirements Management Related Procedures Integrating External RM Tools Enabling Integration with Borland CaliberRM Enabling Integration with IBM Rational RequisitePro Enabling Integration with Telelogic DOORS Managing Requirements


Creating Requirements
Test Manager allows you to create new requirements, edit and delete existing requirements, and add custom property fields to requirements. Newly created Test Manager projects do not contain requirements.

To create a new requirement


1 Navigate to Test Manager > Requirements.
2 Click New Requirement on the toolbar.
Note: If the project you are working with does not yet have any requirements associated with it, click the <Click here to add Requirements> link in the Requirements tree to open the New Requirement dialog box.
3 On the New Requirement dialog box, enter a meaningful Name and Description for the requirement.
Note: Test Manager supports HTML formatting and cutting and pasting of HTML content for description fields.
4 Select the appropriate Priority, Risk, and Reviewed status from the list boxes.
5 If custom requirements have been defined, enter in the Custom Property text box any custom property data that you want tracked with this requirement.
Note: The Priority, Risk, Reviewed, and any Custom Property fields will be configured automatically with the corresponding properties of the parent requirement if you check the Inherit from parent check boxes for these properties.
6 Click OK to create a new top-level requirement.
Note: Alternatively, you can click OK and New Requirement to both save the newly created requirement and open the New Requirement dialog box to create an additional top-level requirement. Or, you can click OK and New Child Requirement to have the New Child Requirement dialog box open after the new top-level requirement is created.

Related Concepts Requirements Management Related Procedures Managing Requirements - Quick Start Task Creating Child Requirements Managing Requirements Related Reference Requirements Unit Interface HTML Support for Description Text Boxes


Configuring Requirement Types


To configure requirement type
1 Click Requirements on the workflow bar.
Note: Configuration of requirement type for CaliberRM, RequisitePro, and DOORS is only enabled for top-level requirements in the tree (requirements that are a direct child of the project node). All other requirements share the requirement type of their parents. A requirement without a configured requirement type is not available for upload. Import of requirements automatically configures an appropriate requirement type.
2 From Requirements View, at the requirement level, select the Properties tab.
3 Click Map Requirement to select a requirement type from the list. Requirement type is a categorization used by CaliberRM, RequisitePro, and DOORS and is required for synchronization.
Note: Map Requirement is only enabled when external requirements integration is enabled in the Settings unit (Integrations Configuration tab) and if the requirement has not yet been uploaded to the external requirements management tool. Additionally, the option Enable upload of requirements to... must be enabled.
4 Click OK to save your settings and close the dialog box.

Related Concepts Requirements Management Related Procedures Managing Requirements - Quick Start Task Test Coverage Status Managing Requirements Related Reference Requirement Properties tab


Attaching a File to a Requirement


To attach a file to a requirement
1 Click Requirements on the workflow bar.
2 Select a requirement in the Requirement tree view.
3 Select the Attachments tab. When requirements management integration has been enabled between a Test Manager project and a CaliberRM project, the Attachments tab (Requirements > Attachments) includes an Open CaliberRM button, which enables you to manage requirement attachments directly in CaliberRM.
4 Click Upload File to open the Upload File dialog box.
5 Using Browse, select a file from your local file system.
6 Enter a meaningful Description for the attachment.
7 Click OK to upload the attachment to the server and associate it with the selected requirement.

Note: Attaching files to a test plan element may not work in Mozilla Firefox. Firefox requires usage of three slashes (for example: "file:///") for a file link, while other browsers require only two (for example: "file://"). Additionally, Firefox includes a security feature that blocks links from remote files to local files and directories. For more information, see http://kb.mozillazine.org/Firefox_:_Issues_:_Links_to_Local_Pages_Don't_Work

Related Concepts Attachments Requirements Management Related Procedures Managing Requirements - Quick Start Task Managing Requirements Related Reference Requirement Attachments tab


Creating Filters
To create a new custom filter:
1 Click the appropriate button (Execution, Requirements, or Test Plan) on the workflow bar.
2 Click New Filter on the toolbar to display the New Filter dialog box.
3 From the Property list box, select the property on which you wish to base the new filter (for example, Name, Description, Priority, Version, and Build).
4 From the Operator list box, select a logical operator to be applied to the specified property (for example, =, not, >, >=, <, <=, contains, and does not contain).
  Note: The contents of the Operator and Value list boxes vary based on the attribute selected in the Property field.
5 In the Value field, enter the value that the specified property is to be compared against.
  Note: For date-based properties, the Value field is replaced with a calendar tool that you can use to select a specific date.
6 Click Save and apply to open the Edit Filter dialog box. To apply the filter to the current view without saving the filter settings, click Apply.
7 On the Edit Filter dialog box, enter a name for the filter in the Name field.
8 Enter a meaningful description for the filter in the Filter field.
9 Click OK to save the filter with your project.

Related Concepts Filtering Related Procedures Creating Advanced Filters Creating Global Filters Working with Filters


Creating Advanced Filters


Advanced custom filters enable you to combine simple filters to create complex filters that apply multiple filter criteria simultaneously.

To create an advanced custom filter


1 Click the appropriate button (Execution, Requirements, or Test Plan) on the workflow bar.
2 Create a new custom filter.
3 After you have defined your first filtering rule, click Advanced to open the Edit Filter dialog box.
4 Enter a name for the filter in the Name field.
5 Give the filter a meaningful Description.
6 Click More to display a second set of filter-parameter fields with which you can define a second set of filter parameters.
7 Select a logical operator for the application of the filtering queries. For example, filtered elements must meet both sets of criteria (and), or filtered elements must meet one, but not both, of the criteria sets (or).
8 To delete a filter-parameter string, click the corresponding Delete button.
9 To display additional filter-parameter fields and create additional filter queries, click More. To remove excess filter-parameter sets, click Fewer.

Related Concepts Filtering Related Procedures Creating Filters Creating Global Filters Working with Filters


Generating Test Plans from Requirements View


To generate a new test plan from Requirements View:
1 Click Requirements on the workflow bar.
2 From Requirements View, with at least one requirement available in the Requirements tree, right-click the requirement or project node that is to be converted into a Test Plan tree.
3 Select Generate Test Plan to display the Generate Test Plan from Requirements dialog box. This dialog box enables you to specify whether the leaves (lowest-level nodes) of the selected requirements subtree should be converted into test definitions or test folders, and whether the tree should be generated into a new test container or an existing container.
4 Enter a name for the new test container in the Enter Name field and select a product from the Select Product list box to create the container within the active Test Manager project. The Select Product list box is populated with the products that are configured by a project manager. See the SilkCentral Administration Module documentation or ask your project manager for detailed information.
5 If you have defined a source control profile (see the SilkCentral Administration Module documentation or ask your Test Manager administrator for detailed information), select the source control profile you want to use for managing the test definition sources from the Select Source Control Profile list box.
6 To include all child requirements of the selected requirement in the test plan, check the Include child requirements check box (the default).
7 To have the new test definitions that you generate automatically assigned to the requirements from which they are created, check the Assign newly generated Test Definitions to Requirements check box. If this option is not selected, test definitions must be manually associated with requirements.
  Note: This option is not available when checking Generate Test Folders from Requirement Tree leaves.
8 Click OK to create the test plan, which has the same structure as the Requirements tree. A message displays, asking if you want to switch directly to the Test Plan unit. Click Yes to view the test plan in Test Manager's Test Plan unit, or click No to remain in the Requirements unit.

Related Concepts Test Plan Generation Requirements Management Related Procedures Managing Requirements - Quick Start Task Managing Requirements Related Reference Requirements Unit Interface


Managing Test Executions - Quick Start Task


This quick start task details the tasks that are involved in managing test executions in SilkCentral Test Manager.

To manage test executions:


1 Create your execution definitions. Adding Execution Definitions
2 Manually assign test definitions to your execution definitions. Manually Assigning Test Definitions to Execution Definitions
3 Assign test definitions from Grid View to your execution definitions. Assign Test Definitions from Grid View to Execution Definitions
4 Assign test definitions to your execution definitions using a filter. Using a Filter to Assign Test Definitions to Execution Definitions
5 Create execution schedules for your execution definitions. Creating a Custom Schedule for an Execution Definition
6 Configure setup and cleanup execution definitions. Configuring Setup and Cleanup Executions
7 Add execution dependencies. Adding Dependent Execution Definitions
8 Assign execution servers to your execution definitions using hardware-provisioning keywords. Assigning Keywords to Execution Definitions
9 Execute your tests. Executing Individual Tests
10 View results of your test executions. Viewing Test Execution Details

Related Concepts Execution Definitions Execution Dependency Configuration Setup and Cleanup Test Definitions Test Definition Execution Related Procedures Manual Test Execution Executing Manual Tests Working with SilkPerformer Projects


Adding Execution Definitions


To add an execution definition
1 Click Execution on the workflow bar.
2 Select an existing folder in the Execution tree, or select the project node.
3 Click New Execution Definition on the toolbar (or right-click within the Execution tree and choose New Child Execution Definition). The New Execution Definition dialog box displays.
4 Enter a name and meaningful description for the execution definition.
  Note: Test Manager supports HTML formatting and cutting and pasting of HTML content for Description fields.
5 Select a test container from the Test Container list box. The Version and Build that are associated with the product that the container is associated with are then populated automatically in the Version and Build fields. You may only associate one test container with a test execution.
6 Select a product Version and Build from the list boxes. If a build information file is available on the execution server, you have the option to check the Read from Build Information file check box, in which case build information will be read from the build information file for the test run, overriding any manual settings that have been selected on the New Execution Definition dialog box.
7 Specify a Priority for the execution definition from the list box (Low, Normal, or High).
8 In the Source Control Label field, you can optionally specify that the execution definition use an earlier version than the latest version.
9 Click OK to update the Execution tree with the newly created execution definition.

Related Concepts Test Definition Execution Execution Definition Schedules Build Information Related Procedures Managing Test Executions - Quick Start Task Execution Definitions Working with Execution Definitions Executing Test Definitions Creating an Execution Definition in Grid View Related Reference Execution Unit Interface HTML Support for Description Text Boxes


Manually Assigning Test Definitions to Execution Definitions


The test definitions that are assigned to the selected execution definition are listed on the Assigned Test Definitions tab (Execution View only). The properties of the assigned test definitions that are listed in tabular view include:

Test Definition Name
Test Definition Status
Last Execution of the test definition

To manually assign test definitions to an execution definition


1 Click Execution on the workflow bar.
2 Select the execution definition to which you are assigning test definitions.
3 Select the Assigned Test Definitions tab. All test definitions of the test container which is associated with the selected execution are displayed in the Available Test Definitions window.
4 Click the assign arrow of any test definition that you want to assign to the currently selected execution definition. Clicking the assign arrow of a folder or the top-level container assigns all child test definitions of that parent to the selected execution definition.
5 Click Apply to save the assigned test-definition list.
  Note: If you do not click Apply, changes you make to the Assigned Test Definitions will be lost.

Related Concepts Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Executing Test Definitions Assign Test Definitions from Grid View to Execution Definitions Using a Filter to Assign Test Definitions to Execution Definitions Related Reference Execution Assigned Test Definitions Tab


Assign Test Definitions from Grid View to Execution Definitions


The test definitions that are assigned to the selected execution definitions are listed on the Assigned Test Definitions tab (Execution View only). The properties of the assigned test definitions that are listed in tabular view include:

Test Definition Name
Test Definition Status
Last Execution of the test definition

To assign one or more test definitions from the test plan Grid View to one or more execution definitions:
1 Click Test Plan on the workflow bar.
2 Click Grid View on the toolbar.
3 Select the test definitions you want to assign to execution definitions. You can use your keyboard's Ctrl and Shift keys to select multiple test definitions using standard browser multi-select functions.
4 Right-click the selected test definitions and choose Save Selection.
5 Click Execution on the workflow bar.
6 Select the execution definition to which you want to assign the selected test definitions.
7 Choose Assigned Test Definitions.
8 Click Assign Saved Selection.
  Note: Only test definitions that reside in the execution definition's test container are inserted. You can insert the selected test definitions into more than one execution definition. You cannot insert them into execution definitions in a different project. The selection persists until you make a different selection or close Test Manager.

Related Concepts Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Executing Test Definitions Using a Filter to Assign Test Definitions to Execution Definitions Manually Assigning Test Definitions to Execution Definitions Related Reference Execution Assigned Test Definitions Tab


Using a Filter to Assign Test Definitions to Execution Definitions


The test definitions that are assigned to the selected execution definition are listed on the Assigned Test Definitions tab (Execution View only). The properties of the assigned test definitions that are listed in tabular view include:

Test Definition Name
Test Definition Status
Last Execution of the test definition

To use a filter to assign test definitions to an execution definition:


1 Create a filter in the Test Plan unit. Refer to the Creating Filters procedure for details. If the filter already exists, skip this step.
2 Click Execution on the workflow bar.
3 Select the execution definition to which you are assigning test definitions.
4 Select the Assigned Test Definitions tab. All test definitions of the test container which is associated with the selected execution are displayed in the Available Test Definitions window.
5 Select By Filter from the test definition assignment types.
6 Choose the filter from the list box.
7 Click Apply to save the assigned test-definition list.
  Note: If you do not click Apply, changes you make to the Assigned Test Definitions will be lost.

If you assign test definitions to an execution definition in Test Plan Grid View, the test definition assignment type is automatically set to Manual, but the previously-filtered test definitions remain in the Assigned Test Definitions tab.

Related Concepts Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Executing Test Definitions Creating Filters Assign Test Definitions from Grid View to Execution Definitions Manually Assigning Test Definitions to Execution Definitions Related Reference Execution Assigned Test Definitions Tab


Creating a Custom Schedule for an Execution Definition


To create a custom schedule for a selected execution definition:
1 Click Execution on the workflow bar.
2 Select an execution definition for which you want to configure a custom schedule.
  Note: To schedule a folder for execution, select a folder node. To save an edited version of a global schedule as a custom schedule, click Edit while the global schedule is selected in the list box. This enables you to edit the global schedule and save the result as a custom schedule.
3 Select the Schedule tab.
4 Click the Custom option button to enable the scheduling controls.
5 Click next to the From field and specify when the execution schedule is to begin (Month, Day, Year, Hour, Minute) using the calendar tool.
6 Specify the interval at which the execution's tests are to be executed (Day, Hour, Minute).
7 In the Run portion of the GUI, specify when the execution is to end. Select Forever to define a schedule with no end, or click next to the to field and specify when the execution schedule is to end (Month, Day, Year, Hour, Minute) using the calendar tool.
8 (Optional) Click Add Exclusion to define times when scheduled execution definitions should not be executed, or click Add Definite Run to define times when unscheduled executions should be executed.
9 Click Save to save your custom schedule.

Related Concepts Execution Definition Schedules Test Definition Execution Execution Definitions Related Procedures Managing Test Executions - Quick Start Task Adding Definite Runs Adding Exclusions Adding Execution Definitions Executing Test Definitions Related Reference Execution Schedule tab


Configuring Setup and Cleanup Executions


To define a test definition as a setup or cleanup test definition
1 Click Execution on the workflow bar.
2 Click the execution definition for which you are assigning a setup or cleanup test definition.
3 Click the Setup/Cleanup tab.

To define a setup test definition, proceed with the following step. To define a cleanup test definition, proceed with step 7.

4 Click Edit in the Setup Test Definition portion of the tab. The Edit Setup Test Definition dialog box displays.
5 Browse through your project's test planning tree and select the test definition that is to serve as this execution definition's setup test definition.
6 Click OK. The assigned test definition then displays in the Setup Test Definition list.
7 Click Edit in the Cleanup Test Definition portion of the tab. The Edit Cleanup Test Definition dialog box displays.
8 Browse through your project's test planning tree and select the test definition that is to serve as this execution definition's cleanup test definition.
9 Click OK. The assigned test definition now displays in the Cleanup Test Definition list.

Related Concepts Setup and Cleanup Test Definitions Execution Definitions Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Executing Test Definitions Related Reference Execution Setup/Cleanup tab


Adding Dependent Execution Definitions


To add a dependent execution definition
1 Click Execution on the workflow bar.
2 Select the execution definition that will act as the master execution definition.
3 Select the Dependencies tab.
4 Click Add dependent Execution Definition to display the Add dependent Execution Definition dialog box.
5 From the Condition selection list, select the condition that is to trigger the dependent execution definition (Passed, Failed, Not Executed, or Any). The Any status means that the dependent test execution will trigger no matter what the status of the previous test execution.
6 From the tree menu in the dialog box, select the execution definition that is to be dependent.
7 Select one of the following options to specify where the dependent execution definition is to be deployed:

  As specified in the dependent Execution Definition: Automated test definitions assigned to the dependent execution definition will be executed on the execution server specified for the dependent execution definition on the Deployment tab. Manual test definitions assigned to the dependent execution definition will be assigned to the users specified for the dependent execution definition on the Deployment tab.

  Same as <selected execution definition's execution server>: Automated test definitions assigned to the dependent execution definition will be executed on the execution server specified for the <selected execution definition's execution server> on the Deployment tab. Manual test definitions assigned to the dependent execution definition will be assigned to the users specified for the <selected execution definition's execution server> on the Deployment tab.

  Specific Execution Server/Manual Tester: Select a pre-configured execution server and/or a manual tester from the list boxes. Automated test definitions assigned to the dependent execution definition will be executed on the specified execution server. Manual test definitions assigned to the dependent execution definition will be assigned to the specified manual tester. If only a specific manual tester is defined and no server, only manual test definitions will be executed. If only a specific execution server is defined and no manual tester, only automated test definitions will be executed.

8 Click OK to create the dependency.
  Note: Test Manager will not allow you to create cyclical execution dependencies. You can select conditions to fulfill for manual test definitions. (Example: If the selected condition is Failed and all manual tests passed, but some automated tests failed, only automated test definitions assigned to the dependent execution definition will be executed.)


Related Concepts Execution Dependency Configuration Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Executing Test Definitions Related Reference Execution Dependencies tab


Assigning Keywords to Execution Definitions


To assign keywords to the selected execution definition:
1 Click Execution on the workflow bar.
2 Select an execution definition in the Execution tree.
3 Select the Deployment tab.
4 Click Edit in the Execution Environment portion of the page. The Assign Keywords dialog box displays. All keywords that have been defined for your execution environment are listed here.
  Note: For automated execution definitions, the default reserved keywords for each execution server (#<execution name>@<location name>) are included in the list (an illustrative example follows this procedure).
5 Select keywords in the Select keywords list that reflect your execution environment requirements. You can use your keyboard's CTRL and SHIFT keys to select multiple keywords using standard browser multi-select functions.
  Tip: The Select keywords field is auto-complete enabled. When you enter alphanumeric characters into this field, the field is dynamically updated with an existing keyword that matches the entered characters. Note that this field is disabled when multiple keywords are selected in the Select keywords or Assigned Keywords lists.
  Tip: For automated execution definitions, if you only have a few execution servers and do not require hardware provisioning, you can likely get by using only the default, reserved keywords that are created for each execution server. In such cases, it is not necessary that you select additional keywords.
6 Click Add (>) to move the keyword into the Assigned Keywords list.
  Note: For automated execution definitions, the execution servers that match the assigned keywords are listed below in the dynamically-updated Matching execution servers list. This list updates each time you add or remove a keyword. Click on the name of an execution server in the list to access the execution servers in Administration > Locations.
7 Click OK to save the keywords and close the Assign Keywords dialog box.
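For example, an execution server named winexec01 at a location named Local (both names are hypothetical, used only for illustration) would appear in the list as the reserved keyword #winexec01@Local; assigning only that keyword ties the execution definition to that specific execution server.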


Related Concepts VMware Lab Manager Virtual Configurations Execution Definitions Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Configuring Deployment Environments Executing Test Definitions Creating New Keywords Removing Keywords from Execution Definitions Related Reference Execution Deployment tab


Executing Individual Tests


To run an execution definition independent of a schedule
1 Click Execution on the workflow bar.
2 Select the execution definition that is to be run.
3 Click Run on the toolbar. The Run dialog box displays.
4 Define which test definitions you want to execute. The execution definition is then queued on the specified execution server. Test definitions are executed in the order in which they are listed on the Assigned Test Definitions tab (Execution View). Details of executions can be viewed in the Projects unit, Activities tab.
  Note: If the execution definition contains manual tests that are still in progress, you will be presented with a list of these tests.
5 If the execution definition does not contain pending manual tests, the Go To Activities dialog box displays. Click Yes to view the Activities page, or click No if you want to remain on the current Web page.
  Note: Check the Don't show this dialog again (during this login session) check box if you do not want to be asked about switching to the Activities page again in the future. This setting will be discarded when you log out of Test Manager.

Related Concepts Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Updating Execution Definitions Assigning Keywords to Execution Definitions SilkTest Tests Working with Manual Tests Executing Test Definitions Related Reference Execution Assigned Test Definitions Tab Activities Page Run Dialog


Viewing Test Execution Details


To view the details of a test execution
1 Click Execution on the workflow bar.
2 Select an execution definition in the Execution tree view.
3 Select the Runs tab.
4 Click the Run ID of the execution for which you want to see details. Detailed information about the results of the execution definition is displayed.

Related Concepts Test Definition Execution Execution Definition Run Results Dialog Related Procedures Managing Test Executions - Quick Start Task Creating Test Definitions Working with Data-Driven Tests Executing Test Definitions Related Reference Test Definition Run Results Dialog Execution Runs Tab


Managing Test Plans - Quick Start Task


This quick start task details the tasks that are involved in managing test plans in SilkCentral Test Manager.

To manage test plans:


1 Create test definitions. Test definition configuration varies based on the test type you are creating (for example, automated, manual, data-driven). Creating Test Definitions
2 Edit test definitions. Test definition configuration varies based on the test type you are editing (for example, automated, manual, data-driven). Editing Test Definitions
3 Create test packages. Test packages provide additional details to the user concerning execution runs. Creating a Test Package
4 If you are creating a data-driven test definition, have your system administrator configure a data source, then proceed as explained here. Creating Data-Driven Test Definitions
5 Assign attributes to your test definitions. Assigning Attributes to Test Definitions
6 For SilkPerformer tests, add predefined parameters to your test definitions. Adding Predefined Parameters to Test Definitions
7 Create filters for your test plan. Creating Filters
8 Assign requirements to your test definitions. Assigning Requirements to Test Definitions
9 Attach files to your test definitions. Attaching Files to Test Plan Elements

Related Concepts Test Plan Management Related Procedures Managing Test Plans Working with Attachments Associating Requirements with Test Definitions Working with Data-Driven Tests Editing Test Plan Elements


Creating Test Definitions


To create a new test definition
1 Click Test Plan on the workflow bar.
2 Select a container or folder node in the Test Plan tree where you want to insert a new test definition.
3 Click New Test Definition on the toolbar or right-click within the tree and choose New Test Definition. A new test definition node is appended to the tree view, and the Test Definition dialog box opens.
4 Specify a name and meaningful description for the test definition.
  Note: Test Manager supports HTML formatting as well as the cutting and pasting of HTML content for text boxes.
5 Select one of the following test definition types from the Type list box:
  SilkTest test
  SilkPerformer test
  Manual test
  SilkTest Multi-testcase import
  NUnit test
  Windows scripting test
  JUnit test
  SilkTest plan
6 Click Next and proceed to the appropriate topic, as follows:
  If you are configuring a SilkTest test, proceed to Configuring a SilkTest Test.
  If you are configuring a SilkPerformer test, proceed to Configuring a SilkPerformer Test.
  If you are configuring a manual test, proceed to Configuring a Manual Test.
  If you are configuring a SilkTest multi-testcase import, proceed to Configuring SilkTest Multi-Testcase Import.
  If you are configuring a NUnit test, proceed to Configuring an NUnit Test.
  If you are configuring a Windows scripting test, proceed to Configuring a Windows Scripting Test.
  If you are configuring a JUnit test, proceed to Configuring a JUnit Test.
  If you are configuring a SilkTest plan test, proceed to Configuring a SilkTest plan Test.
  If you are configuring a .NET Explorer test, proceed to Configuring a .NET Explorer Test.
Note: Test Manager's well-defined public API allows you to implement a proprietary solution that meets your automated test needs. Test Manager is open and extensible to any external tool that can be invoked from a Java implementation or through a command-line call.


Note: Throughout the test-definition configuration process and across all test definition types, Inherit from parent check box options are provided where applicable, enabling you to accept settings of any existing parent entity.

Related Concepts Upload Manager Test Plan Management Test Definition Parameters Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Configuring SilkTest Test Properties Configuring SilkPerformer Test Properties Configuring Manual Test Properties Configuring JUnit Test Properties Configuring SilkTest Plan Properties Configuring NUnit Test Properties Configuring Windows Scripting Test Properties Configuring .Net Explorer Test Properties Editing Test Definitions Related Reference Test Plan Unit Interface APIs HTML Support for Description Text Boxes


Editing Test Definitions


To edit a test definition
1 Click Test Plan on the workflow bar.
2 Select the test definition or the test package that you want to edit.
  Note: Test Manager supports HTML formatting as well as the cutting and pasting of HTML content for text boxes.
3 Click Edit on the toolbar or under the General Properties section in the tab view. The Edit Test Definition dialog box displays.
4 Specify the name and description of the selected test definition.
5 If the selected test definition is a test package, the Update Package Structure on Result check box is available. Check the Update Package Structure on Result check box if you want to update the structure of the test package according to the results of the test execution run.
6 Configure the properties of the test definition or the test package according to the test definition type.

Related Concepts Upload Manager Test Plan Management Test Definition Parameters Test Packages Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Configuring Test Definition Parameters Related Reference Test Plan Unit Interface APIs HTML Support for Description Text Boxes


Creating a Test Package


To create a new test package out of a third-party test definition
1 Run the test definition once to create the output.xml file, which contains the structure of the test package.
2 In the Test Plan tree, right-click the name of the test definition and choose Convert to Test Package. The selected test definition is converted to a hierarchy representing the structure of the last execution result.

Related Concepts Usage of External IDs


Creating Data-Driven Test Definitions


To create a data-driven test definition
1 Click Test Plan on the workflow bar.
2 Create a new test definition. See the topic Creating a Test Definition for information about creating a test definition.
3 Select the newly created test definition's Properties tab.
4 Scroll down to the Data-driven Properties section of the Properties tab and select the Edit icon to open the Data-driven Properties dialog box.
5 Select a pre-configured data source from the Data Source list box. See the SilkCentral Administration Module documentation for information on configuring data sources. Click Next to continue.
6 Select a data set from the Data Set list box (in the case of Excel data sources, this is a worksheet name; in the case of database data sources, this is a table name).
7 Check the Each data row is a single test definition check box to have each row in your data set considered to be a separate test definition, or do not check this check box to create a single test definition for all data rows of your data set.
8 (Optional) You can enter a SQL query into the Filter query field to filter your data set based on a SQL-syntax query (an illustrative example follows this procedure).
  Note: Only simple WHERE clause queries are supported.
9 Check the Enable data-driven properties check box to enable data-driven functionality.
10 Click Finish to save your settings.
  Note: Data-driven property settings are visible in the lower portion of each test definition's Properties tab. To use Test Manager's data-driven test functionality with SilkPerformer scripts, data sources with column names matching the corresponding SilkPerformer project attributes must be used in conjunction with "AttributeGet" methods.
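For example, assuming a data set with columns named Region and Priority (hypothetical column names used only for illustration), a simple filter query such as Region = 'EMEA' AND Priority = 'High' would restrict the data-driven test definitions to the matching data rows. With SilkPerformer scripts, a project attribute named to match such a column would then be read in the script through the corresponding AttributeGet function (for example, AttributeGetString).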

Related Concepts Manual Tests SilkTest Test Definitions Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Working with Data-Driven Tests Managing Test Plans Related Reference Test Plan Data Set tab


Assigning Attributes to Test Definitions


To assign an attribute to a test definition
1 Click Test Plan on the workflow bar.
2 Click Test Plan View on the toolbar.
3 Select the test definition to which you are assigning an attribute.
4 Select the Attributes tab.
5 Click Add Attribute to display the Add Attributes dialog box.
6 Click the plus symbol (+) of the attribute that you are assigning. Based on the attribute type you have selected (set or normal), you will be presented with an Edit Attribute dialog box, which allows you to specify which of the available attribute values you'd like to assign to the test definition.
7 Select the value required and click OK to assign the attribute.
  Note: A Set type attribute allows you to assign a set of values to an attribute. A Normal type attribute allows you to assign only a single value.

Related Concepts Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Generating Test Plans from Requirements View Related Reference Test Plan Attributes tab


Adding Predefined Parameters to Test Definitions


To add a predefined parameter to a test definition
1 Click Test Plan on the workflow bar.
2 Click Test Plan View on the toolbar.
3 Select the test definition node for which you are adding a predefined parameter.
4 Select the Parameters tab.
5 Click Add Predefined Parameter to display the Add Predefined Parameter dialog box, which lists all of the project attributes that are available in the project file.
  Note: The Add Predefined Parameter button is only available for SilkPerformer test definitions for which the Project property has already been defined.
6 To add any of the listed parameters, click the corresponding add icon.
7 On the dialog box that displays, specify the actual value for the parameter.
8 Click Save to add the parameter to the active Test Plan tree node.

Related Concepts Test Definition Parameters Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Generating Test Plans from Requirements View Related Reference Test Plan Parameters tab


Creating Filters
To create a new custom filter:
1 Click the appropriate button (Execution, Requirements, or Test Plan) on the workflow bar.
2 Click New Filter on the toolbar to display the New Filter dialog box.
3 From the Property list box, select the property on which you wish to base the new filter (for example, Name, Description, Priority, Version, and Build).
4 From the Operator list box, select a logical operator to be applied to the specified property (for example, =, not, >, >=, <, <=, contains, and does not contain).
  Note: The contents of the Operator and Value list boxes vary based on the attribute selected in the Property field.
5 In the Value field, enter the value that the specified property is to be compared against.
  Note: For date-based properties, the Value field is replaced with a calendar tool that you can use to select a specific date.
6 Click Save and apply to open the Edit Filter dialog box. To apply the filter to the current view without saving the filter settings, click Apply.
7 On the Edit Filter dialog box, enter a name for the filter in the Name field.
8 Enter a meaningful description for the filter in the Filter field.
9 Click OK to save the filter with your project.

Related Concepts Filtering Related Procedures Creating Advanced Filters Creating Global Filters Working with Filters


Assigning Requirements to Test Definitions


To manually assign requirements to test definitions
1 Click Test Plan on the workflow bar.
2 Select the test definition to which you are assigning requirements.
3 In Test Plan View, select the Assigned Requirements tab. All requirements that are available for assignment are displayed in the Available Requirements window.
  Note: The Available Requirements window can be broadened or narrowed by dragging the window splitter (the left-hand edge of the window) to the left or right.
4 Click the arrow of any requirement to assign it to the currently selected test definition.
  Note: Newly generated test definitions can automatically be assigned to the requirements from which they are generated by selecting the Assign newly generated test definitions to requirements check box on the Generate Test Plans from Requirements dialog box (the default behavior).

Related Concepts Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Working with Data-Driven Tests Managing Test Plans Generating Test Plans from Requirements View Related Reference Test Plan Assigned Requirements tab


Attaching Files to Test Plan Elements


To attach a file to a test plan element
1 Click Test Plan on the workflow bar.
2 Click Test Plan View on the toolbar.
3 Select a container, folder, or test definition.
4 Select the Attachments tab.
5 Click Upload File to open the Upload File dialog box.
6 Using Browse, select a file from your local file system.
7 Enter a meaningful description for the attachment.
8 Click Upload File to upload the attachment to the server and associate it with the selected element.

Note: Attaching files to a test plan element may not work in Mozilla Firefox. Firefox requires usage of three slashes (for example: "file:///") for a file link, while other browsers require only two (for example: "file://"). Additionally, Firefox includes a security feature that blocks links from remote files to local files and directories. For more information, see http://kb.mozillazine.org/Firefox_:_Issues_:_Links_to_Local_Pages_Don't_Work

Related Concepts Attachments Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Working with Attachments Managing Test Plans Related Reference Test Plan Attachments tab


Managing a Successful Test


This section explains all of the procedures that you need to know to manage tests with Test Manager.

In This Section

Configuring Test Manager Settings: This section explains how to configure Test Manager.
Managing Requirements: This section explains how to work with requirements in Test Manager.
Managing Test Plans: This section explains how to manage test plans in Test Manager.
Executing Test Definitions: This section explains how to execute test definitions with Test Manager.
Managing Issues: This section explains how to manage issues with SilkCentral Issue Manager.
Managing Projects: This section explains how to manage projects in Test Manager.
Managing Activities: This section explains how to manage upcoming, current, and recently-executed test runs.
Managing Reports: This section explains how to work with reports in Test Manager.
Working with Filters: This section explains how to work with custom filters in Test Manager.
Analyzing Code Coverage: This section explains how to perform code coverage analysis with Test Manager.


Configuring Test Manager Settings


This section explains how to configure Test Manager.

In This Section

Configuring Change Notification: This section explains how to configure change notification.
Configuring Custom Attributes: This section explains how to configure custom attributes.
Configuring Custom Step Properties: This section explains how to configure custom step properties.
Configuring Data Sources for Data-Driven Tests: This section explains how to configure data sources for data-driven tests in SilkCentral Test Manager.
Configuring Global Filters: This section explains how to configure global filters.
Configuring Issue Tracking Profiles: This section explains how to configure issue tracking profiles to integrate Test Manager with external issue tracking systems.
Configuring Source Control Profiles: This section explains how to configure source control profiles to integrate Test Manager with external source control systems.
Configuring Project Settings: How to configure Test Manager project settings.


Configuring Change Notification


This section explains how to configure change notification.

In This Section

Disabling Change Notification: Describes how to deactivate change notification.
Enabling Change Notification: Describes how to enable change notification.


Disabling Change Notification


To deactivate change notification
1 Navigate to Test Manager > Settings.
2 Select the Notifications tab.
3 Click Configure Notification to open the Configure Change Notification dialog box.
4 If you do not want to be notified by email when changes are made to requirements in the currently selected project, uncheck the Changes on Requirements check box.
5 If you do not want to be notified by email when changes are made to test plans in the currently selected project, uncheck the Changes on Test Plan check box.
6 Click OK to save the notification settings; or click Cancel to abort the operation without saving changes.

Related Concepts Change Notification Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Related Reference Notifications Page


Enabling Change Notification


To enable change notification
1 Navigate to Test Manager > Settings.
2 Select the Notifications tab.
3 Click Configure Notification to open the Configure Change Notification dialog box.
4 If you want to be notified by email when changes are made to requirements in the currently selected project, check the Changes on Requirements check box.
5 If you want to be notified by email when changes are made to test plans within the currently selected project, check the Changes on Test Plan check box.
6 Click OK to save the notification settings, or click Cancel to abort the operation without saving changes. You will be notified by email about the changes you have activated.

Related Concepts Change Notification Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Related Reference Notifications Page


Configuring Custom Attributes


This section explains how to configure custom attributes.

In This Section

Creating Custom Attributes: Describes how to create a custom attribute.
Deleting Custom Attributes: Describes how to delete a custom attribute.
Editing Custom Attributes: Describes how to edit a custom attribute.


Creating Custom Attributes


To create a custom attribute
1 Navigate to Test Manager > Settings.
  Note: If you have not already selected a project, a warning message displays, asking you to select a project. Select the project for which you are defining custom attributes.
2 Select the Attributes tab to view the list of current attributes.
3 Click New Attribute. The New Attribute dialog box displays.
4 Enter a Name for the new attribute. This name will be displayed in list boxes when the attribute becomes available for use.
5 Enter a Description of the new attribute.
6 Select the Attribute type.
7 Depending on the attribute type you have selected, you can now continue as follows:
  If you have selected the attribute type Edit, you can now click OK to save the new custom attribute, or click Cancel to abort the operation.
  If you have selected the attribute type Normal or Set, you can define values. To define a new value, click New Value. Enter the value into the Value field on the New Value dialog box and click OK. The new value is then listed in the Value table, where you can edit it by clicking the name of the value; or you can delete it by clicking the Delete icon.
  Click OK to save the new attribute; or click Cancel to abort the operation. You will be returned to the Attributes list; the new attribute is now listed.

Related Concepts Attributes Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Related Reference Attributes tab


Deleting Custom Attributes


To delete a custom attribute
1 Navigate to Test Manager > Settings.
  Note: If you have not already selected a project, a warning message displays, asking you to select a project. Select the project for which you are defining global settings.
2 Select the Attributes tab to view the list of current attributes.
3 Before you can delete an attribute, you must first deactivate it. In the Status column, click the Active link or icon and then click Yes on the confirmation dialog box to deactivate the attribute.
4 Once the attribute is inactive, click the delete icon of the attribute to remove it. A confirmation dialog box displays, asking you to confirm the deletion.
5 Click Yes to remove the selected attribute; or click No to abort the operation. If you select Yes, you will be returned to the Attributes list, where the removed attribute will no longer be displayed.
6 If an error displays, ensure that the selected attribute is not applied to any test definitions or used in any global filters. You can only delete unused attributes.

Related Concepts Attributes Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Related Reference Attributes tab


Editing Custom Attributes


To edit a custom attribute
1 Navigate to Test Manager > Settings.
  Note: If you have not already selected a project, a warning message displays, asking you to select a project. Select the project for which you are defining custom attributes.
2 Select the Attributes tab to view the list of current attributes.
3 Click the name of the attribute that you are editing. The Edit Attribute dialog box displays.
4 You can edit the Name of the attribute. The name will be displayed in list boxes when the attribute is available for use:
  Filters: Attributes can be used in global filters for filtering by test definition attributes (see Global Filters).
  Test Plan unit: Attributes can be applied to test definitions (see Understanding Test Definition Attributes).
5 You can edit the Description of the attribute.
6 Depending on the attribute type, you can continue as follows:
  If the attribute type is Edit, you can now click OK to save the new custom attribute, or click Cancel to abort the operation.
  If the attribute type is Normal or Set, you can add, edit or remove values. To define a new value, click New Value. Enter the value into the Value field on the New Value dialog box and click OK. The new value is now listed in the Value table, where you can edit it by clicking the name of the value; or you can delete it by clicking the delete icon.
7 Once you are satisfied with your attribute settings, click OK to save the changes; or click Cancel to abort the operation.
8 You will be returned to the Attributes list.

Related Concepts Attributes Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Related Reference Attributes tab


Configuring Custom Step Properties


This section explains how to configure custom step properties.

In This Section

Creating Custom Step Properties: Describes how to create a custom step property.
Deleting Custom Step Properties: Describes how to delete a custom step property.
Editing Custom Step Properties: Describes how to edit a custom step property.


Creating Custom Step Properties


To create a new custom step property
1 Navigate to Test Manager > Settings.
2 Select the Step Properties tab.
3 Click New Property to display the New Custom Step Property dialog box.
4 Enter a name for the new property in the Name field.
  Note: Custom step property fields are always declared as type string.
5 Click OK to make your custom property available to all manual test steps in the selected Test Manager project.

Related Concepts Settings Configuration Related Procedures Configuring Projects - Quick Start Task Custom Step Properties Configuring Test Manager Settings Related Reference Step Properties Page


Deleting Custom Step Properties


To delete a previously created custom step property
1 Navigate to Test Manager > Settings.
2 Select the Step Properties tab.
3 Click the delete icon of the custom property you want to delete. A confirmation dialog box displays, asking you to confirm the deletion.
4 Click Yes to complete the operation, or No to abort.

Related Concepts Settings Configuration Related Procedures Configuring Projects - Quick Start Task Custom Step Properties Configuring Test Manager Settings Related Reference Step Properties Page


Editing Custom Step Properties


To edit a previously created custom step property
1 Navigate to Test Manager > Settings.
2 Select the Step Properties tab.
3 Click the name of the custom property that you are editing. The Edit Custom Step Property dialog box opens.
4 Edit the name of the property in the Name field.
5 Click OK to save your changes, or click Cancel to abort the operation without saving.

Related Concepts Settings Configuration Related Procedures Configuring Projects - Quick Start Task Custom Step Properties Configuring Test Manager Settings Related Reference Step Properties Page


Configuring Data Sources for Data-Driven Tests


This section explains how to configure data sources for data-driven tests in SilkCentral Test Manager.

In This Section

Configuring JDBC Data Sources: Describes how to configure a JDBC data source for data-driven tests.
Configuring Microsoft Excel or CSV Data Sources: Describes how to configure a Microsoft Excel or CSV data source for data-driven tests.
Deleting Data Sources: Describes how to delete a data source from SilkCentral.
Downloading Excel Files from a Data Source: Describes how to download an Excel file from a SilkCentral data source.
Synchronizing Data Sources: Describes how to synchronize SilkCentral Test Manager test definitions with an updated data source.
Uploading Updated Excel Files to a Data Source: Describes how to upload an updated Excel file to the data source in SilkCentral.


Configuring JDBC Data Sources


Describes how to configure a JDBC data source for data-driven tests.

To configure a JDBC data source


1 Select the Data Sources tab from Test Manager > Settings in the menu tree. The Data Sources page displays, listing all of the data sources that have been created for the system.
2 Click New Data Source to open the New Data Source dialog box.
3 Specify a Name for the data source.
4 From the Data source type list box, select JDBC.
  Note: If you are setting up an ODBC data source, you need to manually insert your ODBC driver class and URL (for example, Driver class: sun.jdbc.odbc.JdbcOdbcDriver, URL: jdbc:odbc:MyDatabaseName). You must also set up an ODBC data source in MS Windows in the Administrative Tools (please refer to Microsoft Windows Help for more information). If you have your front-end server and your application server on different machines, make sure that the name of your system data source set up in Microsoft Windows is the same as the ODBC data source. These names are case-sensitive.
5 The Driver class field is populated automatically when you select JDBC as the Data source type. In the URL field, replace the host name value (<hostname>) with the name of the computer that is hosting the data source and replace the database name value (<databasename>) with the name of the target database (an illustrative example follows this procedure).
6 In the Username and Password fields, enter valid database credentials.
7 (Optional) If you are working with a database that includes multiple tables, and you want to narrow down the data source to specific tables, you can browse to and select specific tables for inclusion:
  1 Click [...] next to the Table filter field. The Select Table Filter dialog box displays.
  2 Select the tables that you want included as your data source.
  3 Click OK.
8 (Optional) Key column selection is used by test definitions to define which worksheet columns within a data source are used as primary key. This is helpful if your data source will undergo edits (for example, adding or removing rows within a worksheet). Even if your data source is edited, test definitions will still be able to identify which columns/rows should be used. Test definitions created from data-driven data sources use key column values in their names, rather than column numbers. To configure a key column:
  1 Click [...] next to the Key column field. The Select Key Column dialog box displays.
  2 Select a column from the column list that is to act as a key column.
  3 Click OK.
9 Click OK on the New Data Source dialog box.
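For example, assuming a Microsoft SQL Server host named dbserver01 and a database named TestData (both hypothetical values used only for illustration), the completed URL might read jdbc:sqlserver://dbserver01:1433;databaseName=TestData. The exact URL syntax and default port depend on the database vendor and the JDBC driver in use, so adjust such an example accordingly.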


Related Concepts Data Sources for Data-Driven Tests Related Procedures Configuring Microsoft Excel or CSV Data Sources Synchronizing Data Sources Uploading Updated Excel Files to a Data Source Downloading Excel Files from a Data Source Deleting Data Sources Related Reference Data Sources Configuration Page


Configuring Microsoft Excel or CSV Data Sources


Describes how to configure a Microsoft Excel or CSV data source for data-driven tests.
Warning: Worksheets using Excel's password protection cannot be configured as a data source for SilkCentral Test Manager. Turn off a worksheet's password protection to use it as a data source for data-driven testing.
Warning: Values within any cell of an Excel or CSV data source may not exceed 255 characters in length. Additionally, the concatenated length of all primary keys may also not exceed 255 characters, both for Excel/CSV as well as for JDBC data sources.

To configure a Microsoft Excel or CSV data source


1 Select the Data Sources tab from Test Manager > Settings in the menu tree. The Data Sources page displays, listing all of the data sources that have been created for the system.
2 Click New Data Source to open the New Data Source dialog box.
3 Specify a Name for the data source.
4 From the Data source type list box, select MS Excel to configure a Microsoft Excel data source, or select CSV to configure a CSV data source.
5 From the Source control profile list box, select the pre-configured source control profile that hosts your data file. See the related Source Control Profiles topic for detailed information regarding the configuration of source control profiles.
6 Click Browse to open the Select Source Control Path dialog box. Browse to and select a data source file of the selected type in your source control path.
7 MS Excel only: (Optional) If you are working with an Excel spreadsheet that includes multiple worksheets, and you want to narrow down the data source to specific worksheets, you can browse to and select specific worksheets for inclusion. To do this:
  1 Click [...] next to the Worksheet filter field. The Select Worksheet Filter dialog box displays.
  2 Select the worksheets that you want included as your data source.
  3 Click OK.
8 (Optional) Key column selection is used by test definitions to define which worksheet columns within a data source are used as primary key. This is helpful if your data source will undergo edits (for example, adding or removing rows within a worksheet). Even if your data source is edited, test definitions will still be able to identify which columns/rows should be used. Test definitions created from data-driven data sources use key column values in their names, rather than column numbers.
  Note: MS Excel only: If the data source includes multiple worksheets, only columns with identical names are available to be defined as key columns.
  To configure a key column:
  1 Click [...] next to the Key column field. The Select Key Column dialog box displays.
  2 Select a column from the column list that is to act as a key column.
  3 Click OK.


9 Click OK on the New Data Source dialog box.

Related Concepts Data Sources for Data-Driven Tests Related Procedures Configuring JDBC Data Sources Synchronizing Data Sources Uploading Updated Excel Files to a Data Source Downloading Excel Files from a Data Source Deleting Data Sources Related Reference Data Sources Configuration Page


Deleting Data Sources


Describes how to delete a data source from SilkCentral. Note: Data sources that are being used by test definitions can not be deleted.

To delete a data source


1 Select the Data Sources tab from Test Manager > Settings in the menu tree. The Data Sources page displays, listing all of the data sources that have been created for the system.
2 Click the Delete icon in the Actions column that corresponds to your data source. A confirmation dialog box displays.
3 Click Yes to remove the data source, or click No to abort the deletion.

Related Concepts Data Sources for Data-Driven Tests Related Procedures Configuring Microsoft Excel or CSV Data Sources Configuring JDBC Data Sources Synchronizing Data Sources Uploading Updated Excel Files to a Data Source Downloading Excel Files from a Data Source Related Reference Data Sources Configuration Page


Downloading Excel Files from a Data Source


Describes how to download an Excel file from a SilkCentral data source.
Note: Files cannot be downloaded from JDBC and ODBC data sources.

To download an Excel file from a data source


1 Select the Data Sources tab from Test Manager > Settings in the menu tree. The Data Sources page displays, listing all of the data sources that have been created for the system.
2 Click the Download icon in the Actions column that corresponds to your data source. The File Download dialog box displays.
3 Click Open to open the file immediately, or click Save to specify where on your local system you want to save the file.

Related Concepts Data Sources for Data-Driven Tests Related Procedures Configuring Microsoft Excel or CSV Data Sources Configuring JDBC Data Sources Synchronizing Data Sources Uploading Updated Excel Files to a Data Source Deleting Data Sources Related Reference Data Sources Configuration Page


Synchronizing Data Sources


Describes how to synchronize SilkCentral Test Manager test definitions with an updated data source. You must synchronize a data source each time it is changed or updated, if you want to make SilkCentral aware of the changes. Synchronizing a data source propagates recent changes to associated test definitions.

To synchronize an updated data source


1 Select the Data Sources tab from Test Manager > Settings in the menu tree. The Data Sources page displays, listing all of the data sources that have been created for the system.
2 Click the Synchronize icon in the Actions column that corresponds to your data source to propagate the updated file to the associated test definitions. A confirmation dialog box displays, asking you to confirm the synchronization.
3 Click Yes to synchronize all test definitions with the updated data source, or click No to abort the synchronization.
Warning: All running executions that depend on this data source will be aborted. Results of incomplete test definitions within these executions will be lost.
4 Click OK on the success message dialog box.

Related Concepts Data Sources for Data-Driven Tests Related Procedures Configuring Microsoft Excel or CSV Data Sources Configuring JDBC Data Sources Uploading Updated Excel Files to a Data Source Downloading Excel Files from a Data Source Deleting Data Sources Related Reference Data Sources Configuration Page


Uploading Updated Excel Files to a Data Source


Describes how to upload an updated Excel file to the data source in SilkCentral.
Note: Files cannot be uploaded to JDBC and ODBC data sources.

To upload an updated Excel file to a data source


1 Select the Data Sources tab from Test Manager > Settings in the menu tree. The Data Sources page displays, listing all of the data sources that have been created for the system.
2 Click the Upload icon in the Actions column that corresponds to your data source.
3 Click Browse... on the Upload File dialog box.
4 Browse to and select the updated Excel file that is to replace the currently uploaded Excel file. Click Open.
5 Click OK on the Upload File dialog box. A confirmation dialog box displays, asking you to confirm the overwriting of the existing file.
6 Click Yes to continue. After uploading the updated data source file, another dialog box displays, asking you if you want to synchronize the test definitions with the updated data source.
7 Click Yes to synchronize immediately, or click No if you want to synchronize later.
Note: After uploading an updated data source file, you must synchronize the data source so that associated test definitions are updated. See the related Synchronizing Data Sources procedure for details.

Related Concepts Data Sources for Data-Driven Tests Related Procedures Configuring Microsoft Excel or CSV Data Sources Configuring JDBC Data Sources Synchronizing Data Sources Downloading Excel Files from a Data Source Deleting Data Sources Related Reference Data Sources Configuration Page


Configuring Global Filters


This section explains how to configure global filters.
In This Section
Creating Global Filters
Describes how to create a global filter.
Deleting Global Filters
Describes how to delete a global filter.
Editing Global Filters
How to edit a global filter.


Creating Global Filters


To create a global filter
1 Navigate to Test Manager > Settings.
Note: If you have not already selected a project, a warning message will appear, asking you to select a project. Select the project for which you are defining global settings.
2 Select the Filters tab to view the list of available filters.
3 Click New Filter. The New Filter dialog box displays.
4 Enter a Name for the new filter. This name will be displayed in list boxes when the filter becomes available.
5 Select a Category for the new filter from the list box to make the filter available in a specific Test Manager unit:
Requirement Filter: The filter will be available in the Requirements Management unit.
Test Definition Filter: The filter will be available in the Test Plan Management unit.
Execution: The filter will be available in the Test Execution Management unit.
6 Enter a Description of the new filter.
7 Select a category of filter criteria (Selection criteria). The available categories depend on the general filter category you have selected. You can also combine filters by selecting Nested Test Definition Filter or Nested Requirements Filter. Selecting one of these categories allows you to include an existing test definition filter or an existing requirements filter in your new filter.
8 Select a Property, Operator, and Value for the new filter from the respective list boxes.
Property: Available properties depend on the filter category that you have selected in the previous step. The property defines the field for which you are defining a filter setting. If you have selected an attribute category, the property list includes custom attributes to query against.
Operator: Specifies the filter operator. The available operators depend on the property type you have selected. For example, if you have selected a property that is based on a string field type, the available operators are = (equals the defined value), not (differs from the defined value), contains (contains the defined value somewhere in the string), and not contains (does not contain the defined value in the string).
Value: Enter the value that you want to filter out. Depending on the property type that you have selected, values will either be strings that you can enter into the text box, or a selection of predefined values that you can select from the list box.
9 Click More if you want to add more than one filter category to the new filter, and repeat this procedure to define new categories. If you define more than one filter category, you must define whether the categories need to be fulfilled in addition to the existing categories (AND relationship), or if the filter returns true when any of the filter categories are fulfilled (OR relationship). Select either AND or OR to define the filter category relationship. You cannot define nested AND/OR relationships. To remove filter categories, click Fewer. This removes the last filter category.
10 When you are done, click OK to save the new filter, or click Cancel to abort the operation.
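As a hypothetical illustration of how combined filter criteria are evaluated (the property names and values below are examples only, not predefined Test Manager properties), a test definition filter might be built from two criteria:

Criterion 1: Property = Name, Operator = contains, Value = Login
Criterion 2: Property = Owner, Operator = =, Value = jsmith

With AND selected, the filter matches only test definitions whose name contains "Login" and whose owner is jsmith; with OR selected, it matches test definitions that fulfill either criterion.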


Related Concepts Global Filters Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Creating Filters Related Reference Filters tab


Deleting Global Filters


To delete a global filter
1 Navigate to Test Manager > Settings.
Note: If you have not already selected a project, a warning message displays, asking you to select a project. Select the project for which you are defining global settings.
2 Select the Filters tab to view the list of current filters.
3 Click the delete icon of the filter that you want to remove. A confirmation dialog box displays, asking you to confirm the deletion.
4 Click Yes to remove the selected filter, or No to abort the operation. If you select Yes, you will be returned to the filters list; the removed filter will no longer be displayed.

Related Concepts Global Filters Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Deleting Filters Related Reference Filters tab


Editing Global Filters


To edit a global filter:
1 Navigate to Test Manager > Settings.
Note: If you have not selected a project, a warning message will appear, asking you to select a project. Select the project for which you want to define global settings.
2 Select the Filters tab to view the list of current filters.
3 Click the name of the filter you are editing. The Edit Filter dialog box displays.
4 You can edit the following filter properties:
Name of the filter: This name will be displayed in list boxes when the filter is available.
Description of the filter: This provides a meaningful way to identify what the filter does.
Categories for filter criteria: You can change, add, or remove categories for filter criteria. The available categories depend on the general category of filters. You can also combine filters by selecting Nested Test Definition Filter or Nested Requirements Filter in the Selection criteria categories. Selecting either of those categories allows you to include an existing test definition filter or an existing requirements filter in your filter.
5 Select a Property, Operator, and Value for the filter from the respective list boxes.
Property: Available properties depend on the filter category you have selected in the previous step. It defines the property you want to define a filter setting for. If you have selected an attribute category, the property list includes custom attributes to query against. See Attributes for detailed information about defining custom attributes.
Operator: Select the filter operator. The available operators depend on the property type that you have selected. For example, if you have selected a property that is based on a string field type, the available operators are = (equals the defined value), not (differs from the defined value), contains (contains the defined value somewhere in the string), and not contains (does not contain the defined value in the string).
Value: Enter the value that you want to filter out. Depending on the property type that you have selected, values will either be strings that you can enter into the text box, or they will be a selection of predefined values that you can select from the list box.
6 Click More if you want to add more than one filter category to the filter, and proceed by defining new categories. If you define more than one filter category, you must define whether the categories need to be fulfilled in addition to the existing categories (AND relationship), or if the filter returns true when any of the filter categories are fulfilled (OR relationship). Select either AND or OR to define the filter category relationship.
Note: If you define more than two filter categories, selecting AND (or OR) defines the relationship between all categories. You cannot define nested AND/OR relationships. To remove filter categories, click Fewer. This removes the last filter category.


Click OK to save the edited filter definition.

Related Concepts Global Filters Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Editing Filters Related Reference Filters tab


Configuring Issue Tracking Profiles


This section explains how to configure issue tracking profiles to integrate Test Manager with external issue tracking systems.
In This Section
Deleting Issue Tracking Profiles
Describes how to remove existing issue tracking profiles.
Managing SilkCentral Issue Manager Issue Tracking Profiles
This section describes how to configure SilkCentral Issue Manager issue tracking profiles to integrate with Test Manager.
Managing Borland StarTeam Issue Tracking Profiles
This section describes how to configure Borland StarTeam issue tracking profiles to integrate with Test Manager.
Managing Bugzilla Issue Tracking Profiles
This section describes how to configure Bugzilla issue tracking profiles to integrate with Test Manager.
Managing IBM Rational ClearQuest Issue Tracking Profiles
This section describes how to configure IBM Rational ClearQuest issue tracking profiles to integrate with Test Manager.


Deleting Issue Tracking Profiles


Describes how to remove existing issue tracking profiles.

To delete an existing issue tracking profile:


1 Select the Issue Tracking tab from Administration > Settings in the menu tree. The Issue Tracking page displays, listing all of the issue tracking profiles that have been created for the system.
2 Click the Delete icon of the issue tracking profile you want to delete. A confirmation dialog box displays.
3 Click Yes to remove the selected profile. You are returned to the Issue Tracking page.

Related Concepts Issue Tracking Profiles Related Reference Issue Tracking Profiles Page


Managing SilkCentral Issue Manager Issue Tracking Profiles


This section describes how to configure SilkCentral Issue Manager issue tracking profiles to integrate with Test Manager.

In This Section
Adding a new Issue Manager issue tracking profile: Adding SilkCentral Issue Manager Issue Tracking Profiles
Mapping the existing issue states of Issue Manager to the states of Test Manager: Mapping Issue States
Editing an existing Issue Manager issue tracking profile: Editing SilkCentral Issue Manager Issue Tracking Profiles
Deleting an existing Issue Manager issue tracking profile: Deleting Issue Tracking Profiles

Related Concepts Issue Tracking Profiles Related Reference Issue Tracking Profiles Page


Adding SilkCentral Issue Manager Issue Tracking Profiles


Describes how to create SilkCentral Issue Manager issue tracking profiles.

To add a SilkCentral Issue Manager issue tracking profile:


1 Launch the New Issue Tracking Profile dialog box.
2 Select Issue Manager from the Type list box, or select Issue Manager 3.3 to connect to an Issue Manager version 3.3 installation.
3 Type a valid Issue Manager Username and Password. These credentials will be used to access your Issue Manager system.
4 Type the Issue Manager URL of your Issue Manager installation. This is the URL you use to log in to Issue Manager, though without the login extension at the end. For example, if your standard Issue Manager URL is http://IssueManager/login, then the correct service URL is http://IssueManager.
5 If you selected Issue Manager 3.3 from the Type list box, proceed with the next step. If you selected Issue Manager from the Type list box, proceed as follows:
  1 Click Load Projects. This action populates the Project list box with all initialized Issue Manager projects to which the specified user has access. Note that only those projects display for which Issue Manager user groups have been defined, and the defined user is a member of at least one user group.
  2 Select the Project where Issue Manager issues are maintained.
  Warning: Borland recommends not using identical projects for Issue Manager and Test Manager, as this limits flexibility in working with both tools on different future projects.
6 Click OK. Test Manager attempts a trial connection to Issue Manager using the information you have provided.
Note: If an error occurs, review the login credentials and the service URL that you have supplied, or consult your Issue Manager administrator.
7 If the trial connection to Issue Manager is successful, a confirmation dialog box displays, asking you if you want to map internal issue states to the states of the newly defined profile. Click Yes to proceed with the related Mapping Issue States procedure. Click No to map issue states later.


Related Concepts Issue Tracking Profiles Related Procedures Managing SilkCentral Issue Manager Issue Tracking Profiles Mapping Issue States Related Reference Issue Tracking Profiles Page


Mapping Issue States


Describes how to map the existing issue states of an external issue tracking system to the issue states of Test Manager. After defining a new issue tracking profile, you should map the existing issue states of the external issue tracking system to the issue states of Test Manager. Doing this enables Test Manager to list issues correctly when querying internal and external issues.

To map issue states:


1 Click the Issue Tracking tab in Test Manager > Settings. The Issue Tracking page opens, listing all of the issue tracking profiles that have been created for the system.
2 Click the Edit state mapping icon of the issue tracking profile you want to edit. The Edit Status Mapping dialog box opens, listing all existing issue states of the external issue tracking software. These states are listed in the External column. The internal issue states of Test Manager are available in the list boxes in the Internal column.
3 Map internal issue states to corresponding external issue states by selecting the respective entries from the list boxes. Once you have mapped each external issue state to an internal state, click OK to save your settings, or click Cancel to abort the operation.

Related Concepts Issue Tracking Profiles Related Reference Issue Tracking Profiles Page


Editing SilkCentral Issue Manager Issue Tracking Profiles


Describes how to modify existing SilkCentral Issue Manager issue tracking profiles. Tip: When the server or login credentials of your issue tracking system change, you must edit your issue tracking profile accordingly.

To edit an existing Issue Manager issue tracking profile


1 Launch the Edit Issue Tracking Profile dialog box.
2 Edit the Name of the profile. This is the name that will be displayed in issue-tracking profile lists.
3 Edit the Description of the profile.
4 Edit the Issue Manager Username and Password. These credentials are used to access your Issue Manager system.
5 Edit the Issue Manager URL of your Issue Manager installation if the location has changed. This is the URL you use to log in to Issue Manager, though without the login extension at the end. For example, if your standard Issue Manager URL is http://IssueManager/login, then the correct service URL would be http://IssueManager.
6 Click OK. Test Manager attempts a trial connection to SilkCentral Issue Manager using the information you have provided.
Note: If an error occurs, review the login credentials and the service URL that you have supplied, or consult your SilkCentral Issue Manager administrator.

If the trial connection to SilkCentral Issue Manager is successful, you are returned to the Issue Tracking page.

Related Concepts Issue Tracking Profiles Related Procedures Managing SilkCentral Issue Manager Issue Tracking Profiles Mapping Issue States Related Reference Issue Tracking Profiles Page


Deleting Issue Tracking Profiles


Describes how to remove existing issue tracking profiles.

To delete an existing issue tracking profile:


1 Select the Issue Tracking tab from Administration > Settings in the menu tree. The Issue Tracking page displays, listing all of the issue tracking profiles that have been created for the system.
2 Click the Delete icon of the issue tracking profile you want to delete. A confirmation dialog box displays.
3 Click Yes to remove the selected profile. You are returned to the Issue Tracking page.

Related Concepts Issue Tracking Profiles Related Reference Issue Tracking Profiles Page


Managing Borland StarTeam Issue Tracking Profiles


This section describes how to configure Borland StarTeam issue tracking profiles to integrate with Test Manager.

In This Section
Adding a new StarTeam issue tracking profile: Adding Borland StarTeam Issue Tracking Profiles
Mapping the existing issue states of StarTeam to the states of Test Manager: Mapping Issue States
Editing an existing StarTeam issue tracking profile: Editing Borland StarTeam Issue Tracking Profiles
Deleting an existing StarTeam issue tracking profile: Deleting Issue Tracking Profiles

Related Concepts Issue Tracking Profiles Related Reference Issue Tracking Profiles Page


Adding Borland StarTeam Issue Tracking Profiles


Describes how to create Borland StarTeam issue tracking profiles.

To add a Borland StarTeam issue tracking profile:


1 Launch the New Issue Tracking Profile dialog box.
2 Select StarTeam from the Type list box.
3 Type a valid StarTeam Username and Password. These credentials are used to retrieve the status of existing StarTeam change requests and information required for entering new issues.
4 Type the Hostname of your StarTeam server and the Port that is used to connect to the StarTeam server. If this setting has not been changed, use the default port 49201.
5 Specify the type of Encryption that the profile supports.
6 Click Load Project to load all projects from the server and populate the Project list box, then select a project from the Project list box.
7 Click Load View to load all views for the selected project and populate the View list box, then select a view from the View list box.
8 Click Load Status Field to load all enumeration fields for change requests and populate the Status Field list box, then select a status field from the Status Field list box. If you are using a custom workflow in StarTeam, this field is the workflow driver field in StarTeam that maps to the Test Manager issue state.
9 Click OK. Test Manager attempts a trial connection to Borland StarTeam using the information you have provided.
Note: If an error occurs, review the login credentials and other StarTeam information you have supplied, or consult your StarTeam administrator.
10 If the trial connection to StarTeam is successful, a confirmation dialog box displays, asking you if you want to map internal issue states to the states of the newly defined profile. Click Yes to proceed with the related Mapping Issue States procedure. Click No to map issue states later.

Related Concepts Issue Tracking Profiles Related Procedures Managing Borland StarTeam Issue Tracking Profiles Mapping Issue States Related Reference Issue Tracking Profiles Page


Mapping Issue States


Describes how to map the existing issue states of an external issue tracking system to the issue states of Test Manager. After defining a new issue tracking profile, you should map the existing issue states of the external issue tracking system to the issue states of Test Manager. Doing this enables Test Manager to list issues correctly when querying internal and external issues.

To map issue states:


1 Click the Issue Tracking tab in Test Manager > Settings. The Issue Tracking page opens, listing all of the issue tracking profiles that have been created for the system.
2 Click the Edit state mapping icon of the issue tracking profile you want to edit. The Edit Status Mapping dialog box opens, listing all existing issue states of the external issue tracking software. These states are listed in the External column. The internal issue states of Test Manager are available in the list boxes in the Internal column.
3 Map internal issue states to corresponding external issue states by selecting the respective entries from the list boxes. Once you have mapped each external issue state to an internal state, click OK to save your settings, or click Cancel to abort the operation.

Related Concepts Issue Tracking Profiles Related Reference Issue Tracking Profiles Page


Editing Borland StarTeam Issue Tracking Profiles


Describes how to modify existing Borland StarTeam issue tracking profiles. Tip: When the server or login credentials of your issue tracking system change, you must edit your issue tracking profile accordingly.

To edit an existing StarTeam issue tracking profile:


1 Launch the Edit Issue Tracking Profile dialog box.
2 Edit the Name and the Description of the profile.
3 Edit the StarTeam Username and Password. These credentials are used to access your StarTeam system.
4 Edit the Hostname of your StarTeam server and the Port that is used to connect to the StarTeam server.
5 Modify the type of Encryption that the profile supports.
6 To change the StarTeam project, click Load Project to load all projects from the server and update the Project list box, then select a project from the Project list box.
Note: Reload the View list box to display the updated list of views when you change a project.
7 Click Load View to load all views for the selected project and populate the View list box, then select a view from the View list box.
8 To change the workflow driver field, click Load Status Field to load all enumeration fields for change requests and populate the Status Field list box, then select a status field from the Status Field list box. If you are using a custom workflow in StarTeam, this field is the workflow driver field in StarTeam that maps to the Test Manager issue state.
9 Click OK. Test Manager attempts a trial connection to Borland StarTeam using the information you have provided.
Note: If an error occurs, review the login credentials and the StarTeam information that you have supplied, or consult your Borland StarTeam administrator.

If the trial connection to Borland StarTeam is successful, you are returned to the Issue Tracking page.

Related Concepts Issue Tracking Profiles Related Procedures Managing Borland StarTeam Issue Tracking Profiles Mapping Issue States Related Reference Issue Tracking Profiles Page


Deleting Issue Tracking Profiles


Describes how to remove existing issue tracking profiles.

To delete an existing issue tracking profile:


1 Select the Issue Tracking tab from Administration > Settings in the menu tree. The Issue Tracking page displays, listing all of the issue tracking profiles that have been created for the system.
2 Click the Delete icon of the issue tracking profile you want to delete. A confirmation dialog box displays.
3 Click Yes to remove the selected profile. You are returned to the Issue Tracking page.

Related Concepts Issue Tracking Profiles Related Reference Issue Tracking Profiles Page


Managing Bugzilla Issue Tracking Profiles


This section describes how to configure Bugzilla issue tracking profiles to integrate with Test Manager. The Bugzilla plug-in relies on the SilkCentral Java API for integration. The Bugzilla integration has been tested with Bugzilla 2.22. You do not have to modify your Bugzilla installation to enable integration. Test Manager communicates with Bugzilla through the Bugzilla Web GUI by using the HttpClient library from Jakarta Commons. Note: See the sources of com.segue.scc.issuetracking.bugzilla.BugzillaProfile and com.segue.scc.issuetracking.bugzilla.BugzillaIssue to see how these elements fit together.
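Purely as a minimal, hypothetical sketch of the mechanism described above (the Bugzilla form-field names, URL, and credentials are assumptions and are not taken from the plug-in sources), logging in to a Bugzilla 2.x web GUI with the Jakarta Commons HttpClient 3.x library could look like this:

import org.apache.commons.httpclient.HttpClient;
import org.apache.commons.httpclient.methods.PostMethod;

public class BugzillaLoginSketch {
    public static void main(String[] args) throws Exception {
        // Base URL of the Bugzilla installation; as noted in the Adding Bugzilla
        // Issue Tracking Profiles procedure, it must end with a slash.
        String baseUrl = "http://bugzillaserver/cgi-bin/bugzilla/";
        HttpClient client = new HttpClient();

        // Post the login form; the field names are assumed and should be verified
        // against your own Bugzilla installation.
        PostMethod login = new PostMethod(baseUrl + "index.cgi");
        login.addParameter("Bugzilla_login", "qa.user@example.com");
        login.addParameter("Bugzilla_password", "secret");
        try {
            int status = client.executeMethod(login);
            System.out.println("HTTP status: " + status);
            // On success, the HttpClient instance now carries the Bugzilla session
            // cookies and can be reused for further requests against the web GUI.
        } finally {
            login.releaseConnection();
        }
    }
}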

The following tasks are described in this section:


Adding a new Bugzilla issue tracking profile: Adding Bugzilla Issue Tracking Profiles
Mapping the existing issue states of Bugzilla to the states of Test Manager: Mapping Issue States
Editing an existing Bugzilla issue tracking profile: Editing Bugzilla Issue Tracking Profiles
Deleting an existing Bugzilla issue tracking profile: Deleting Issue Tracking Profiles

Related Concepts Issue Tracking Profiles Related Reference Issue Tracking Profiles Page


Adding Bugzilla Issue Tracking Profiles


Describes how to create Bugzilla issue tracking profiles.

To add a Bugzilla issue tracking profile:


1 Launch the New Issue Tracking Profile dialog box.
2 Select Bugzilla from the Type list box.
3 Type a valid Bugzilla Username and Password. These credentials are used to retrieve the status of existing Bugzilla issues and information required for entering new issues.
4 Enter the URL of your Bugzilla installation. For example, http://bugzillaserver/cgi-bin/bugzilla/.
Note: To establish a connection to Bugzilla, the URL must end with a slash (/).
5 Click OK. Test Manager attempts a trial connection to Bugzilla using the information you have provided.
Note: If an error occurs, review the login credentials and other Bugzilla information you have supplied, or consult your Bugzilla administrator.
6 If the trial connection to Bugzilla is successful, a confirmation dialog box displays, asking you if you want to map internal issue states to the states of the newly defined profile. Click Yes to proceed with the related Mapping Issue States procedure. Click No to map issue states later.
Note: Mapping the existing issue states of Bugzilla to the states of Test Manager enables Test Manager to list issues correctly when querying internal and external issues.

Related Concepts Issue Tracking Profiles Related Procedures Managing Bugzilla Issue Tracking Profiles Mapping Issue States Related Reference Issue Tracking Profiles Page


Mapping Issue States


Describes how to map the existing issue states of an external issue tracking system to the issue states of Test Manager. After defining a new issue tracking profile, you should map the existing issue states of the external issue tracking system to the issue states of Test Manager. Doing this enables Test Manager to list issues correctly when querying internal and external issues.

To map issue states:


1 Click the Issue Tracking tab in Test Manager > Settings. The Issue Tracking page opens, listing all of the issue tracking profiles that have been created for the system.
2 Click the Edit state mapping icon of the issue tracking profile you want to edit. The Edit Status Mapping dialog box opens, listing all existing issue states of the external issue tracking software. These states are listed in the External column. The internal issue states of Test Manager are available in the list boxes in the Internal column.
3 Map internal issue states to corresponding external issue states by selecting the respective entries from the list boxes. Once you have mapped each external issue state to an internal state, click OK to save your settings, or click Cancel to abort the operation.

Related Concepts Issue Tracking Profiles Related Reference Issue Tracking Profiles Page


Editing Bugzilla Issue Tracking Profiles


Describes how to modify existing Bugzilla issue tracking profiles. Tip: When the server or login credentials of your issue tracking system change, you must edit your issue tracking profile accordingly.

To edit an existing Bugzilla issue tracking profile:


1 Open the Edit Issue Tracking Profile dialog box.
2 Edit the Name and the Description of the profile.
3 Edit the Bugzilla Username and Password. These credentials are used to access your Bugzilla system.
4 Edit the URL of your Bugzilla installation.
5 Click OK. Test Manager attempts a trial connection to Bugzilla using the information you have provided.
Note: If an error occurs, review the login credentials and the URL that you have supplied, or consult your Bugzilla administrator.

If the trial connection to Bugzilla is successful, you are returned to the Issue Tracking page.

Related Concepts Issue Tracking Profiles Related Procedures Managing Bugzilla Issue Tracking Profiles Mapping Issue States Related Reference Issue Tracking Profiles Page


Deleting Issue Tracking Profiles


Describes how to remove existing issue tracking profiles.

To delete an existing issue tracking profile:


1 Select the Issue Tracking tab from Administration > Settings in the menu tree. The Issue Tracking page displays, listing all of the issue tracking profiles that have been created for the system.
2 Click the Delete icon of the issue tracking profile you want to delete. A confirmation dialog box displays.
3 Click Yes to remove the selected profile. You are returned to the Issue Tracking page.

Related Concepts Issue Tracking Profiles Related Reference Issue Tracking Profiles Page


Managing IBM Rational ClearQuest Issue Tracking Profiles


This section describes how to configure IBM Rational ClearQuest issue tracking profiles to integrate with Test Manager.

In This Section
Adding a new IBM Rational ClearQuest issue tracking profile: Adding IBM Rational ClearQuest Issue Tracking Profiles
Mapping the existing issue states of IBM Rational ClearQuest to the states of Test Manager: Mapping Issue States
Editing an existing IBM Rational ClearQuest issue tracking profile: Editing IBM Rational ClearQuest Issue Tracking Profiles
Deleting an existing IBM Rational ClearQuest issue tracking profile: Deleting Issue Tracking Profiles

Related Concepts Issue Tracking Profiles Related Reference Issue Tracking Profiles Page


Adding IBM Rational ClearQuest Issue Tracking Profiles


Describes how to create IBM Rational ClearQuest issue tracking profiles.

To add an IBM Rational ClearQuest issue tracking profile:


1 Launch the New Issue Tracking Profile dialog box.
2 Select IBM Rational ClearQuest from the Type list box.
3 Type a valid Rational ClearQuest Username and Password. These credentials will be used to access your IBM Rational ClearQuest system.
4 Enter the Repository Info of your Rational ClearQuest installation. This is the database name that is defined in the Rational ClearQuest client software.
5 Specify the Record Type (the issue type of Rational ClearQuest). When entering an issue in Test Manager, Rational ClearQuest will save the issue with the issue type you define in this text box. The default issue type is Defect.
6 Click OK. Test Manager attempts a trial connection to Rational ClearQuest using the information you have provided.
Note: If an error occurs, review the login credentials and the repository info that you have supplied, or consult your Rational ClearQuest administrator.
7 If the trial connection to Rational ClearQuest is successful, a confirmation dialog box displays, asking you if you want to map internal issue states to the states of the newly defined profile. Click Yes to proceed with the related Mapping Issue States procedure. Click No to map issue states later.

Related Concepts Issue Tracking Profiles Related Procedures Managing IBM Rational ClearQuest Issue Tracking Profiles Mapping Issue States Related Reference Issue Tracking Profiles Page


Mapping Issue States


Describes how to map the existing issue states of an external issue tracking system to the issue states of Test Manager. After defining a new issue tracking profile, you should map the existing issue states of the external issue tracking system to the issue states of Test Manager. Doing this enables Test Manager to list issues correctly when querying internal and external issues.

To map issue states:


1 Click the Issue Tracking tab in Test Manager > Settings. The Issue Tracking page opens, listing all of the issue tracking profiles that have been created for the system.
2 Click the Edit state mapping icon of the issue tracking profile you want to edit. The Edit Status Mapping dialog box opens, listing all existing issue states of the external issue tracking software. These states are listed in the External column. The internal issue states of Test Manager are available in the list boxes in the Internal column.
3 Map internal issue states to corresponding external issue states by selecting the respective entries from the list boxes. Once you have mapped each external issue state to an internal state, click OK to save your settings, or click Cancel to abort the operation.

Related Concepts Issue Tracking Profiles Related Reference Issue Tracking Profiles Page


Editing IBM Rational ClearQuest Issue Tracking Profiles


Describes how to modify existing IBM Rational ClearQuest issue tracking profiles. Tip: When the server or login credentials of your issue tracking system change, you must edit your issue tracking profile accordingly.

To edit an existing IBM Rational ClearQuest issue tracking profile:


1 Open the Edit Issue Tracking Profile dialog box.
2 Edit the Name of the profile. This is the name that is displayed in issue-tracking profile lists.
3 Edit the Description of the profile.
4 Edit the Rational ClearQuest Username and Password. These credentials are used to access your IBM Rational ClearQuest system.
5 Edit the Repository Info of your Rational ClearQuest installation. This is the database name that is defined in the Rational ClearQuest client software.
6 Change the Record Type (the issue type of Rational ClearQuest). When entering an issue in Test Manager, Rational ClearQuest saves the issue with the issue type you define in this field.
7 Click OK. Test Manager attempts a trial connection to Rational ClearQuest using the information you have provided.
Note: If an error occurs, review the login credentials and the repository info that you have supplied, or consult your Rational ClearQuest administrator.

If the trial connection to Rational ClearQuest is successful, you are returned to the Issue Tracking page.

Related Concepts Issue Tracking Profiles Related Procedures Managing IBM Rational ClearQuest Issue Tracking Profiles Mapping Issue States Related Reference Issue Tracking Profiles Page


Deleting Issue Tracking Profiles


Describes how to remove existing issue tracking profiles.

To delete an existing issue tracking profile:


1 Select the Issue Tracking tab from Administration > Settings in the menu tree. The Issue Tracking page displays, listing all of the issue tracking profiles that have been created for the system.
2 Click the Delete icon of the issue tracking profile you want to delete. A confirmation dialog box displays.
3 Click Yes to remove the selected profile. You are returned to the Issue Tracking page.

Related Concepts Issue Tracking Profiles Related Reference Issue Tracking Profiles Page


Configuring Source Control Profiles


This section explains how to configure source control profiles to integrate Test Manager with external source control systems.
In This Section
Deleting Source Control Profiles
Describes how to remove existing source control profiles.
Managing Borland StarTeam Source Control Profiles
This section describes how to configure Borland StarTeam source control profiles in SilkCentral.
Managing Serena Version Manager (PVCS) Profiles
This section describes how to configure Serena Version Manager (PVCS) source control profiles in SilkCentral.
Managing CVS Profiles
This section describes how to configure CVS source control profiles in SilkCentral.
Managing Microsoft Visual SourceSafe (MSVSS) Profiles
This section describes how to configure MSVSS source control profiles in SilkCentral.
Managing Subversion Profiles
This section describes how to configure Subversion (SVN) source control profiles in SilkCentral.
Managing UNC Profiles
This section describes how to configure UNC source control profiles in SilkCentral.
Managing VFS Profiles
This section describes how to configure VFS source control profiles in SilkCentral.


Deleting Source Control Profiles


Describes how to remove existing source control profiles.

To remove a source control profile:


1 Choose the Source Control tab from Test Manager > Settings in the menu tree. The Source Control page displays, listing all of the source control profiles that have been created for the system.
2 Click the Delete icon of the source control profile you wish to delete. A confirmation dialog box displays.
3 Click Yes.

You are returned to the Source Control page. Related Concepts Source Control Profiles Related Reference Source Control Profiles Page


Managing Borland StarTeam Source Control Profiles


This section describes how to configure Borland StarTeam source control profiles in SilkCentral.

In This Section
Adding a new StarTeam profile: Adding StarTeam Source Control Profiles
Editing an existing StarTeam profile: Editing StarTeam Source Control Profiles
Deleting a StarTeam profile: Deleting Source Control Profiles

Related Concepts Source Control Profiles Related Reference Source Control Profiles Page


Adding StarTeam Source Control Profiles


Describes how to create StarTeam source control profiles.

To create a StarTeam source control profile:


1 Open the New Source Control Profile dialog box.
2 Select StarTeam from the Source control system list box.
3 Type the Hostname of your StarTeam server.
4 Type the port that is to be used to connect to the StarTeam server. If the port is not changed, use the default port 49201.
5 Type a valid StarTeam Username and Password.
6 Specify if the profile supports Encryption.
7 Type the Working folder to which the Test Manager execution server should copy the source files. The working folder must be a local path.
8 Type the Project path you want this profile to use. Alternative: Click Browse next to the Project path text box to connect to the StarTeam system using the credentials you have entered.
9 Click OK.
Note: If an error occurs, review the repository path and the StarTeam login credentials you have supplied, or contact your StarTeam administrator.
Test Manager attempts a trial connection to StarTeam using the information you have provided. If the trial connection to StarTeam is successful, you are returned to the Source Control page.
Related Concepts Source Control Profiles Related Procedures Managing Borland StarTeam Source Control Profiles Editing StarTeam Source Control Profiles Deleting Source Control Profiles Related Reference Source Control Profiles Page


Editing StarTeam Source Control Profiles


Describes how to modify existing StarTeam source control profiles.

To modify a StarTeam source control profile:


1 Open the New Source Control Profile dialog box.
2 Choose from the following options:
Edit the Name of the profile. This is the name that will be displayed in the Test Manager GUI.
Edit the Hostname of your StarTeam server.
Edit the port that is to be used to connect to the StarTeam server. If the port is not changed, use the default port 49201.
Edit the StarTeam Username and Password.
Specify if the profile supports Encryption.
Edit the Working folder to which the Test Manager execution server is to copy the source files as required. The working folder must be a local path.
Edit the Project path you want this profile to use.
3 Click OK.
Note: If an error occurs, review the repository path and the StarTeam login credentials you have supplied, or contact your StarTeam administrator.
Test Manager attempts a trial connection to StarTeam using the information you have provided. If the trial connection to StarTeam is successful, you are returned to the Source Control page.
Related Concepts Source Control Profiles Related Procedures Managing Borland StarTeam Source Control Profiles Adding StarTeam Source Control Profiles Deleting Source Control Profiles Related Reference Source Control Profiles Page


Deleting Source Control Profiles


Describes how to remove existing source control profiles.

To remove a source control profile:


1 Choose the Source Control tab from Test Manager > Settings in the menu tree. The Source Control page displays, listing all of the source control profiles that have been created for the system.
2 Click the Delete icon of the source control profile you wish to delete. A confirmation dialog box displays.
3 Click Yes.

You are returned to the Source Control page. Related Concepts Source Control Profiles Related Reference Source Control Profiles Page


Managing Serena Version Manager (PVCS) Profiles


This section describes how to configure Serena Version Manager (PVCS) source control profiles in SilkCentral.

In This Section
Adding a new PVCS profile: Adding PVCS Source Control Profiles
Editing an existing PVCS profile: Editing PVCS Source Control Profiles
Deleting a PVCS profile: Deleting Source Control Profiles

Related Concepts Source Control Profiles Related Reference Source Control Profiles Page


Adding PVCS Source Control Profiles


Describes how to create PVCS source control profiles.

To create a PVCS source control profile:


1 Open the New Source Control Profile dialog box.
2 Select PVCS from the Source control system list box.
3 Type the UNC path of the PVCS Repository you want to access. If you do not know the UNC path of the repository, consult your PVCS administrator.
4 Type a valid UNC username and UNC password. These credentials are required to access the UNC path of your repository.
5 Type the Working folder to which the Test Manager execution server should copy the source files. The working folder must be a local path, for example C:\TempSources\.
6 Type the Execution path. This is the local path of the PVCS installation, where the command line tool pcli.exe is located. The default path is C:\Program Files\Serena\vm\win32\bin.
Note: The PVCS client software must be installed on the front-end server and each execution server. PVCS must be installed in identical paths on each machine. For example, if you install PVCS on the Test Manager front-end server to C:\Program Files\Serena\, you must install PVCS in the same path on the execution servers.
7 Type a valid PVCS Username and Password. These credentials will be used to access your PVCS repository.
8 Type the Project path you want this profile to use. Alternative: Click Browse next to the Project path text box to connect to the PVCS system using the credentials you have entered. The Select Project Path dialog box opens. Select the desired project path in the tree view and click OK. Leaving this text box empty sets the project path to the root directory.
9 Click OK.
Note: If an error occurs, review the UNC repository path, the UNC login credentials, the execution path info, and the PVCS login credentials you have supplied, or contact your PVCS administrator.
Test Manager attempts a trial connection to PVCS using the information you have provided. If the trial connection to PVCS is successful, you are returned to the Source Control page.


Related Concepts Source Control Profiles Related Procedures Managing Serena Version Manager (PVCS) Profiles Editing PVCS Source Control Profiles Deleting Source Control Profiles Related Reference Source Control Profiles Page


Editing PVCS Source Control Profiles


Describes how to modify existing PVCS source control profiles.

To modify a PVCS source control profile:


1 Open the New Source Control Profile dialog box.
2 Choose from the following options:
Edit the Name of the profile. This is the name that will be displayed in the Test Manager GUI.
Edit the UNC path of the PVCS Repository. If you do not know the UNC path of the repository, consult your PVCS administrator.
Edit the UNC username and UNC password as required. These credentials are required to access the repository UNC path you specified above.
Edit the Working folder to which the Test Manager execution server should copy the source files. The working folder must be a local path, for example C:\TempSources\.
3 Edit the Execution path. This is the local path of the PVCS installation, where the command line tool pcli.exe is located. The default path is C:\Program Files\Merant\vm\win32\bin.
Note: The PVCS client software must be installed on the front-end server and each execution server. PVCS must be installed in identical paths on each machine. For example, if you install PVCS on the Test Manager front-end server to C:\Program Files\Merant\, you must install PVCS in the same path on the execution servers.
4 Edit the PVCS Username and Password. These credentials will be used to access your PVCS repository.
5 Edit the Project path.
6 Click OK.
Note: If an error occurs, review the UNC repository path, the UNC login credentials, the PVCS login credentials, and the execution path info that you have supplied, or contact your PVCS administrator.
Test Manager attempts a trial connection to PVCS using the information you have provided. If the trial connection to PVCS is successful, you are returned to the Source Control page.
Related Concepts Source Control Profiles Related Procedures Managing Serena Version Manager (PVCS) Profiles Adding PVCS Source Control Profiles Deleting Source Control Profiles Related Reference Source Control Profiles Page


Deleting Source Control Profiles


Describes how to remove existing source control profiles.

To remove a source control profile:


1 Choose the Source Control tab from Test Manager > Settings in the menu tree. The Source Control page displays, listing all of the source control profiles that have been created for the system.
2 Click the Delete icon of the source control profile you wish to delete. A confirmation dialog box displays.
3 Click Yes.

You are returned to the Source Control page. Related Concepts Source Control Profiles Related Reference Source Control Profiles Page


Managing CVS Profiles


This section describes how to configure CVS source control profiles in SilkCentral.

In This Section
Adding a new CVS profile: Adding CVS Source Control Profiles
Editing an existing CVS profile: Editing CVS Source Control Profiles
Deleting a CVS profile: Deleting Source Control Profiles

Related Concepts Source Control Profiles Related Reference Source Control Profiles Page


Adding CVS Source Control Profiles


Describes how to create CVS source control profiles.

To create a CVS source control profile:


1 Open the New Source Control Profile dialog box.
2 Choose CVS from the Source control system list box.
3 Type the CVS server name or IP address in the Hostname text box, and type the port that is to be connected to in the Port text box.
4 Specify the connection method in the Method text box. Currently, the ext, pserver, and local connection methods are supported. This makes the Port setting optional.
5 Specify the URL of the CVS Repository you want to access. For example, /var/lib/cvs. If you do not know the URL of the repository, consult your CVS administrator.
6 Type a valid CVS Username and Password. These credentials will be used to access your CVS repository. Note that these settings are optional when using the ext connection method.
7 Specify the CVS Module that is to be used, then enter the Working folder to which the Test Manager execution server should copy the source files. The working folder must be a local path, for example C:\TempSources\.
8 Optional: Type the Project path that you want this profile to use. Leaving this field empty sets the project path to the root directory. Alternative: Click Browse next to the Project path text box to connect to the CVS system using the credentials you have entered. If the connection is successful, the Select Project Path dialog box opens. Select the desired project path in the tree view and click OK.
9 Click OK.
Note: If an error occurs, review the repository path and the CVS login credentials you have supplied, or contact your CVS administrator.

Test Manager attempts a trial connection to CVS using the information you have provided. If the trial connection to CVS is successful, you are returned to the Source Control page.
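For orientation, the same connection settings expressed as standard CVSROOT strings for the cvs command line client would look roughly as follows; the host, user, and repository names are placeholders:

:pserver:jsmith@cvsserver:2401/var/lib/cvs   (pserver method, port given explicitly)
:ext:jsmith@cvsserver:/var/lib/cvs           (ext method, authentication handled externally, typically via SSH)
:local:/var/lib/cvs                          (local method, repository on the local machine)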

Related Concepts Source Control Profiles Related Procedures Managing CVS Profiles Editing CVS Source Control Profiles Deleting Source Control Profiles Related Reference Source Control Profiles Page


Editing CVS Source Control Profiles


Describes how to modify existing CVS source control profiles.

To modify a CVS source control profile:


1 Open the New Source Control Profile dialog box.
2 Choose from the following options:
Edit the Name of the profile. This is the name that is displayed in the Test Manager GUI.
Edit the CVS server name or IP address in the Hostname text box.
Edit the port that is to be connected to in the Port text box.
Edit the connection method in the Method text box. Currently, the ext, pserver, and local connection methods are supported. This makes the Port setting optional.
Edit the URL of the CVS Repository you want to access. If you do not know the URL of the repository, consult your CVS administrator.
Edit your CVS Username and Password.
Edit the CVS Module that is to be used.
Edit the Working folder to which the Test Manager execution server should copy the source files. The working folder must be a local path, for example C:\TempSources\.
Edit the Project path that you want this profile to use.
Note: The CVS Username and Password are optional when using the ext connection method.
3 Click OK.
Note: If an error occurs, review the repository path and the CVS login credentials you have supplied, or contact your CVS administrator.

Test Manager attempts a trial connection to CVS using the information you have provided. If the trial connection to CVS is successful, you are returned to the Source Control page.

Related Concepts Source Control Profiles Related Procedures Managing CVS Profiles Adding CVS Source Control Profiles Deleting Source Control Profiles Related Reference Source Control Profiles Page


Deleting Source Control Profiles


Describes how to remove existing source control profiles.

To remove a source control profile:


1 Choose the Source Control tab from Test Manager > Settings in the menu tree. The Source Control page displays, listing all of the source control profiles that have been created for the system.
2 Click the Delete icon of the source control profile you wish to delete. A confirmation dialog box displays.
3 Click Yes.

You are returned to the Source Control page. Related Concepts Source Control Profiles Related Reference Source Control Profiles Page


Managing Microsoft Visual SourceSafe (MSVSS) Profiles


This section describes how to configure MSVSS source control profiles in SilkCentral.

In This Section
Adding a new MSVSS profile: Adding MSVSS Source Control Profiles
Editing an existing MSVSS profile: Editing MSVSS Source Control Profiles
Deleting an MSVSS profile: Deleting Source Control Profiles

Related Concepts Source Control Profiles Related Reference Source Control Profiles Page


Adding MSVSS Source Control Profiles


Describes how to create MSVSS source control profiles. Tip: SourceSafe clients must be installed on all front-end, application, and execution servers.

To create an MSVSS source control profile:


1 Open the New Source Control Profile dialog box.
2 Select MSVSS or MSVSS (cmd line) from the Source control system list box. MSVSS (cmd line) utilizes the MSVSS command line plug-in, which works exactly like MSVSS, except that SilkCentral users are automatically logged out of MSVSS when the user logs out from SilkCentral. When selecting MSVSS, SilkCentral users remain logged in to MSVSS for an indefinite time.
3 If you selected MSVSS (cmd line), specify the location of the SourceSafe executable ss.exe. SourceSafe must be installed identically on all execution servers and the front-end server. This allows you to specify a definite path, for example C:\Program Files\Microsoft Visual Studio\VSS\win32\ss.exe. If SourceSafe is installed in different locations, proceed as explained in the sub-task To configure the location of a SourceSafe client below.
4 In the SourceSafe database (srcsafe.ini) text box, type the UNC path and file name of the SourceSafe configuration file you want to access. Alternative: Click Browse to locate the SourceSafe configuration file.
Note: SourceSafe configuration files use the name srcsafe.ini.
5 Type a valid UNC username and UNC password. These credentials are required to access the UNC path of the configuration file.
6 Type the Working folder to which the Test Manager execution server should copy the source files. The working folder must be a local path, for example C:\TempSources\.
7 Type a valid SourceSafe Username and Password. These credentials will be used to access your MSVSS database.
8 Type the Project path you want this profile to use. Alternative: Click Browse next to the Project path text box to connect to the MSVSS system using the credentials you have entered.
9 Click OK.
Note: If an error occurs, review the UNC path of the configuration file, the UNC login credentials, and the MSVSS login credentials you have supplied, or contact your MSVSS administrator.
Test Manager attempts a trial connection to MSVSS using the information you have provided. If the trial connection to MSVSS is successful, you are returned to the Source Control page.

To configure the location of a SourceSafe client:


1 In the SourceSafe executable text box, type ss.exe without any path information.
2 On each execution server and on the front-end server, add the local path of the SourceSafe executable ss.exe to the Windows system path. To do this, click Start > Settings > Control Panel > System.
3 The System Properties dialog box displays. Select the Advanced tab and click Environment Variables.
4 The Environment Variables dialog box displays. Select the Path variable in the System variables section and click Edit.
5 Add the local path of the SourceSafe executable to the list of existing Variable values. You can append a new variable value to existing values by entering a semicolon (;) followed by the path information. An example is shown after this procedure.
6 Repeat this procedure for each execution server and for the front-end server.
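A minimal sketch of the resulting Path value, assuming SourceSafe is installed under C:\Program Files\Microsoft Visual Studio\VSS and the variable previously contained a single system entry (both paths are placeholders for your own environment):

C:\Windows\system32;C:\Program Files\Microsoft Visual Studio\VSS\win32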

Related Concepts Source Control Profiles Related Procedures Managing Microsoft Visual SourceSafe (MSVSS) Profiles Editing MSVSS Source Control Profiles Deleting Source Control Profiles Related Reference Source Control Profiles Page


Editing MSVSS Source Control Profiles


Describes how to modify existing MSVSS source control profiles.

To modify an MSVSS source control profile:


1 Open the New Source Control Profile dialog box.
2 Edit the Name of the profile. This is the name that will be displayed in the Test Manager GUI.
3 If your selected Source control system is MSVSS (cmd line), you can change the location of the SourceSafe executable ss.exe. Borland recommends installing SourceSafe identically on all execution servers and the front-end server. This enables you to specify a definite path. For example, C:\Program Files\Microsoft Visual Studio\VSS\win32\ss.exe. If SourceSafe is installed in different locations, proceed as explained in the subtask To configure the location of a SourceSafe client in the related procedure Adding MSVSS Source Control Profiles.
4 In the SourceSafe database (srcsafe.ini) text box, edit the UNC path and file name of the SourceSafe configuration file, or click Browse to locate the file. If you do not know the location of the configuration file, consult your SourceSafe administrator. Note: SourceSafe configuration files use the name srcsafe.ini.
5 Edit the UNC username and UNC password. These credentials are required to access your configuration file's UNC path.
6 Edit the Working folder to which the Test Manager execution server should copy the source files. The working folder must be a local path. For example, C:\TempSources\.
7 Edit the Username and Password. These credentials will be used to access your MSVSS database.
8 Edit the Project path you want this profile to use. Alternative: Click Browse next to the Project path text box to connect to the MSVSS system that uses the credentials you have entered.
9 Click OK.

Note: If an error occurs, review the path to the UNC configuration file, the UNC login credentials, and the MSVSS login credentials that you have supplied. Or contact your MSVSS administrator. Test Manager attempts a trial connection to MSVSS using the information you have provided. If the trial connection to MSVSS is successful, you are returned to the Source Control page. Related Concepts Source Control Profiles Related Procedures Managing Microsoft Visual SourceSafe (MSVSS) Profiles Adding MSVSS Source Control Profiles Deleting Source Control Profiles Related Reference Source Control Profiles Page


Deleting Source Control Profiles


Describes how to remove existing source control profiles.

To remove a source control profile:


1 Choose the Source Control tab from Test Manager > Settings in the menu tree. The Source Control page displays, listing all of the source control profiles that have been created for the system.
2 Click the Delete icon of the source control profile you wish to delete. A confirmation dialog box displays.
3 Click Yes.

You are returned to the Source Control page. Related Concepts Source Control Profiles Related Reference Source Control Profiles Page


Managing Subversion Profiles


This section describes how to configure SVN source control profiles in SilkCentral.

In This Section
1

Adding a new SVN profile: Adding Subversion Source Control Profiles Editing an existing SVN profile: Editing Subversion Source Control Profiles Deleting an SVN profile: Deleting Source Control Profiles

Related Concepts Source Control Profiles Related Reference Source Control Profiles Page


Adding Subversion Source Control Profiles


Describes how to create Subversion source control profiles.

To create a Subversion source control profile:


1 Open the New Source Control Profile dialog box.
2 Choose Subversion from the Source control system list box.
3 Type the URL of the Subversion Repository you want to access. If you do not know the URL of the repository, consult your Subversion administrator.
4 Type a valid Subversion Username and Password. These credentials will be used to access your Subversion repository.
5 Type the Working folder to which the Test Manager execution server should copy the source files. The working folder must be a local path. For example, C:\TempSources\.
6 Optional: Type the Project path that you want this profile to use. Leaving this text box empty sets the project path to the root directory. Alternative: Click Browse next to the Project path text box to connect to the Subversion system that uses the credentials you have entered. If the connection is successful, the Select Project Path dialog box opens. Select the desired project path in the tree view and click OK.
7 Click OK. Note: If an error occurs, review the repository path and the Subversion login credentials you have supplied, or contact your Subversion administrator.

Test Manager attempts a trial connection to Subversion using the information you have provided. If the trial connection to Subversion is successful, you are returned to the Source Control page.
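If the trial connection fails and you want to rule out the repository URL or credentials, one way to check them outside Test Manager, assuming the Subversion command-line client is available, is a simple listing such as the following (server name, repository path, and user name are placeholders):

svn list https://svnserver.example.com/svn/MyProject/trunk --username jdoe

If this command prompts for a password and then lists the repository content, the URL and credentials entered in the profile should work as well.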

Related Concepts Source Control Profiles Related Procedures Managing Subversion Profiles Editing Subversion Source Control Profiles Deleting Source Control Profiles Related Reference Source Control Profiles Page


Editing Subversion Source Control Profiles


Describes how to modify existing Subversion source control profiles.

To modify a Subversion source control profile:


1 Launch the New Source Control Profile dialog box.
2 Choose from the following options:
Edit the Name of the profile. This is the name that will be displayed in the Test Manager GUI.
Edit the URL of the Subversion repository you want to access. If you do not know the URL of the repository, consult your Subversion administrator.
Edit your Subversion Username and Password.
Edit the Working folder to which the Test Manager execution server should copy the source files. The working folder must be a local path. For example, C:\TempSources\.
Edit the Project path you want this profile to use.
Note: If you cannot check out files after editing the URL of the Subversion Repository, delete the source control mirrors directory on your execution server. For example, C:\Documents and Settings\All Users\Application Data\Borland\SCC35\SrcCtrlMirrors.
3 Click OK. Note: If an error occurs, review the repository path and the Subversion login credentials you have supplied.

Test Manager attempts a trial connection to Subversion using the information you have provided. If the trial connection to Subversion is successful, you are returned to the Source Control page, where the new profile is listed.

Related Concepts Source Control Profiles Related Procedures Managing Subversion Profiles Adding Subversion Source Control Profiles Deleting Source Control Profiles Related Reference Source Control Profiles Page


Deleting Source Control Profiles


Describes how to remove existing source control profiles.

To remove a source control profile:


1 Choose the Source Control tab from Test Manager > Settings in the menu tree. The Source Control page displays, listing all of the source control profiles that have been created for the system.
2 Click the Delete icon of the source control profile you wish to delete. A confirmation dialog box displays.
3 Click Yes.

You are returned to the Source Control page. Related Concepts Source Control Profiles Related Reference Source Control Profiles Page


Managing UNC Profiles


This section describes how to configure UNC source control profiles in SilkCentral.

In This Section
1

Adding a new UNC profile: Adding UNC Source Control Profiles Editing an existing UNC profile: Editing UNC Source Control Profiles Deleting a UNC profile: Deleting Source Control Profiles

Related Concepts Source Control Profiles Related Reference Source Control Profiles Page


Adding UNC Source Control Profiles


Describes how to create UNC source control profiles.

To create a UNC source control profile:


1 Open the New Source Control Profile dialog box.
2 Select UNC from the Source control system list box.
3 Type the UNC path that you want to access. This is the path to the location where your test definition sources are located. An example is shown below.
4 Type the Working folder to which the Test Manager execution server should copy the source files. The working folder must be a local path. For example, C:\TempSources\.
5 Type a valid UNC Username and Password. These credentials will be used to access your UNC repository.
6 Click OK.
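Purely as an illustration of the UNC path format expected in step 3, with server, share, and folder names as placeholders:

\\fileserver\testsources\MyProject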

Note: If an error occurs, review the repository path and the UNC login credentials you have supplied. Or contact your UNC administrator. Test Manager attempts a trial connection to UNC using the information you have provided. If the trial connection to UNC is successful, you are returned to the Source Control page. Related Concepts Source Control Profiles Related Procedures Managing UNC Profiles Editing UNC Source Control Profiles Deleting Source Control Profiles Related Reference Source Control Profiles Page


Editing UNC Source Control Profiles


Describes how to modify existing UNC source control profiles.

To modify a UNC source control profile:


1 Launch the New Source Control Profile dialog box.
2 Choose from the following options:
Edit the Name of the profile. This is the name that will be displayed in the Test Manager GUI.
Edit the UNC path. This is the path to where your test definition sources are located.
Edit the Working folder to which the Test Manager execution server should copy the source files. The working folder must be a local path. For example, C:\TempSources\.
Edit the UNC Username and Password. These credentials are required to access your UNC repository.
3 Click OK.

Note: If an error occurs, review the repository path and the UNC login credentials you have supplied. Test Manager attempts a trial connection to UNC using the information you have provided. If the trial connection to UNC is successful, you are returned to the Source Control page, where the new profile is listed. Related Concepts Source Control Profiles Related Procedures Managing UNC Profiles Adding UNC Source Control Profiles Deleting Source Control Profiles Related Reference Source Control Profiles Page


Deleting Source Control Profiles


Describes how to remove existing source control profiles.

To remove a source control profile:


1 Choose the Source Control tab from Test Manager > Settings in the menu tree. The Source Control page displays, listing all of the source control profiles that have been created for the system.
2 Click the Delete icon of the source control profile you wish to delete. A confirmation dialog box displays.
3 Click Yes.

You are returned to the Source Control page. Related Concepts Source Control Profiles Related Reference Source Control Profiles Page


Managing VFS Profiles


This section describes how to configure VFS source control profiles in SilkCentral.

In This Section
1

Adding a new VFS profile: Adding VFS Source Control Profiles Editing an existing VFS profile: Editing VFS Source Control Profiles Deleting a VFS profile: Deleting Source Control Profiles

Related Concepts Source Control Profiles Related Reference Source Control Profiles Page


Adding VFS Source Control Profiles


Describes how to create VFS source control profiles.

To create a VFS source control profile:


1 Open the New Source Control Profile dialog box.
2 Select VFS from the Source control system list box.
3 Type the URL of the VFS Repository you want to access. Specify the appropriate protocol type in the URL:
FTP - ftp://<ftp server URL>
HTTP - http://<http server URL>
SMB - smb://<Samba server URL>
Note: HTTP, FTP, and SMB are also supported for zipped files. In order to point to a zipped file, the URL must be adjusted to <zipped file type>:<protocol>://<server URL pointing to zipped file> to include the type of the zipped file. For example, zip:http://193.80.200.135/<path>/archive.zip or jar:http://193.80.200.135/<path>/archive.jar.
4 Type a valid VFS Username and Password. These credentials will be used to access your VFS repository. The SMB protocol allows including the domain name in the username in the following form: domain/username.
5 Enter the Working folder to which the Test Manager execution server should copy the source files. The working folder must be a local path. For example, C:\TempSources\.
6 Optional: Type the Project path that you want this profile to use. Leaving this text box empty sets the project path to the root directory. Alternative: Click Browse next to the Project path text box to connect to the VFS system that uses the credentials you have entered. If the connection is successful, the Select Project Path dialog box opens. Select the desired project path in the tree view and click OK.
7 Click OK.

Note: If an error occurs, review the repository path and the VFS login credentials you have supplied. Or contact your VFS administrator. Test Manager attempts a trial connection to VFS using the information you have provided. If the trial connection to VFS is successful, you are returned to the Source Control page.
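To make the URL and username rules in steps 3 and 4 concrete, here is a sketch of one possible SMB configuration; the server, share, domain, file, and user names are placeholders, not values Test Manager requires:

URL: zip:smb://fileserver/testshare/sources/archive.zip
Username: MYDOMAIN/jdoe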


Related Concepts Source Control Profiles Related Procedures Managing VFS Profiles Editing VFS Source Control Profiles Deleting Source Control Profiles Related Reference Source Control Profiles Page


Editing VFS Source Control Profiles


Describes how to modify existing VFS source control profiles.

To modify a VFS source control profile:


1 Launch the New Source Control Profile dialog box.
2 Choose from the following options:
Edit the Name of the profile. This is the name that will be displayed in the Test Manager GUI.
Edit the URL of the VFS Repository you want to access.
Edit the VFS Username and Password. These credentials will be used to access your VFS repository.
Edit the Working folder to which the Test Manager execution server should copy the source files. The working folder must be a local path. For example, C:\TempSources\.
Edit the Project path you want this profile to use.
3 Click OK.

Note: If an error occurs, review the repository path and the VFS login credentials you have supplied. Or contact your VFS administrator. Test Manager attempts a trial connection to VFS using the information you have provided. If the trial connection to VFS is successful, you are returned to the Source Control page. Related Concepts Source Control Profiles Related Procedures Managing VFS Profiles Adding VFS Source Control Profiles Deleting Source Control Profiles Related Reference Source Control Profiles Page


Deleting Source Control Profiles


Describes how to remove existing source control profiles.

To remove a source control profile:


1 Choose the Source Control tab from Test Manager > Settings in the menu tree. The Source Control page displays, listing all of the source control profiles that have been created for the system.
2 Click the Delete icon of the source control profile you wish to delete. A confirmation dialog box displays.
3 Click Yes.

You are returned to the Source Control page. Related Concepts Source Control Profiles Related Reference Source Control Profiles Page


Configuring Project Settings


To customize project settings:
1 Navigate to Test Manager > Settings. Note: If you have not already selected a project, a warning message will appear, asking you to select a project. Select the project for which you want to define global settings.
2 Select the Project Settings tab to view the current settings. The Project Settings page displays the current project settings.
3 Click Edit to modify the current project settings. The Edit Project Settings dialog box displays. You can specify the following information:
Build Information File Name - Build information files contain project information, including build number, build log location, error log location, and build location. Enter the name of your project's build information file in this field. All test executions will read the build information from this specified file.
Project Release Date - Enter your project's planned release date in the format MM/DD/YYYY.
File Extensions to ignore in Results - Specify result file types or other file types that should not be saved as results for test executions. Note: File extensions must be separated by commas (for example, xlg, *_, res). Changes made in the Build Information File Name and File Extensions to ignore in Results fields will not affect scheduled test definitions. To redistribute tasks to execution servers, you must reschedule test definitions, or disconnect from and reconnect to the database.
4 Click Save to save your project settings.

Related Concepts Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Related Reference Settings Unit Interface


Managing Requirements
This section explains how to work with requirements in Test Manager. In This Section Creating Requirements This section explains how to create requirements with Test Manager. Customizing Requirement Properties This section explains how to customize requirement properties with Test Manager. Integrating External RM Tools This section explains how to integrate an external requirements management tool with Test Manager. Collapsing or Expanding the Requirements tree Describes how to consolidate and display levels of the hierarchy based on your viewing needs. Switching Between Full and Direct Coverage Modes Describes how to switch between full and direct coverage modes.


Creating Requirements
This section explains how to create requirements with Test Manager. In This Section Managing Requirement Attachments This section explains how to manage requirement attachments with Test Manager. Configuring Requirement Types Describes how to configure a requirement type. Creating Requirements Describes how to create requirements directly in Test Manager. Assigning Test Definitions from Grid View to Requirements Describes how to assign test definitions from Grid View to requirements. Assigning Test Definitions to Requirements Manually Describes how to manually assign test definitions to requirements. Creating Child Requirements Describes how to create child requirements. Editing Requirements Describes how to edit requirements. Finding Requirement Properties How to find requirement properties. Generating Test Plans from Requirements View How to generate a new test plan from Requirements View. Locating Assigned Test Definitions in the Test Plan Tree How to locate assigned test definitions in the Test Plan tree. Marking Requirements as Obsolete How to make a requirement as obsolete, rather than deleting it. Removing Test Definition Assignments How to remove a test-definition assignment from a requirement. Replacing Requirement Properties How to replace a requirement property. Sorting the Assigned Test Definitions Tab How to sort test definitions on the Assigned Test Definitions tab. Tracking the History of a Requirement How to track the history of a requirement.


Managing Requirement Attachments


This section explains how to manage requirement attachments with Test Manager. In This Section Attaching a File to a Requirement Describes how to attach a file to a requirement. Attaching a Link to a Requirement Describes how to attach a link to a requirement. Deleting a Requirement Attachment Describes how to delete a requirement attachment. Editing a Requirement Attachment Description Describes how to edit a requirement attachment description. Viewing a Requirement Attachment Describes how to view a requirement's attachment.


Attaching a File to a Requirement


To attach a file to a requirement
1 Click Requirements on the workflow bar.
2 Select a requirement in the Requirement tree view.
3 Select the Attachments tab. When requirements management integration has been enabled between a Test Manager project and a CaliberRM project, the Attachments tab (Requirements Attachments) includes an Open CaliberRM button, which enables you to manage requirement attachments directly in CaliberRM.
4 Click Upload File to open the Upload File dialog box.
5 Using Browse, select a file from your local file system.
6 Enter a meaningful Description for the attachment.
7 Click OK to upload the attachment to the server and associate it with the selected requirement.

Note: Attaching files to a test plan element may not work in Mozilla Firefox. Firefox requires usage of three slashes (for example: "file:///") for a file link, while other browsers require only two (for example: "file://"). Additionally, Firefox includes a security feature blocking links from remote files to local files and directories. For more information, see http://kb.mozillazine.org/Firefox_:_Issues_:_Links_to_Local_Pages_Don't_Work


Attaching a Link to a Requirement


To attach a link to a requirement
1 Click Requirements on the workflow bar.
2 Select a requirement in the requirement tree view.
3 Select the Attachments tab. When requirements management integration has been enabled between a Test Manager project and a CaliberRM project, the Attachments tab (Requirements Attachments) includes an Open CaliberRM button, which enables you to manage requirement attachments directly in CaliberRM.
4 Click Attach Link to open the Attach Link dialog box.
5 Enter a URL in the Name field.
6 Enter a meaningful description for the attached link.
7 Click OK to associate the link with the selected requirement.

Related Concepts Attachments Requirements Management Related Procedures Managing Requirements - Quick Start Task Managing Requirements Related Reference Requirement Attachments tab


Deleting a Requirement Attachment


To delete a requirement attachment:
1 Click Requirements on the workflow bar.
2 Select the requirement in the Requirement tree view for which you want to delete an attachment.
3 Select the Attachments tab to see a list of all attachments that are associated with the requirement. When requirements management integration has been enabled between a Test Manager project and a CaliberRM project, the Attachments tab (Requirements Attachments) includes an Open CaliberRM button, which enables you to manage requirement attachments directly in CaliberRM.
4 Click the Delete icon of the attachment you want to delete.
5 Click Yes on the confirmation dialog to delete the attachment from the project. Note: Only one attachment at a time can be deleted.

Related Concepts Attachments Requirements Management Related Procedures Managing Requirements - Quick Start Task Managing Requirements Related Reference Requirement Attachments tab


Editing a Requirement Attachment Description


To edit a requirement attachment description
1 Click Requirements on the workflow bar.
2 Select the requirement in the Requirement tree view for which you want to edit a requirement attachment description.
3 Select the Attachments tab to see the list of attachments that are associated with the requirement. When requirements management integration has been enabled between a Test Manager project and a CaliberRM project, the Attachments tab (Requirements Attachments) includes an Open CaliberRM button, which enables you to manage requirement attachments directly in CaliberRM.
4 Select the attachment for which you want to edit the description and click Edit.
5 Edit the description on the Edit File Attachment dialog box.
6 Click OK to save your changes.

Related Concepts Attachments Requirements Management Related Procedures Managing Requirements - Quick Start Task Managing Requirements Related Reference Requirement Attachments tab


Viewing a Requirement Attachment


To view a requirement attachment
1 Click Requirements on the workflow bar.
2 From the Requirements tree view, select the requirement for which you want to view an attachment.
3 Select the Attachments tab to see a list of all attachments that are associated with the requirement.
4 Each attachment name serves as a link. File-attachment links open Save As dialog boxes, enabling you to download attachments to your local file system. Link-attachments link directly to link destinations in newly spawned browser windows. Note: When requirements management integration has been enabled between a Test Manager project and a CaliberRM project, the Attachments tab (Requirements Attachments) includes an Open CaliberRM button, which enables you to manage requirement attachments directly in CaliberRM.

Related Concepts Attachments Requirements Management Related Procedures Managing Requirements - Quick Start Task Managing Requirements Related Reference Requirement Attachments tab


Configuring Requirement Types


To configure a requirement type:

1 Click Requirements on the workflow bar. Note: Configuration of requirement type for CaliberRM, Requisite Pro, and DOORS is only enabled for top-level requirements in the tree (requirements that are a direct child of the project node). All other requirements share the requirement type of their parents. A requirement without a configured requirement type is not available for upload. Import of requirements automatically configures the appropriate requirement type.
2 From Requirements View, at the requirement level, select the Properties tab.
3 Click Map Requirement to select a requirement type from the list. Requirement type is a categorization used by CaliberRM, Requisite Pro, and DOORS and is required for synchronization. Note: Map Requirement is only enabled when external requirements integration is enabled in the Settings unit (Integrations Configuration tab) and if the requirement has not yet been uploaded to the external requirements management tool. Additionally, the option Enable upload of requirements to... must be enabled.
4 Click OK to save your settings and close the dialog box.

Related Concepts Requirements Management Related Procedures Managing Requirements - Quick Start Task Test Coverage Status Managing Requirements Related Reference Requirement Properties tab


Creating Requirements
Test Manager allows you to create new requirements, edit and delete existing requirements, and add custom property fields to requirements. Newly created Test Manager projects do not contain requirements.

To create a new requirement:

1 Navigate to Test Manager > Requirements.
2 Click New Requirement on the toolbar. Note: If the project you are working with does not yet have any requirements associated with it, click the <Click here to add Requirements> link in the Requirements tree to open the New Requirement dialog box.
3 On the New Requirement dialog box, enter a meaningful Name and Description for the requirement. Note: Test Manager supports HTML formatting and cutting and pasting of HTML content for description fields.
4 Select the appropriate Priority, Risk, and Reviewed status from the list boxes.
5 If custom requirement properties have been defined, enter in the Custom Property text box any custom property data that you want tracked with this requirement. Note: The Priority, Risk, Reviewed, and any Custom Property fields will be configured automatically with the corresponding properties of the parent requirement if you check the Inherit from parent check boxes for these properties.
6 Click OK to create a new top-level requirement. Note: Alternatively, you can click OK and New Requirement to both save the newly created requirement and open the New Requirement dialog box to create an additional top-level requirement. Or, you can click OK and New Child Requirement to have the New Child Requirement dialog box open after the new top-level requirement is created.

Related Concepts Requirements Management Related Procedures Managing Requirements - Quick Start Task Creating Child Requirements Managing Requirements Related Reference Requirements Unit Interface HTML Support for Description Text Boxes


Assigning Test Definitions from Grid View to Requirements


The test definitions that are assigned to the selected requirement are listed on the Assigned Test Definitions tab (Requirements View only). The properties of the assigned test definitions that are listed in tabular view include:

Test Definition Name
Test Definition Status
Last Execution of the test definition

To assign one or more test definitions from the test plan Grid View to one or more requirements:
1 Click Test Plan on the workflow bar.
2 Click Grid View on the toolbar.
3 Select the test definitions you want to assign to requirements. You can use your keyboard's Ctrl and Shift keys to select multiple test definitions using standard browser multi-select functions.
4 Right-click the selected test definitions and choose Save Selection.
5 Click Requirements on the workflow bar.
6 Select the requirement to which you want to insert the selected test definitions.
7 Choose Assigned Test Definitions.
8 Click Assign Saved Selection. Note: Only test definitions that reside in the requirements test container are assigned. You can assign the selected test definitions to more than one requirement. You cannot assign them into requirements in a different project. The selection persists until you make a different selection or close Test Manager.

Related Concepts Requirements Management Related Procedures Managing Requirements - Quick Start Task Creating Requirements Assigning Test Definitions to Requirements Manually Related Reference Assigned Test Definitions tab


Assigning Test Definitions to Requirements Manually


To manually assign test definitions to requirements
1 Click Requirements on the workflow bar.
2 In the Requirements tree view, select the requirement to which you want to assign test definitions.
3 In Requirements View, select the Assigned Test Definitions tab. Note: The Available Test Definitions window can be expanded/collapsed by clicking the black triangular button on the window splitter (the left-hand edge of the window).
4 Click the arrow of any test definition you want to assign to the currently selected requirement. Clicking the arrow of a test container or test folder assigns the test definitions that are located in those containers or folders to the selected requirement (test definitions that are located within sub-folders of those containers and folders are also assigned).

Related Concepts Test Coverage Status Requirements Management Related Procedures Managing Requirements - Quick Start Task Managing Requirements Assigning Test Definitions from Grid View to Requirements Related Reference Assigned Test Definitions tab


Creating Child Requirements


Test Manager's hierarchical Requirements tree allows you to create both top-level requirements and child requirements, which are sub-requirements of higher-level requirements. Upon creation, you have the option of specifying that child requirements inherit the properties of their parent requirements.

To create a child requirement


1 Click Requirements on the workflow bar.
2 In the tree view, select the requirement under which you would like to create a child requirement.
3 Select New Child Requirement to open the New Child Requirement dialog box.
4 For each of the available requirement property fields, you have the option of having the child requirement inherit its values from its parent. By default, all Inherit from parent check boxes are checked, and so all parent traits are inherited by default. To specify a property value other than that held by its parent, uncheck the corresponding Inherit from parent check box to unlock that property's list box or edit field. Then select the specific value that the child requirement is to have. In Document View, asterisks (*) are placed next to requirement properties for which values have been inherited from parent requirements. Note: Child requirements can be created at any level in the tree view hierarchy other than the top level. There is virtually no limit to the number of child requirements that can be inserted at a single hierarchy level.

Related Concepts Requirements Management Related Procedures Managing Requirements - Quick Start Task Creating Requirements Managing Requirements Related Reference Requirements Unit Interface


Editing Requirements
To edit requirement properties
1 Click Requirements on the workflow bar.
2 Select a requirement in the Requirements tree. The properties of the selected requirement are displayed on the Properties tab.
3 Click Edit Properties on the Properties tab to open the Edit Requirement dialog box. Note: The Edit Requirement dialog box can also be accessed through Edit on the toolbar and by right-clicking a requirement in the Requirements tree and selecting Edit.
4 Edit the values displayed on the Edit Requirements dialog box as required. The default behavior is to inherit values from the parent requirement. Uncheck Inherited from parent check boxes to disable value inheritance.
5 Click OK to save your changes. Note: For details regarding creating, editing, and deleting custom requirement properties, see Custom Requirement Properties.

Related Concepts Requirements Management Related Procedures Managing Requirements - Quick Start Task Managing Requirements Related Reference Requirements Unit Interface


Finding Requirement Properties


The Requirements unit's Find command enables you to locate requirements that meet certain search criteria. The Replace command enables you to replace identified property data with alternate data that you specify. Both commands offer Find Next and Find Previous functions that allow you to step through all identified properties.

To find a requirement:
1 Click Requirements on the workflow bar.
2 Select Find on the toolbar to open the Find dialog box. Note: This command can also be executed by right-clicking a requirement and selecting Find.
3 From the Find in list box, select a requirement property to be searched. This list is automatically populated with all standard property fields and any custom property fields you may have created.
4 Define your search criteria in the Find what portion of the dialog box. Note: The Find what portion of the dialog box offers fields and list boxes with pre-populated values that are based on the property you select in the Find in list box above. The UI controls available in the Find what portion of the dialog box also vary based on the property type selected in the Find in list box. For example, selecting a custom date-type property enables one or more date fields. To specify an exact date, select exactly from the list box, then click the calendar button next to the date field to specify a date using the calendar tool. Alternatively, you can select before or after from the list box. To select a date range, select between from the list box, then click the calendar buttons next to the date fields to specify start and end dates. Selecting the Reviewed property enables a list box from which you can select either Yes or No. Selecting the Requirement name or Description property enables a text box in which you can enter a text string.
5 Click OK to begin your search. The first requirement that meets the search criteria will be highlighted in the tree view. Click Find Next on the Find dialog box to advance to the next requirement in the list that meets your search criteria. Click Find Previous on the Find dialog box to return to the previous requirement in the list that meets your search criteria.

Related Concepts Custom Requirement Properties Requirements Management Related Procedures Managing Requirements - Quick Start Task Customizing Requirement Properties Managing Requirements Related Reference Requirement Properties Page


Generating Test Plans from Requirements View


To generate a new test plan from Requirements View:
1 Click Requirements on the workflow bar.
2 From Requirements View, with at least one requirement available in the Requirements tree, right-click the requirement or project node that is to be converted into a Test Plan tree.
3 Select Generate Test Plan to display the Generate Test Plan from Requirements dialog box. This dialog box enables you to specify whether the leaves (lowest-level nodes) of the selected requirement's subtree should be converted into test definitions or test folders, and whether the tree should be generated into a new test container or an existing container.
4 Enter a name for the new test container in the Enter Name field and select a product from the Select Product list box to create the container within the active Test Manager project. The Select Product list box is populated with the products that are configured by a project manager. See SilkCentral Administration Module documentation or ask your project manager for detailed information.
5 If you have defined a source control profile (see SilkCentral Administration Module documentation or ask your Test Manager administrator for detailed information), select the source control profile you want to use for managing the test definition sources from the Select Source Control Profile list box.
6 To include all child requirements of the selected requirement in the test plan, check the Include child requirements check box (the default).
7 To have the new test definitions that you generate automatically assigned to the requirements from which they are created, check the Assign newly generated Test Definitions to Requirements check box. If this option is not selected, test definitions must be manually associated with requirements. Note: This option is not available when checking Generate Test Folders from Requirement Tree leaves.
8 Click OK to create the test plan, which has the same structure as the Requirements tree. A message displays, asking if you want to switch directly to the Test Plan unit. Click Yes to view the test plan in Test Manager's Test Plan unit, or click No to remain in the Requirements unit.

Related Concepts Test Plan Generation Requirements Management Related Procedures Managing Requirements - Quick Start Task Managing Requirements Related Reference Requirements Unit Interface


Locating Assigned Test Definitions in the Test Plan Tree


To locate assigned test definitions in the Test Plan tree:

1 Click Requirements on the workflow bar.
2 Select a requirement in the Requirements tree that has at least one test definition assigned to it.
3 Select the Assigned Test Definitions tab.
4 In the Actions column of a test definition, click the corresponding icon to find out in which test folder or container the test definition is stored.
5 The corresponding test folder/container is then highlighted in the Test Plan window.

Related Concepts Requirements Management Related Procedures Managing Requirements - Quick Start Task Test Coverage Status Managing Requirements Related Reference Assigned Test Definitions tab


Marking Requirements as Obsolete


Rather than deleting requirements (destroying them permanently), it is sometimes preferable to mark them as obsolete. Obsolete requirements can optionally be enabled for viewing (or conversely hidden from view). Obsolete requirements appear in the Requirements tree in italics.

To mark a requirement as obsolete


1 Right-click the requirement you want to edit.
2 Select Delete.
3 Make sure that the Destroy permanently check box is not checked and click Yes.

To convert an obsolete requirement to active status:


1 Click Requirements on the workflow bar.
2 Right-click a requirement in the tree view.
3 Select Recover.

To permanently delete an obsolete requirement:


1 Click Requirements on the workflow bar.
2 Select a requirement in the tree view.
3 Select Delete.
4 Make sure that the Destroy permanently check box is checked and click Yes.
5 Click Yes on the Delete Requirement dialog box.

Related Concepts Requirements Management Related Procedures Managing Requirements - Quick Start Task Managing Requirements Related Reference Requirements Unit Interface


Removing Test Definition Assignments


To remove a test-definition assignment:
1 Click Requirements on the workflow bar.
2 Select a requirement (in the Requirements tree) that has at least one test definition assigned to it.
3 In the Actions column of the test definition you want to remove, click Delete.
4 Click Yes on the confirmation dialog box to confirm deletion of the assignment. Note: To remove all test-definition assignments from the selected requirement, click Remove All.

Related Concepts Test Coverage Status Requirements Management Related Procedures Managing Requirements - Quick Start Task Managing Requirements Related Reference Assigned Test Definitions tab


Replacing Requirement Properties


To replace a requirement property:
1 Click Requirements on the workflow bar.
2 Select a requirement in the Requirements tree.
3 Click Replace on the toolbar to open the Replace dialog box. Note: This command can also be executed by right-clicking a requirement and selecting Replace.
4 From the Find in list box, select a requirement property to be searched. This list is automatically populated with all standard property fields and any custom property fields you may have created.
5 Define your search criteria in the Find what portion of the dialog box. Note: The Find what portion of the dialog box offers fields and list boxes with pre-populated values that are based on the property you select in the Find in list box above. The UI controls available in the Find what portion of the dialog box also vary based on the property type selected in the Find in list box. For example, selecting a custom date-type property enables one or more date fields. To specify an exact date, select exactly from the list box, then click the calendar button next to the date field to specify a date using the calendar tool. Alternatively, you can select before or after from the list box. To select a date range, select between from the list box, then click the calendar buttons next to the date fields to specify start and end dates. Selecting the Reviewed property enables a list box from which you can select either Yes or No. Selecting the Requirement name or Description property enables a text box in which you can enter a text string.
6 In the Replace with text box, enter the alternate property data that you want to have replace the identified data.
7 Click OK to find the first instance of the property you want to replace. The first requirement that meets the search criteria will be highlighted in the tree view.
8 Click Replace to replace only the selected instance of the property data. Click Replace all to replace all instances of the property data throughout all requirements in the project. Note: Using the Replace all option will overwrite inherited requirement properties with the new value, thus removing the inheritance setting of a child requirement. Use the Replace option only on a parent requirement if you want the child requirements to inherit the new value. Click Find Next on the Find dialog box to advance to the next requirement in the list that meets your search criteria, or click Find Previous on the Find dialog box to return to the previous requirement in the list that meets your search criteria.


Related Concepts Requirements Management Related Procedures Managing Requirements - Quick Start Task Managing Requirements Related Reference Requirement Properties Page


Sorting the Assigned Test Definitions Tab


To sort test definitions on the Assigned Test Definitions tab:
1 Click Requirements on the workflow bar.
2 Select a requirement (in the Requirements tree) that has more than one test definition assigned to it.
3 Click the column header of the property by which you want to sort the test definitions. A small upward- or downward-pointing arrow indicates both the column upon which the sort has been based and the direction of the sort (ascending or descending). If required, click the column header again to reverse the direction of the sort.

Related Concepts Test Coverage Status Requirements Management Related Procedures Managing Requirements - Quick Start Task Managing Requirements Related Reference Requirement Coverage tab Assigned Test Definitions tab


Tracking the History of a Requirement


To view a requirement's history:

1 Click Requirements on the workflow bar.
2 Select a requirement in the Requirements tree.
3 Select the Requirements view's History tab. When requirements management integration has been enabled between a Test Manager project and a CaliberRM project, the History tab (Requirements History) includes an Open CaliberRM button, which enables you to view the history of synchronized requirements directly in CaliberRM.

The properties of all revisions that have been logged by Test Manager are displayed in tabular format.

Related Concepts Requirement History Requirements Management Related Procedures Managing Requirements - Quick Start Task Viewing Recent Changes Managing Requirements Related Reference Requirement History tab


Customizing Requirement Properties


This section explains how to customize requirement properties with Test Manager. In This Section Configuring Custom Requirement Properties How to create new custom requirement properties. Deleting Custom Requirement Properties Describes how to delete a custom requirement property. Editing Custom Requirement Properties Describes how to edit a custom requirement property.


Configuring Custom Requirement Properties


To create a new custom requirement property:
1 Navigate to Test Manager > Settings.
2 Select the Requirement Properties tab.
3 Click New Property to display the New Custom Requirement Property dialog box.
4 Enter a name for the new property in the Name field.
5 Select the data Type of the new property (integer, string, Boolean, or Date) from the Type list box.
6 Click OK to make your custom property available to all requirements in the active Test Manager project.

Related Concepts External Requirements Management Tools Requirements Management Related Procedures Managing Requirements - Quick Start Task Working with External Properties Managing Requirements Customizing Requirement Properties Finding Requirement Properties Related Reference Requirement Properties tab


Deleting Custom Requirement Properties


To delete a previously created custom requirement property
1 Navigate to Test Manager > Settings.
2 Select the Requirement Properties tab.
3 Click the Delete icon to display the Delete Custom Requirement Property confirmation dialog box.
4 Click Yes to confirm the deletion.

Related Concepts External Requirements Management Tools Requirements Management Related Procedures Managing Requirements - Quick Start Task Working with External Properties Managing Requirements Customizing Requirement Properties Related Reference Requirement Properties tab


Editing Custom Requirement Properties


To edit a previously created custom requirement property
1 Navigate to Test Manager > Settings.
2 Select the Requirement Properties tab.
3 Click the name of the property you want to edit. The Edit Custom Requirement Property dialog box displays.
4 Edit the name of the property in the Name field.
5 Click OK to save your changes.

Related Concepts External Requirements Management Tools Requirements Management Related Procedures Managing Requirements - Quick Start Task Working with External Properties Managing Requirements Customizing Requirement Properties Related Reference Requirement Properties tab


Integrating External RM Tools


This section explains how to integrate an external requirements management tool with Test Manager. In This Section Enabling External Requirements Management Integration This section explains how to enable external requirements management integration with Test Manager. Working with CaliberRM This section explains how to work with CaliberRM's integration with Test Manager. Working with External Properties This section explains how to work with external properties in Test Manager. Deleting Property-Mapping Value Pairs Describes how to delete a property-mapping value pair. Disabling Requirements-Management Integration Describes how to disable integration with an external requirements-management tool. Editing Property Mapping Describes how to map property fields between Test Manager and an external requirements-management tool. Removing Requirements-Management Integration Describes how to remove integration with an external requirements-management tool. Synchronizing Requirements Across Tools Describes how to synchronize requirements between Test Manager and an externally configured requirements management tool.


Enabling External Requirements Management Integration


This section explains how to enable external requirements management integration with Test Manager. In This Section Enabling Integration with Borland CaliberRM How to enable integration with Borland CaliberRM: Enabling Integration with IBM Rational RequisitePro How to enable integration with IBM Rational RequisitePro: Enabling Integration with Telelogic DOORS How to enable integration with Telelogic DOORS:


Enabling Integration with Borland CaliberRM


To enable integration with Borland CaliberRM:
1 From the project to which you want to establish integration, select the Settings link on the menu tree.
2 Select the Integrations Configuration tab.
3 Click the Borland Caliber RM Configure button to display the Edit Configuration dialog box.
4 Enter the Hostname of the machine where the external server is installed.
5 Enter valid Username and Password credentials for the requirements management server.
6 Click Test Connection to confirm that the host and user credentials you have entered are correct. You will receive a Test connection was successful dialog box if the settings are correct. Click OK. Note: Consult your system administrator if you are not able to establish a connection.
7 Click Browse to advance to the Browse Projects dialog box.
8 From the Project field, select the external project with which the Test Manager project is to be integrated. The requirement types that are available with the selected project are automatically populated into the Requirement Types field. The baselines that are available with the selected project are automatically populated into the Baseline field.
9 Select a baseline from the external project that should be integrated with the Test Manager project.
10 Select one or more requirement types from the external project that should be integrated with the Test Manager project (hold down the CTRL key to select multiple requirement types). Your selections are displayed on the Edit Configuration dialog box. Click OK.
11 Back on the Edit Configuration dialog box, check the Enable creation of unassigned requirements check box to enable creation and editing of unmapped requirements in Test Manager projects that are configured for integration with CaliberRM.
12 Check the Enable upload of requirements to CaliberRM check box to enable the upload of unmapped/unassigned requirements from Test Manager to CaliberRM. This allows you to upload additional previously unmapped requirement trees to CaliberRM and then have those requirements mapped within Test Manager. When this option is enabled, the Map Requirement button becomes enabled (Requirements Properties), enabling configuration of top-level requirements for external requirement types, which is required when uploading unmapped requirements.
13 Click OK to save your settings.

Related Concepts External Requirements Management Tools Baseline Support for CaliberRM Integration Requirements Management Related Procedures Managing Requirements - Quick Start Task Copying CaliberRM-Integrated Projects Working with External Properties Managing Requirements Customizing Requirement Properties Related Reference Requirement Properties tab


Enabling Integration with IBM Rational RequisitePro


To enable integration with IBM Rational RequisitePro:
1 From the project to which you want to establish integration, select the Settings link on the menu tree.
2 Select the Integrations Configuration tab.
3 Click the IBM Rational RequisitePro Configure button to display the Edit Configuration dialog box.
4 Enter (or click Browse and select) the UNC project path to the machine where the external server is installed.
5 Enter the UNC Username and UNC Password of the machine where the external server is installed.
6 Enter valid User name and Password credentials for the requirements management server.
7 Click Test Connection to confirm that the host and user credentials you have entered are correct. You will receive a Test connection was successful dialog box if the settings are correct. Click OK. Note: Consult your system administrator if you are not able to establish a connection.
8 Click Edit Packages and Requirement Types to open the Browse Packages & Requirement Types dialog box. The packages and requirement types that are available with the selected project are automatically populated into the Packages and Requirement Types fields.
9 In the Packages field, select one or more packages from the external project that should be integrated with the Test Manager project (hold down the CTRL key to select multiple packages).
10 In the Requirement types field, select one or more requirement types from the external project that should be integrated with the Test Manager project (hold down the CTRL key to select multiple requirement types).
11 Click OK. Your selections are then displayed on the Edit Configuration dialog box. Note: Only requirements of explicitly selected packages will be synchronized. Selecting a parent package does not select the child packages of the parent.
12 Back on the Edit Configuration dialog box, check the Enable creation of unassigned requirements check box to enable creation and editing of unmapped requirements in Test Manager projects that are configured for integration with RequisitePro.
13 Check the Enable upload of requirements to RequisitePro check box to enable the upload of unmapped/unassigned requirements from Test Manager to RequisitePro. This allows you to upload additional previously unmapped requirement trees to RequisitePro and then have those requirements mapped within Test Manager. When this option is enabled, the Map Requirement button becomes enabled (Requirements Properties), enabling configuration of top level requirements for external requirement types, which is required when uploading unmapped requirements.
14 Click OK to save your settings.


Related Concepts External Requirements Management Tools Baseline Support for CaliberRM Integration Requirements Management Related Procedures Managing Requirements - Quick Start Task Working with External Properties Managing Requirements Customizing Requirement Properties Related Reference Requirement Properties tab


Enabling Integration with Telelogic DOORS


To configure integration of Test Manager and Telelogic DOORS, you must install the DOORS client on the Test Manager front-end server machine, and configure it as detailed below. If you use more than one front-end server machine, then the DOORS client must be installed to the same directory on each of the machines.

To install the DOORS client on the Test Manager front-end server machine:
1 Download the DOORS plug-in package (two zip archives: DoorsRMPlugin.zip and DoorsClientLibs.zip) from Help > Tools.
2 Create a new folder with the name testmanager in the \lib\dxl folder of your Telelogic DOORS client installation. The default pathname for this folder is C:\Program Files\Telelogic\DOORS_8.20\lib\dxl\testmanager.
3 Extract all DOORS script files from DoorsClientLibs.zip to this folder.
4 The plug-in package DoorsRMPlugin.zip is automatically installed to the Plugins folder of your Test Manager application server installation during the setup process. During startup of the application server, this plug-in will be published to all front-end servers.


1  From the Test Manager project to which you want to establish integration, navigate to Settings > Integrations > Configuration.
2  Click the Telelogic DOORS Integration Configure button to display the Edit Configuration dialog box.
3  In the RM service URL field, enter the URL of Test Manager's DOORS requirement Web Service. The default value should point to the correct location already. Example: http://MySCTMHost:19120/services/doorsrequirementsmanagement
4  Enter valid Username and Password credentials for the requirements management server.
5  The default DOORS client installation path is displayed in the DOORS Installation Path field on the Edit Configuration dialog box. If this path is not correct, click Browse to browse to and select the correct destination in the front-end server directory structure.
6  Click Test Connection to confirm that the host and user credentials you have entered are correct. You will receive a Connection to Telelogic DOORS was successful message if the settings are correct. Click OK to proceed.
   Note: Consult your system administrator if you are not able to establish a connection.
7  Click the second Browse button (alongside the Project name field) to advance to the Browse Requirement Types dialog box.
8  From the Project field, select the external project with which the Test Manager project is to be synchronized. The requirement types that are available with the selected project are automatically populated into the Requirement types field. Select the requirement types that are to be synchronized (hold down the Ctrl key to select multiple requirement types) and click OK. Your selections are now displayed on the Edit Configuration dialog box.
9  Back on the Edit Configuration dialog box, check the Enable creation of unassigned requirements check box to enable creation and editing of unmapped requirements in Test Manager projects that are configured for integration with DOORS.
10 Check the Enable upload of requirements to Telelogic DOORS check box to enable the upload of unmapped/unassigned requirements from Test Manager to DOORS. This allows you to upload additional, previously unmapped requirement trees to DOORS and then have those requirements mapped within Test Manager. When this option is enabled, the Map Requirement button (Requirements > Properties) becomes enabled, enabling configuration of top-level requirements for external requirement types, which is required when uploading unmapped requirements.
11 Click OK to save the configuration data to the database.

Warning: As the DOORS application object is used for communication (and this object does not support login data, but rather requires a running DOORS client), Test Manager starts each DOORS client process with the provided login data and then uses that same data for all subsequent application objects. Therefore only one set of DOORS login credentials is supported for communication at one time. It is recommended that you use the same DOORS credentials for all configurations so that integration tasks can be performed on the front-end server for all projects at the same time. When a second set of credentials is used, the second set only works after all sessions using the first set of credentials have timed out.

Related Concepts External Requirements Management Tools Requirements Management Related Procedures Managing Requirements - Quick Start Task Working with External Properties Managing Requirements Customizing Requirement Properties Related Reference Requirement Properties tab


Working with CaliberRM


This section explains how to work with CaliberRM's integration with Test Manager.

In This Section
Copying CaliberRM-Integrated Projects
How to manage CaliberRM baselines when copying Test Manager projects.


Copying CaliberRM-Integrated Projects


To manage CaliberRM baselines when copying Test Manager projects:
1  Navigate to Settings > Integrations > Configuration and verify that the baseline you want to save is selected.
   Note: If the correct baseline is not selected, click Edit Configuration. The Edit Configuration dialog box displays. Click Browse next to the Project name field. On the Browse Projects dialog box, select the baseline you want to save, then confirm your selection.
   Note: When a baseline is changed, a synchronization must be performed to update the project requirements with the baseline changes before an associated Test Manager project can be copied.
   Note: The integration configuration is only copied if a baseline other than the current baseline is selected. If the current baseline is selected, the user is prompted to specify whether to keep the integration configuration in the original project or move it to the copied project.
2  Navigate to Administration > Projects and click Copy Project in the Actions column of the project you want to copy.
   Note: This is a SilkCentral Administration Module task. See the SilkCentral Administration Module documentation for full details regarding copying projects.
3  The Copy Project dialog box displays. Select the items you want to copy into the new project, then confirm your selection.
4  After the project has successfully been copied, apply the baseline you want to continue working with to the project you are working on.
   Note: It does not matter if you continue working with the original project or a copy of the project. After copying the project, both the original and the copy are identical. By applying the correct baseline you determine which project you are working on.

Related Concepts Baseline Support for CaliberRM Integration External Requirements Management Tools Requirements Management Related Procedures Managing Requirements - Quick Start Task Working with External Properties Managing Requirements Customizing Requirement Properties Related Reference Requirement Properties tab


Working with External Properties


This section explains how to work with external properties in Test Manager.

In This Section
Editing External Properties
Describes how to edit external properties.
Viewing External Properties
Describes how to view external properties.


Editing External Properties


To edit external properties
1  Click Requirements on the workflow bar.
2  Select the requirement for which you intend to edit external properties.
3  Select the Properties tab.
4  Click Edit External Properties to display the Edit External Properties dialog box. All properties of the external requirement are displayed here.
5  Edit all properties as required.
   Note: Editable properties on this dialog box offer input fields and controls with which you can edit the properties. If a mapping rule exists for an attribute, the attribute will be tagged with a trailing asterisk (*).
6  Click OK to save your changes and close the dialog box.

Related Concepts External Requirements Management Tools Requirements Management Related Procedures Managing Requirements - Quick Start Task Working with External Properties Managing Requirements Customizing Requirement Properties Related Reference Requirement Properties tab


Viewing External Properties


To view external properties
1  Click Requirements on the workflow bar.
2  Select the requirement for which you intend to view external properties.
3  Select the Properties tab.
4  Click View External Properties to display the View External Properties dialog box. All properties of the external requirement are displayed.
5  Close the dialog box.

Related Concepts External Requirements Management Tools Requirements Management Related Procedures Managing Requirements - Quick Start Task Working with External Properties Managing Requirements Customizing Requirement Properties Related Reference Requirement Properties tab


Deleting Property-Mapping Value Pairs


To delete a property-mapping value pair
1  From the project for which you are deleting property mapping, select the Settings link on the menu tree.
2  Click Edit Property Mapping for the configured external tool.
3  Select the property-mapping value pair in the Custom property mapping select box.
4  Click Remove Mapping.
5  Click OK on the Edit Property Mapping dialog box to save your changes.

Related Concepts Requirements Management Related Procedures Managing Requirements - Quick Start Task Working with External Properties Managing Requirements Related Reference Requirement Properties tab


Disabling Requirements-Management Integration


To disable requirements-management integration configuration
1  Select the project, and then select the Settings link on the menu tree.
2  Click the Disable Configuration button of the requirements-management tool for which you want to disable integration. All integration data and functionality is then disabled, but not deleted from the project.

Related Concepts External Requirements Management Tools Requirements Management Related Procedures Managing Requirements - Quick Start Task Working with External Properties Managing Requirements Customizing Requirement Properties Related Reference Requirement Properties tab


Editing Property Mapping


Property mapping functionality allows you to map property fields between Test Manager and an external requirements-management tool. For example, a custom requirement property in Test Manager called User might be equivalent to a custom property in CaliberRM called User_ID. The property mapping feature ensures that requirement-property fields are accurately populated between projects during requirement uploading and importing.

To edit property mapping


1  Select the project, and then select the Settings link on the menu tree.
2  Click Edit Property Mapping for the configured external tool.
3  Select an external requirement type from the Requirement types list. All custom requirements of that type are then displayed below in the selection box.
4  Select the custom requirement property for which you are establishing mapping.
5  From the list box on the right, select the Test Manager custom property to establish mapping to the external custom property you have selected.
6  Click Add Mapping to map the requirements. The results are displayed in the Custom property mapping box.
7  The System property mapping box displays the two pre-configured mappings for requirement name and description, which cannot be removed.
8  Click OK on the Edit Property Mapping dialog box to save your changes.

Related Concepts Requirements Management Related Procedures Managing Requirements - Quick Start Task Working with External Properties Managing Requirements Customizing Requirement Properties Related Reference Requirement Properties tab


Removing Requirements-Management Integration


To remove requirements-management configuration
1  Select the project, and then select the Settings link on the menu tree.
2  Click Remove Configuration of the requirements-management tool for which you want to remove integration (this button is only enabled if the integration configuration has been disabled).
3  Click Yes on the Remove External Integration dialog box to delete the configuration. All related data is then deleted from the database.

Related Concepts External Requirements Management Tools Requirements Management Related Procedures Managing Requirements - Quick Start Task Working with External Properties Managing Requirements Customizing Requirement Properties Related Reference Requirement Properties tab


Synchronizing Requirements Across Tools


To synchronize requirements between Test Manager and an externally configured requirements management tool
1  Click Requirements on the workflow bar.
2  Select the Project node in the Requirements tree view.
3  Select the Properties tab.
4  Click Synchronize Requirements.
5  Click Yes on the Synchronize Requirements confirmation dialog box to begin synchronization. A dialog box opens when synchronization is complete, displaying synchronization statistics, including the number of requirements that have been created, updated, and deleted.
6  Click OK to complete the synchronization. Any updates that were made to mapped requirements in your externally configured requirements management tool are now reflected in the Requirements tree in Test Manager.

Automatic synchronization of requirements between Test Manager and external requirements management tools can be configured to occur based on global schedules.

To synchronize requirements based on globally defined schedules


1  Click Settings on the workflow bar.
2  Select the Integrations Configuration tab.
3  Click Edit Schedule. The Edit Schedule dialog box opens.
4  Click the Global option button.
5  Select a pre-defined global schedule from the selection list.
   Note: See the SilkCentral Administration Module documentation for details about configuring global schedules.
6  Click OK.

Notification settings can be defined to alert users through email when errors occur during automated synchronization of requirements between Test Manager and external requirements management tools. Notification recipients receive copies of synchronization log files.

To define e-mail notification settings for automatic synchronization events


1  Click Settings on the workflow bar.
2  Select the Integrations Configuration tab.
3  Click Edit Notification. The Edit Notification dialog box displays.
4  Check the Enable notification check box.
5  Select a user name from the Username list.
   Note: See the SilkCentral Administration Module documentation for details about defining users.
6  If required, add additional e-mail addresses for other recipients in the Other email addresses text-entry box. Use semicolons to separate multiple e-mail addresses.
7  Click OK.


Related Concepts Synchronizing Requirements Requirements Management Related Procedures Managing Requirements - Quick Start Task Working with External Properties Managing Requirements Customizing Requirement Properties Related Reference Requirement Properties tab


Collapsing or Expanding the Requirements tree


You can consolidate levels of the hierarchy or display all levels of the hierarchy based on your viewing needs.

To collapse or expand the Requirements tree


1  Click Requirements on the workflow bar.
2  Right-click a requirement folder within the Requirements tree and select a collapse or expand option.

Related Concepts Requirements Management Related Procedures Managing Requirements - Quick Start Task Requirements Tree Managing Requirements Related Reference Requirements Unit Interface


Switching Between Full and Direct Coverage Modes


To switch between full and direct coverage modes
1  Click Requirements on the workflow bar.
2  Click Full/Direct Coverage on the toolbar to switch to the alternative view.
3  Click Full/Direct Coverage again to return to the previous view.

Related Concepts Requirements Management Related Procedures Managing Requirements - Quick Start Task Full Coverage and Direct Coverage Modes Managing Requirements Related Reference Requirements Unit Interface


Managing Test Plans


This section explains how to manage test plans in Test Manager.

In This Section
Associating Requirements with Test Definitions
This section explains how to assign requirements to test definitions in Test Manager.
Configuring Test Definition Attributes
This section explains how to configure test definition attributes in Test Manager.
Configuring Test Definition Parameters
This section explains how to configure test definition parameters in Test Manager.
Creating Test Definitions
This section explains how to create test definitions in Test Manager.
Creating Test Plans
This section explains how to create test plans in Test Manager.
Editing Test Plan Elements
This section explains how to edit test plan elements in Test Manager.
Working with Attachments
This section explains how to work with attachments in Test Manager.
Working with Data-Driven Tests
This section explains how to work with data-driven tests in Test Manager.
Working with Manual Tests
This section explains how to work with manual tests in Test Manager.
Working With Test Definitions in Grid View
This section explains how to work with test definitions in Test Manager Grid View.
Creating a Filter for a Folder or Container
Describes how to create a filter for a folder or container.
Expanding/Collapsing the Test Plan tree
Describes how to expand and collapse the Test Plan tree.
Tracking Test Plan History
Describes how to view a test plan element's history.
Updating Execution Definitions
Describes how to display current execution-definition content and the latest test-plan filter results.
Using Upload Manager
Describes how to use Test Manager's Upload Manager.
Viewing Assigned Executions
Describes how to view assigned executions.
Viewing Recent Changes
Describes how to view recent changes to requirements or test definitions.


Associating Requirements with Test Definitions


This section explains how to assign requirements to test definitions in Test Manager.

In This Section
Assigning Requirements to Test Definitions
Describes how to manually assign requirements to test definitions.
Locating Assigned Requirements
Describes how to locate assigned requirements in the Available Requirements tree.
Removing Requirement Assignments
Describes how to remove a requirement assignment.
Sorting Requirements
Describes how to sort requirements on the Assigned Requirements tab.


Assigning Requirements to Test Definitions


To manually assign requirements to test definitions
1  Click Test Plan on the workflow bar.
2  Select the test definition to which you are assigning requirements.
3  In Test Plan View, select the Assigned Requirements tab. All requirements that are available for assignment are displayed in the Available Requirements window.
   Note: The Available Requirements window can be broadened or narrowed by dragging the window splitter (the left-hand edge of the window) to the left or right.
4  Click the arrow of any requirement to assign it to the currently selected test definition.
   Note: Newly generated test definitions can automatically be assigned to the requirements from which they are generated by selecting the Assign newly generated test definitions to requirements option on the Generate Test Plans from Requirements dialog box (the default behavior).

Related Concepts Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Working with Data-Driven Tests Managing Test Plans Generating Test Plans from Requirements View Related Reference Test Plan Assigned Requirements tab


Locating Assigned Requirements


To locate assigned requirements in the Available Requirements tree
1  Click Test Plan on the workflow bar.
2  Select a test definition.
3  Select the Assigned Requirements tab.
4  In the Actions column of a requirement, click the corresponding icon to find out in which node in the Available Requirements tree the requirement is stored.
5  The corresponding parent-requirement node is then expanded and the assigned requirement is highlighted.

Related Concepts Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Generating Test Plans from Requirements View Related Reference Test Plan Assigned Requirements tab


Removing Requirement Assignments


To remove a requirement assignment
1  Click Test Plan on the workflow bar.
2  Select a test definition (in the Test Plan tree) that has at least one requirement assigned to it.
3  Select the Assigned Requirements tab.
4  In the Actions column, click the delete button of the assigned requirement.
5  Click Yes on the confirmation dialog box to confirm deletion of the assignment.
   Note: To remove all requirement assignments from the selected test definition, click Remove All.

Related Concepts Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Generating Test Plans from Requirements View Related Reference Test Plan Assigned Requirements tab


Sorting Requirements
To sort requirements on the Assigned Requirements tab
1  Click the column header of the property by which you want to sort the requirements. A small upward or downward pointing arrow indicates both the column on which the sort is based and the direction of the sort (ascending or descending). If required, click the column header again to reverse the direction of the sort.

Related Concepts Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Generating Test Plans from Requirements View Related Reference Test Plan Assigned Requirements tab


Configuring Test Definition Attributes


This section explains how to configure test definition attributes in Test Manager.

In This Section
Assigning Attributes to Test Definitions
Describes how to assign an attribute to a test definition.
Deleting Attributes from Test Definitions
Describes how to delete an attribute from a test definition.
Editing Test Definition Attributes
Describes how to edit a test definition attribute.


Assigning Attributes to Test Definitions


To assign an attribute to a test definition
1  Click Test Plan on the workflow bar.
2  Click Test Plan View on the toolbar.
3  Select the test definition to which you are assigning an attribute.
4  Select the Attributes tab.
5  Click Add Attribute to display the Add Attributes dialog box.
6  Click the plus symbol (+) of the attribute that you are assigning. Based on the attribute type you have selected (set or normal), you are presented with an Edit Attribute dialog box, which allows you to specify which of the available attribute values you would like to assign to the test definition.
7  Select the value required and click OK to assign the attribute.
   Note: A Set type attribute allows you to assign a set of values to an attribute. A Normal type attribute allows you to assign only a single value.

Related Concepts Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Generating Test Plans from Requirements View Related Reference Test Plan Attributes tab


Deleting Attributes from Test Definitions


To delete an attribute from a test definition
1  Click Test Plan on the workflow bar.
2  Click Test Plan View on the toolbar.
3  Select the test definition for which you wish to delete an assigned attribute.
4  Select the Attributes tab.
5  Click the delete icon of the attribute you are deleting. The Delete Attribute confirmation dialog box displays.
6  Click Yes to delete the attribute.
   Note: Inherited attributes cannot be deleted.

Related Concepts Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Generating Test Plans from Requirements View Related Reference Test Plan Attributes tab


Editing Test Definition Attributes


To edit a test definition attribute
1  Click Test Plan on the workflow bar.
2  Click Test Plan View on the toolbar.
3  Select the test definition for which you are editing an assigned attribute.
4  Select the Attributes tab.
5  Click the Edit Attribute button of the attribute you are editing. The Edit Attribute dialog box displays (options available on the Edit Attribute dialog box vary depending on the attribute type that you have selected).
6  Select the required value and click OK to save your settings.

Related Concepts Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Generating Test Plans from Requirements View Related Reference Test Plan Attributes tab


Configuring Test Definition Parameters


This section explains how to configure test definition parameters in Test Manager.

In This Section
Editing Predefined Parameters
Describes how to edit predefined parameters.
Adding Predefined Parameters to Test Definitions
Describes how to add a predefined parameter to a test definition.
Clearing Predefined Parameter Assignments
Describes how to clear a predefined parameter assignment.
Configuring SilkTest Plan Properties
Configure SilkTest plan properties when creating a new SilkTest plan test definition.
Configuring .Net Explorer Test Properties
Configure .Net Explorer test properties when creating a new .Net Explorer test definition.
Configuring JUnit Test Properties
Configure JUnit test properties when creating a new JUnit test definition.
Configuring Manual Test Properties
Configure manual test properties when creating a new manual test definition.
Configuring NUnit Test Properties
Configure NUnit test properties when creating a new NUnit test definition.
Configuring SilkPerformer Test Properties
Configure SilkPerformer test properties when creating a new SilkPerformer test definition.
Configuring SilkTest Test Properties
Configure SilkTest test properties when creating a new SilkTest test definition.
Configuring Windows Scripting Test Properties
Configure Windows scripting test properties when creating a new Windows scripting test definition.
Creating Custom Parameters
Describes how to create a custom parameter.


Editing Predefined Parameters


To edit a predefined parameter
1  Click Test Plan on the workflow bar.
2  Click Test Plan View on the toolbar.
3  Select the test definition node for which you are editing an existing parameter.
4  Select the Parameters tab.
5  In the parameter you want to edit, click Edit. The Add Custom Parameter dialog box displays.
6  Edit the parameter values as required.
   Note: Inherited parameters cannot be edited. Uncheck the Inherit from parent check box to enable editing of the parameter's Value setting. Parameter Name and Type settings cannot be edited.

Related Concepts Test Definition Parameters Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Generating Test Plans from Requirements View Related Reference Test Plan Parameters tab


Adding Predefined Parameters to Test Definitions


To add a predefined parameter to a test definition
1  Click Test Plan on the workflow bar.
2  Click Test Plan View on the toolbar.
3  Select the test definition node for which you are adding a predefined parameter.
4  Select the Parameters tab.
5  Click Add Predefined Parameter to display the Add Predefined Parameter dialog box, which lists all of the project attributes that are available in the project file.
   Note: The Add Predefined Parameter button is only available for SilkPerformer test definitions for which the Project property has already been defined.
6  To add any of the listed parameters, click the corresponding add icon.
7  On the dialog box that displays, specify the actual value for the parameter.
8  Click Save to add the parameter to the active Test Plan tree node.

Related Concepts Test Definition Parameters Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Generating Test Plans from Requirements View Related Reference Test Plan Parameters tab


Clearing Predefined Parameter Assignments


To clear a predefined parameter assignment
1  Click Test Plan on the workflow bar.
2  Click Test Plan View on the toolbar.
3  Select the test definition node for which you are clearing the assignment of an existing parameter.
4  Select the Parameters tab.
5  Click the clear button that corresponds to the parameter that is being cleared.
   Note: Inherited parameters cannot be cleared. Uncheck the Inherit from parent check box on the Set Parameter dialog box to enable clearing of a parameter.
6  Click Yes on the Clear Parameter dialog box to clear the parameter.

Related Concepts Test Definition Parameters Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Generating Test Plans from Requirements View Related Reference Test Plan Parameters tab


Configuring SilkTest Plan Properties


To configure SilkTest plan properties, you must first follow the steps described in Creating Test Definitions.

To configure SilkTest plan properties


1  On the Test Definition dialog box, select SilkTest plan from the Type list box and then click Next. The SilkTest Plan Properties dialog box opens.
2  In the Plan File text box, type the fully qualified name of the test plan file to be executed. Click Browse to browse for the file.
3  In the SilkTest Project File text box, type the name of the SilkTest project containing the file and environmental settings. Click Browse to browse for the project file.
4  In the Option Set text box, type the fully qualified name of the option set file containing environmental settings. Click Browse to browse for the option set file.
5  In the Data file for attributes and queries text box, type the default path of the test plan initialization file. Click Browse to browse for the test plan initialization file.
6  In the Test plan query name text box, type the fully qualified name of the saved test plan query.
7  Click Finish.

Related Concepts Test Definition Parameters Related Procedures Creating Test Definitions Editing Test Definitions


Configuring .Net Explorer Test Properties


To configure .NET Explorer test properties
1  On the Test Definition dialog box, select .NET Explorer Test from the Type list box and then click Next. The .NET Explorer Test Properties dialog box opens.
2  Browse to and select the .NET Explorer script to apply to the test definition (.nef file).
3  Browse to and select the executable that executes the selected script file (NetExplorer.exe), such as C:\Program Files\MyCustomSPFolder\DotNET Explorer\NetExplorer.exe.
4  In the Test case text box, type the name of the .NET Explorer test case to execute. If this text box is left blank, all test cases within the script are executed.
   Note: The test cases InitTestCase and EndTestCase are always executed.
5  Click Finish.

Related Concepts Test Definition Parameters Related Procedures Creating Test Definitions Editing Test Definitions


Configuring JUnit Test Properties


To configure JUnit test properties, you must first follow the steps described in Creating Test Definitions.

To configure JUnit test properties


1  On the Test Definition dialog box, select JUnit Test from the Type list box and then click Next. The JUnit Test Properties dialog box opens.
2  In the Test class text box, type the fully qualified name of the JUnit test class.
3  In the Test method text box, type the name of the appropriate test method. The method must be available in the test class. If the Test method text box is left blank, all tests that are included in the suite will be executed.
4  Set the Java home directory to the installation path of the Java Runtime Environment (JRE). The path must be valid on the execution server on which the test definition runs.
5  Specify a valid Java Classpath to use on the execution server. Borland recommends using a relative classpath, which is then expanded to the full classpath on the execution server. By using a relative classpath, changes to the location of the source control profile do not require additional changes to the classpath. The relative classpath must point to the root node of the test container containing the JUnit test definition, for example JUnit_tests. On the execution server, the relative classpath is then expanded to include the source control profile's working folder, for example C:\temp, and the test file names, for example JUnit4Test.jar. The relative path to junit.jar must also be added to the classpath, with the appropriate JUnit version, as the following example shows:
   junit-4.4.jar;JUnit4Test.jar
   The specified relative paths junit-4.4.jar;JUnit4Test.jar are expanded to C:\temp\JUnit_tests\junit-4.4.jar;C:\temp\JUnit_tests\JUnit4Test.jar on the execution server.
   You can also use a fully qualified classpath. The fully qualified classpath must point to the archive or folder in which the test classes reside. Further, junit.jar must be added to the classpath, with the appropriate JUnit version, as the following examples show:
   C:\Java\junit3.8.1\junit.jar;C:\MyApps\main.jar;C:\MyApps\utils.jar
   ${junit_home}\junit.jar;${apps_home}\main.jar;${apps_home}\utils.jar
6  In the Coverage path text box, type the JAR libraries or the specific class files to monitor for code coverage information. Borland recommends using the relative coverage path from the test container root node, which is then expanded on the execution server. You can also use a fully qualified path. Use semicolons to separate multiple JAR files, as the following examples show:
   C:\MyApps\main.jar;C:\MyApps\utils.jar
   ${apps_home}\main.jar;${apps_home}\utils.jar
   Note: The coverage path setting is disregarded if the Record external AUT Coverage check box is checked.
7  Check the Record external AUT Coverage check box to get code coverage for the application under test that is defined for the execution definition in the Code Analysis Settings portion of Execution > Deployment. If the check box is not checked, code coverage is recorded from the executing virtual machine. The check box is not checked by default.
8  Click Finish.
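The note that follows explains that custom parameters reach a JUnit test as Java system properties. The minimal sketch below, which assumes a hypothetical parameter named host_under_test defined on the test definition's Parameters tab (the class name and the localhost fallback are illustrative only), shows how a JUnit 4 test might read such a value:

import static org.junit.Assert.assertNotNull;

import org.junit.Test;

public class HostUnderTestExampleTest {

    @Test
    public void hostParameterIsAvailable() {
        // Test Manager passes parameters to the Java process as system properties,
        // for example -Dhost_under_test=10.5.2.133 (see the note below).
        // "localhost" is only a fallback for running the test outside Test Manager.
        String host = System.getProperty("host_under_test", "localhost");

        assertNotNull("host_under_test should be set", host);
        System.out.println("Running against host: " + host);
    }
}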

Note: Parameters are passed to the Java process as system properties, for example -Dhost_under_test=10.5.2.133. Use the System.getProperty() method to access the system properties. For example, to access the previously passed host_under_test property, use System.getProperty("host_under_test").

Related Concepts Test Definition Parameters Related Procedures Creating Test Definitions Editing Test Definitions


Configuring Manual Test Properties


To configure manual test properties, you must first follow the steps described in Creating Test Definitions.

To configure a manual test property


1  On the Test Definition dialog box, select Manual Test from the Type list box.
2  In the Planned time text box, type the expected amount of time for this manual test to execute and then click Next. The Add Manual Test Definition Step dialog box displays.
   Note: Manual test steps are automatically timed in seconds from the moment you begin execution. These values are available in Detail view, not Step-by-Step view.
3  Specify a name, an action description, and the expected results for the first step of the manual test.
   Note: Test Manager supports HTML formatting as well as the cutting and pasting of HTML content for text boxes.
4  Click OK.
5  Optional: Click New Step to add additional steps to your manual test.

Related Concepts Manual Tests Manual Test Definitions Test Definition Parameters Related Procedures Working with Manual Tests Creating Test Definitions Editing Test Definitions


Configuring NUnit Test Properties


To configure NUnit test properties, you must first follow the steps described in Creating Test Definitions.
Note: It is recommended that you add the .\bin folder of your NUnit installation to the system path. Click Start > Control Panel > System > Advanced > Environment Variables to add a path like C:\Program Files\NUnit 2.2\bin to the system environment variable PATH.

To configure NUnit test properties


1  On the Test Definition dialog box, select NUnit Test from the Type list box and then click Next. The NUnit Properties dialog box displays.
2  Click Browse to locate and select the NUnit assembly from which you want to pull a test definition.
3  Type the working directory in the NUnit Directory text box. This directory is the local path to the file nunit-console.exe, such as C:\Program Files\NUnit 2.2\bin.
   Note: If only one version of NUnit is installed on your computer, you can leave the NUnit Directory text box blank. If multiple versions are installed, you must provide a valid path.
4  Click Finish.

Related Concepts Test Definition Parameters Related Procedures Creating Test Definitions Editing Test Definitions


Configuring SilkPerformer Test Properties


To configure SilkPerformer test properties, you must first follow the steps described in Creating Test Definitions.

To configure SilkPerformer test properties


1  On the Test Definition dialog box, select SilkPerformer Test from the Type list box and then click Next. The SilkPerformer Test Properties - Select Project dialog box opens.
2  Perform one of the following steps to define the SilkPerformer project from which your test case is taken:
   - Click Browse to select a SilkPerformer project that has been saved to your local file system.
   - Click Import to import a SilkPerformer project that is saved to the file pool. A file in the file pool can be used anywhere in the Test Plan tree. On the Import Project From File Pool dialog box, either select a saved SilkPerformer project package (.ltz) from the File pool entry list box or click Browse to select a SilkPerformer project package that has been saved to the source-control system. If you check the Remove file from file pool check box before you click Finish, the selected SilkPerformer file is deleted from the file pool.
   Perform one of the following steps to upload SilkPerformer projects to the file pool:
   - From the Administration module, click Upload and browse to the appropriate LTZ file. For more information, refer to the SilkCentral Administration Module documentation.
   - Use the upload mechanism offered by SilkPerformer.
   - Use the Upload Manager in SilkCentral.
   - Use an existing project directory by way of a UNC path. (Create a new test definition, click Browse to select the appropriate LTP file, and then select a workload.)
3  On the SilkPerformer Test Properties - Select Project dialog box, click Next.
4  On the SilkPerformer Test Properties - Select Workload dialog box, select one of the workload profiles that has been defined for the project from the Workload list box.
5  Click Finish to create the test case. Test Manager is fully integrated with SilkPerformer.

Related Concepts Working With SilkPerformer Projects Test Definition Parameters Related Procedures Creating Test Definitions Editing Test Definitions


Configuring SilkTest Test Properties


To configure SilkTest test properties, you must first follow the steps described in Creating Test Definitions.

To configure SilkTest test properties


1  On the SilkTest Test Properties - Select Test Script dialog box, click Browse and select the test script file from either the defined SilkTest project or the source control directory. Express the source control directory as a relative path to the root node defined in the test container.
2  Click Next. The SilkTest Test Properties - Select Testcase dialog box opens.
   Note: If the SilkTest script is a data-driven .g.t file, for example SilkTestScript1.g.t, then data sources are completely controlled within the script file and not through Test Manager's data-driven properties. The Data-driven check box is checked by default when you use a data-driven script file. For more information about data-driven SilkTest tests, refer to the SilkTest documentation.
3  Select a test case from the available test cases in the defined script file or specify a custom test case.
4  If required, specify an option set file.
5  Click Finish to create the SilkTest test definition.
   Note: If you possess SilkTest test cases that require more than one hour to complete, adjust Test Manager's time-out settings. Otherwise, Test Manager assumes an error has occurred and terminates the execution. For details about time-out settings, refer to the SilkCentral Administration Module documentation.
   Note: You can use the Test Properties - Select Test Script dialog box to import multiple test cases. To access the Test Properties - Select Test Script dialog box from the Test Definition dialog box, select SilkTest Multi-testcase import from the Type list box and click Next. Follow the steps described above to complete the task.

Related Concepts SilkTest Test Definitions Test Definition Parameters Related Procedures Creating Test Definitions Editing Test Definitions


Configuring Windows Scripting Test Properties


To configure Windows scripting test properties, you must first follow the steps described in Creating Test Definitions. When you select Windows Scripting Test from the Type list box on the Test Definition dialog box, and then click Next, you are taken to the Windows Scripting Properties dialog box.

To configure Windows scripting test properties


1  From the Windows Scripting Properties dialog box, click Browse and select a Windows scripting test script.
2  Specify the location of any required additional parameters in the Switches field.
   Note: You may add other switches to be passed to the script. For more details on the switches that can be used, see the Windows Script Host Tests topic and consult the MS Scripting Host documentation.
3  Click Finish.

Related Concepts Windows Script Host Tests Test Definition Parameters Related Procedures Creating Test Definitions Editing Test Definitions


Creating Custom Parameters


To create a custom parameter
1  Click Test Plan on the workflow bar.
2  Click Test Plan View on the toolbar.
3  Select the test definition node for which you are creating a new parameter.
4  Select the Parameters tab.
5  Click Add Custom Parameter to display the Add Custom Parameter dialog box.
6  Provide a name for the parameter.
7  Select the parameter type (String, Number, Float, Boolean, Password, or Character).
8  Define the parameter value that is to be assigned to the selected test definition.
   Note: Values for parameters of type String must be set in quotation marks ("") if you want to use the parameter in SilkTest executions.
9  Click OK. The parameter now displays in the Parameters list for the selected node.
   Note: Parameters are automatically assigned to all sub-folders and child test definitions of the nodes to which they've been assigned.

Related Concepts Test Definition Parameters Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Generating Test Plans from Requirements View Related Reference Test Plan Parameters tab


Creating Test Definitions


This section explains how to create test definitions in Test Manager.

In This Section
Creating a Test Package
Create a test package out of a third-party test definition.
Creating Test Definitions
Create a test definition to manage your automated and manual tests.
Editing Test Definitions
Edit a test definition when its properties require changing.
Executing a Trial Run of a Test Definition
Describes how to run a trial run of a test definition.


Creating a Test Package


To create a new test package out of a third-party test definition
1  Run the test definition once to create the output.xml file, which contains the structure of the test package.
2  In the Test Plan tree, right-click the name of the test definition and choose Convert to Test Package. The selected test definition is converted to a hierarchy representing the structure of the last execution result.

Related Concepts Usage of External IDs


Creating Test Definitions


To create a new test definition
1  Click Test Plan on the workflow bar.
2  Select a container or folder node in the Test Plan tree where you want to insert a new test definition.
3  Click New Test Definition on the toolbar or right-click within the tree and choose New Test Definition. A new test definition node is appended to the tree view, and the Test Definition dialog box opens.
4  Specify a name and meaningful description for the test definition.
   Note: Test Manager supports HTML formatting as well as the cutting and pasting of HTML content for text boxes.
5  Select one of the following test definition types from the Type list box:
   - SilkTest test
   - SilkPerformer test
   - Manual test
   - SilkTest Multi-testcase import
   - NUnit test
   - Windows scripting test
   - JUnit test
   - SilkTest plan
6  Click Next and proceed to the appropriate topic, as follows:
   - If you are configuring a SilkTest test, proceed to Configuring a SilkTest Test.
   - If you are configuring a SilkPerformer test, proceed to Configuring a SilkPerformer Test.
   - If you are configuring a manual test, proceed to Configuring a Manual Test.
   - If you are configuring a SilkTest multi-testcase import, proceed to Configuring SilkTest Multi-Testcase Import.
   - If you are configuring an NUnit test, proceed to Configuring an NUnit Test.
   - If you are configuring a Windows scripting test, proceed to Configuring a Windows Scripting Test.
   - If you are configuring a JUnit test, proceed to Configuring a JUnit Test.
   - If you are configuring a SilkTest plan test, proceed to Configuring a SilkTest Plan Test.
   - If you are configuring a .NET Explorer test, proceed to Configuring a .NET Explorer Test.
   Note: Test Manager's well-defined public API allows you to implement a proprietary solution that meets your automated test needs. Test Manager is open and extensible to any external tool that can be invoked from a Java implementation or through a command-line call.
   Note: Throughout the test-definition configuration process and across all test definition types, Inherit from parent check box options are provided where applicable, enabling you to accept settings of any existing parent entity.
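To illustrate the command-line extensibility mentioned in the note above, the following sketch shows the common pattern of an external tool that reports success or failure through its process exit code. The class name, the checks, and the exit-code convention are hypothetical examples, not part of any Test Manager API; they only outline one way such a tool could be structured so that a command-line based invocation can evaluate its result.

public class SmokeCheck {

    public static void main(String[] args) {
        // Hypothetical example: args[0] is the host to check, passed on the command line.
        String host = args.length > 0 ? args[0] : "localhost";

        boolean passed = runCheck(host);

        // Assumed convention: exit code 0 signals success, any other value signals failure.
        System.exit(passed ? 0 : 1);
    }

    private static boolean runCheck(String host) {
        // Replace with real verification logic against the application under test.
        System.out.println("Checking " + host + " ...");
        return !host.isEmpty();
    }
}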

Related Concepts Upload Manager Test Plan Management Test Definition Parameters Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Configuring SilkTest Test Properties Configuring SilkPerformer Test Properties Configuring Manual Test Properties Configuring JUnit Test Properties Configuring SilkTest Plan Properties Configuring NUnit Test Properties Configuring Windows Scripting Test Properties Configuring .Net Explorer Test Properties Editing Test Definitions Related Reference Test Plan Unit Interface APIs HTML Support for Description Text Boxes


Editing Test Definitions


To edit a test definition
1  Click Test Plan on the workflow bar.
2  Select the test definition or the test package that you want to edit.
   Note: Test Manager supports HTML formatting as well as the cutting and pasting of HTML content for text boxes.
3  Click Edit on the toolbar or under the General Properties section in the tab view. The Edit Test Definition dialog box displays.
4  Specify the name and description of the selected test definition.
5  If the selected test definition is a test package, the Update Package Structure on Result check box is available. Check the Update Package Structure on Result check box if you want to update the structure of the test package according to the results of the test execution run.
6  Configure the properties of the test definition or the test package according to the test definition type.

Related Concepts Upload Manager Test Plan Management Test Definition Parameters Test Packages Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Configuring Test Definition Parameters Related Reference Test Plan Unit Interface APIs HTML Support for Description Text Boxes


Executing a Trial Run of a Test Definition


After creating a test definition, you can perform a trial run to check if the newly created test definition works as intended.

To perform a trial run of a test definition


1  Click Test Plan on the workflow bar.
2  Right-click a test definition that you want to try out in the Test Plan tree.
3  Select Try Run Test Definition. The Go To Activities dialog box displays.
4  Click Yes if you want to view the Activities page (see also Activities Overview), or click No if you want to remain on the current Web page.
   Note: Check the Don't show this dialog again (during this login session) check box if you don't want to be asked about switching to the Activities page again in the future. Note that this setting will be discarded when you log out of Test Manager.
   Note: The test definition is executed as soon as you select Try Run Test Definition. You can analyze the results on the Activities page (Test Manager/Projects/Activities). See Activities Overview for detailed information about the Activities page.

Related Concepts Manual Tests Manual Test Definitions Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Working with Manual Tests Managing Test Plans Related Reference Test Plan Unit Interface


Creating Test Plans


This section explains how to create test plans in Test Manager.

In This Section
Importing SilkTest Test Plans
Describes how to import a SilkTest test plan.


Importing SilkTest Test Plans


To upload a test plan from SilkTest
1  Create a test plan in SilkTest. See the SilkTest documentation for details.
2  From the SilkTest Testplan menu, select Upload to Test Manager. The SilkCentral Administration Module Login screen displays.
3  Enter your user name and password.
4  From the Project list box on the Upload Testplan file to Test Manager dialog box, select the Test Manager project to which you are uploading the file. Click OK.
5  Click OK on the Upload Testplan Complete confirmation dialog box.
6  Open the Test Manager Test Plan unit. You will see the uploaded project listed as a test container in the Test Plan tree with the same name as the imported SilkTest test plan.
   Note: To work with the new test container, you may have to edit source control profile settings or other settings.
7  To edit the test container, select the container in the Test Plan tree. Click Edit on the Properties tab to open the Edit Test Container dialog box.
8  Edit the criteria for the test container as required.
   Note: You can find the inherited SilkTest symbols on the Parameters tab in the Test Plan View. Inherited test definition attributes can be found on the Attributes tab in both Test Plan View and in the Settings module. Inherited SilkTest queries can be found on the Filters tab in the Settings unit.

Related Concepts SilkTest Test Plans Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Generating Test Plans from Requirements View Related Reference Test Plan Unit Interface


Editing Test Plan Elements


This section explains how to edit test plan elements in Test Manager.

In This Section
Adding Links to Containers
Describes how to add a link to a container.
Adding Test Containers
Describes how to add a test container.
Adding Test Folders
Describes how to add a new test folder.
Copying, Pasting, and Deleting Test Plan Elements
Describes how to copy, paste, and delete test plan elements.
Editing SilkTest Tests
How to edit the properties of a SilkTest test.
Editing SilkPerformer Tests
How to edit the properties of a SilkPerformer test.
Editing JUnit Tests
Describes how to edit the properties of a JUnit test.
Editing NUnit Tests
Describes how to edit the properties of an NUnit test.
Editing Success Conditions
Describes how to edit success conditions.
Editing Windows Scripting Host Tests
Describes how to edit a Windows Scripting Host test.
Finding and Replacing Test Definition Properties
Describes how to find and optionally replace specified test definition property values.
Modifying Test Containers
Describes how to modify test container properties.
Modifying Test Folders
Describes how to modify a test folder.
Set a Test Plan Node as Integration Default for External Agile Planning Tools
Describes how to set a test plan node as the integration default node for the creation of tests through an external agile planning tool.


Adding Links to Containers


The New Link feature enables you to add nodes to the Test Plan tree view that directly reference other test containers in the same project. Linked test containers are visible at the position where links are inserted. Linked test containers are displayed in read-only mode.

To link a test container to the Test Plan tree


1  Click Test Plan on the workflow bar.
2  Right-click a node in the Test Plan tree menu where you want to have a linked test container appear.
3  Choose New Link if you want to link a test container at the hierarchy level of the selected node, or choose New Child Link to link a test container a hierarchy level below the selected node. The Select Test Container For Linking dialog box displays, where you can select the test container you want to link to the selected test container.
4  Click OK to confirm your selection.
   Note: If the target test container and the container to link have different Source Control values, a confirmation dialog box displays, asking you if you really want to create the link. Linking a test container with differing source control values can lead to problems when downloading or executing a test definition within the linked container. Click No if you want to change the Custom include directory of the target or of the linked container first, or click Yes to create the link anyway.
The linked container will be placed within the selected container as a read-only entity. Any changes to the original test container will be reflected in the linked container.

Related Concepts Test Plan Tree Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Related Reference Test Plan Toolbar Functions


Adding Test Containers


To add a new test container
1  Click Test Plan on the workflow bar. Click New Test Container on the toolbar (or right-click within the tree menu and choose New Test Container). A new container root node will be appended to the tree menu and the New Test Container dialog box will display.
2  Define a Name (or accept the default name) and a meaningful Description for the container.
   Note: Test Manager supports HTML formatting and cutting and pasting of HTML content for Description fields.
3  Select any pre-defined Product that is to be associated with this test container from the list box. See the SilkCentral Administration Module Help for details regarding adding product profiles.
4  To configure source control settings for this container, select a pre-defined source-control profile from the Source Control profile list box.
   Note: Defining source control profiles allows you to define where Test Manager's execution servers should retrieve program sources for test execution. See the SilkCentral Administration Module Help for details regarding source control settings.
5  Check the Clear working folder before each test execution check box to have the source control profile working folder cleared before each test execution is performed (for example, the sources will be checked out before each execution). This check box is not checked by default.
   Note: If you use an external source control system, consider that using the Clear working folder before each test execution option in conjunction with MS VSS can lead to longer wait times than when used in conjunction with CVS or Subversion.
6  To specify the default root path where the container is to be saved, click Browse... and navigate to the location.
   Note: The Custom Data Directory and Custom Include Directory fields facilitate the integration of Test Manager with functionality available with SilkPerformer 7.1 or higher. In SilkPerformer, the Include directory is divided into a System Include directory and a Custom Include directory; the Data directory is divided into a System Data directory and a Custom Data directory. See the SilkPerformer documentation for details.
7  The Hidden Test Properties portion of the dialog box allows you to specify the test property types that are to be displayed on the test container's Properties tab (and the Properties tab of all test folders within the container). These settings do not affect the display of individual test definitions. To adjust hidden test property settings: Click the Edit button associated with the Hidden Test Properties field. On the Hidden Test Properties dialog box, uncheck the check boxes of all test types for which you want to have properties displayed (SilkPerformer, SilkTest, NUnit, Windows Scripting, JUnit, and .NET Explorer). Click OK to save your settings.
8  Check the Use SilkTest interface to launch tests check box to specify that the SilkTest interface be used to open SilkTest in the execution of tests (rather than the command line).
   Note: This setting supports execution of tests created with Test Manager 8.0 or higher. When a Test Manager installation is updated to version 8.0 or higher, this check box is not checked for existing test containers. When a new test container is created, the check box is checked by default. It is not recommended that you check this option for test definitions created with versions of Test Manager earlier than 8.0.
9  Click Save to save your settings and update the tree view with the new container.

Related Concepts Test Plan Tree Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Related Reference Test Plan Toolbar Functions HTML Support for Description Text Boxes


Adding Test Folders


To add a new test folder
1 2 3 4

Click Test Plan on the workflow bar. Select an existing container or folder node in the Test Plan tree menu where you want to insert a new test folder. Click New Test Folder on the toolbar (or right-click within the tree and choose New Test Folder ). A new folder node is appended to the tree view and the New Test Folder dialog box displays. Provide a name and meaningful description for the folder. Note: Test Manager supports HTML formatting and cutting and pasting of HTML content for Description fields.

Click OK to save your settings and update the tree view with the new test folder.

Related Concepts Test Plan Tree Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Related Reference Test Plan Toolbar Functions HTML Support for Description Text Boxes


Copying, Pasting, and Deleting Test Plan Elements


Through the Test Plan toolbar, Test Manager allows you to easily delete, cut, copy, and paste test plan elements within the tree view, both within the current project and between projects. These editing functions simplify the process of building and managing your project's test plan. The data types that are copied along with test definitions and test folders are properties, attributes, parameters, and attachments. Assignments, issues, runs, and history are not copied. Tip: Using the Contents tab (Test Plan Contents), you can view, cut, copy, and paste the child elements of any selected test plan element. Standard Windows Explorer-style multi-select functionality is supported on the Contents tab. Before you can paste a test plan element into the Contents tab, you must explicitly select an element within the tab to give it the application's focus. Note: Containers cannot be copied or pasted.

To edit a test plan element


1. Click Test Plan on the workflow bar.
2. Select the test plan element (container, folder, or test definition) in the tree view to which the edit is to be applied.
3. Click the appropriate toolbar button. Note: These commands are also available through context menus in the Test Plan tree.
   Delete: Deletes the selected element from the tree.
   Cut: Cuts the selected element from the tree and moves it to the clipboard.
   Copy: Copies the selected element to the clipboard (containers cannot be copied).
   Paste: Pastes a copy of the element held on the clipboard at the same level as the currently selected element (containers cannot be pasted).

To copy and paste a folder or test definition between projects


1. Cut or copy the element to the clipboard.
2. Select the destination project through Test Manager/Projects.
3. Select the destination container and/or folder.
4. Click Paste.

You can easily reorder test containers, folders, and test definitions that are listed in the Test Plan tree using the up and down arrow buttons on the toolbar.

To reorder test definitions


1. Select the test definition that you want to move up or down in the tree.
2. Click the up arrow to move the test definition up one step in the Test Plan tree, or click the down arrow to move the test definition down one step.


Note: This process also applies to changing the order of listed test containers and test folders.

Related Concepts Test Plan Tree Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Related Reference Test Plan Toolbar Functions HTML Support for Description Text Boxes Multi-Select Functionality for Test Plan Elements Test Plan Contents Tab


Editing SilkTest Tests


To edit all the properties of a SilkTest test:
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select a test-container node, test-folder node, or SilkTest test-definition node.
4. Select the Properties tab.
5. In the SilkTest Test Properties area, click Edit (or click Edit on the toolbar) and proceed with the Creating Test Definitions procedure.

Related Concepts SilkTest Test Definitions SilkTest Tests Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Editing Test Definitions Managing Test Plans Related Reference Test Plan Unit Interface


Editing SilkPerformer Tests


To edit all the properties of a SilkPerformer test:
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select a test-container node, test-folder node, or SilkPerformer test-definition node.
4. Select the Properties tab.
5. In the SilkPerformer Test Properties area, click Edit (or click Edit on the toolbar) and proceed with the Creating Test Definitions procedure.

Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Working with SilkPerformer Projects Editing Test Definitions Managing Test Plans Related Reference Test Plan Unit Interface


Editing JUnit Tests


To edit all the properties of a JUnit test
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select a test-container node, test-folder node, or JUnit test-definition node.
4. Select the Properties tab.
5. In the JUnit Test Properties area, click Edit (or click Edit on the toolbar), and proceed with the Creating Test Definitions procedure. Note: Manual test types do not have properties associated with them.

Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Editing Test Definitions Managing Test Plans Creating Test Definitions Related Reference Test Plan Unit Interface


Editing NUnit Tests


To edit all the properties of an NUnit test
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select a test-container node, test-folder node, or NUnit test-definition node.
4. Select the Properties tab.
5. In the NUnit Test Properties area, click Edit (or click Edit on the toolbar) and proceed with the Creating Test Definitions procedure.

Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Editing Test Definitions Managing Test Plans Related Reference Test Plan Unit Interface


Editing Success Conditions


When you select the Test Plan tree view node of a specific test-definition type, the success conditions that are associated with the selected test-definition type are automatically displayed at the bottom of the Properties tab in Test Plan View. The Success conditions table tells you the name of each condition, whether or not the condition is active, the condition's Max value, and whether the condition is inherited. Note: All success conditions except the execution time-out are disabled and hidden for test package nodes.

To edit the success conditions of a test


1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select a test-container node, test-folder node, or test-definition node.
4. Select the Properties tab.
5. Click Edit to display the Edit Success Conditions dialog box.
6. Uncheck the Inherit from parent check box of any success condition you are editing.
7. Edit values as required.
8. Specify if conditions should be active or inactive by checking or unchecking their Active check boxes.
9. Click OK to save your settings.

Related Concepts Test Plan Tree Test Plan Management Success Conditions Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Related Reference Test Plan Properties tab


Editing Windows Scripting Host Tests


To edit all the properties of a WSH test
1. Click Test Plan on the Workflow Bar.
2. Click Test Plan View on the toolbar.
3. Select a test-container node, test-folder node, or WSH test-definition node.
4. Select the Properties tab in Test Plan View.
5. In the WSH Test Properties area, click Edit (or click Edit on the toolbar) and proceed with the Creating Test Definitions procedure.

Related Concepts Windows Script Host Tests Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Editing Test Definitions Managing Test Plans Related Reference Test Plan Unit Interface


Finding and Replacing Test Definition Properties


The Test Plan unit's Find command enables you to locate test definition property values that meet specified search criteria. Use the Next, Previous, First, and Last functions to step through the results of a search for a specified property value. The Test Plan unit's Replace command further enables you to replace identified property values with new values. Find/Replace functions are enabled across all Test Manager plug-ins and functional categories. Note: Data-driven test property values can be found using the Find command, but they cannot be replaced using the Replace command. Note: When the Test Plan tree is constrained by a filter, Find/Replace functions are only executed against those test definitions that are presented in the Test Plan tree after filtering.

To find a specific test definition property value:


1. Click Test Plan on the workflow bar.
2. Click Find (binoculars icon) on the toolbar to open the Find dialog box.
3. From the Category list box, select the functional category or Test Manager plug-in across which you want to search:
   General test definition properties
   Manual steps
   Test definition parameters
   Test definition attributes
   SilkTest test properties
   SilkPerformer test properties
   .NET Explorer test properties
   JUnit test properties
   NUnit test properties
   Windows Scripting test properties
   Custom plug-in properties
4. From the Find in list box, specify the property within which the query should search for the value. The properties available in this list vary based on the selected category.
5. In the Find what portion of the dialog box, enter an alphanumeric string to be submitted for the query. Optional settings are available for qualifying the query further. Check the check boxes of those that are appropriate:
   Start from selection: Specifies that the search begin from the currently selected test plan element.
   Start from top: Specifies that the search begin from the root of the Test Plan tree.
   Find in subtree only: Specifies that the search only be run in the currently selected segment of the Test Plan tree (the portion of the Test Plan tree that is available on the Contents tab).
   Case sensitive: Specifies that the string be searched case-sensitively.
   Match whole word only: Specifies that search results only include complete standalone instances of the query string.
   Include read-only values: Specifies that search results include text strings that cannot be directly edited because they are inherited from another test definition, referenced from a linked test container, or called from a data source in the course of data-driven testing.
   Note: When using a case sensitive SQL Server, case-insensitive searching is not supported for the following fields: test definition description, manual step description, manual step action description, and manual step expected results.
6. Click Find to begin the search and advance to the first test plan element returned by the query (test container, test folder, or test definition).
7. If your query returns multiple test plan elements, you will be presented with the option to advance through the elements using the following buttons on the Find menu:
   Next: Advances the view to the next returned element.
   Previous: Advances the view to the last viewed element.
   First: Advances the view to the first returned element.
   Last: Advances the view to the last returned element.
   New Find: Cancels the current search and returns the view to the Find dialog box.
   Close: Closes the Find dialog box.
Note: The Find command allows you to search test plan elements where the search string is an inherited value. This option is not allowed with the Replace command.

To replace an identified test definition property value:


1. Click Test Plan on the workflow bar.
2. Click Replace on the toolbar to open the Replace dialog box.
3. From the Category list box, select the functional category or Test Manager plug-in across which you want to search:
   General test definition properties
   Manual steps
   Test definition parameters
   Test definition attributes
   SilkTest test properties
   SilkPerformer test properties
   .NET Explorer test properties
   JUnit test properties
   NUnit test properties
   Windows Scripting test properties
   Custom plug-in properties
4. From the Find in list box, specify the property within which the query should search for the value. The properties available in this list vary based on the selected category.
5. In the Find what portion of the dialog box, enter an alphanumeric string to be submitted for the query. Optional settings are available for qualifying the query further. Check the check boxes of those that are appropriate:
   Start from selection: Specifies that the search begin from the currently selected test plan element.
   Start from top: Specifies that the search begin from the root of the Test Plan tree.
   Find in subtree only: Specifies that the search only be run in the currently selected segment of the Test Plan tree (the portion of the Test Plan tree that is available on the Contents tab).
   Case sensitive: Specifies that the string be searched case-sensitively.
   Match whole word only: Specifies that search results only include complete standalone instances of the query string.
   Note: When using a case sensitive SQL Server, case-insensitive find/replace is not supported for the following fields: test definition description, manual step description, manual step action description, and manual step expected results.
6. In the Replace with text box, enter the alphanumeric string that is to replace instances of the queried string.
7. Click Find to begin the search and advance to the first test plan element returned by the query (test container, test folder, or test definition), or click Replace all to replace all instances of the queried string with the replacement string. If you select Find and the query returns multiple test plan elements, you will be presented with the option to advance through the elements using the following buttons on the Replace dialog box:
   Find Next: Advances the view to the next returned element.
   Find Previous: Advances the view to the last viewed element.
   Replace: Replaces the displayed instance of the queried string with the replacement string.
   Replace All: Replaces all instances of the queried string with the replacement string.
   Close: Closes the Replace dialog box.

Warning: Data-driven settings and properties cannot be replaced. Related Concepts Test Plan Tree Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Related Reference Test Plan Toolbar Functions


Modifying Test Containers


To modify test container properties
1. Click Test Plan on the workflow bar.
2. Select the test container that you are editing.
3. Select the Properties tab in Test Plan View. Beneath the container's property fields, click Edit to open the Edit Test Container dialog box. You can also click Edit on the toolbar to open the Edit Test Container dialog box.
4. Edit the container properties as required; the available settings are the same as those described in Adding Test Containers.
5. Click OK to accept your changes.

Related Concepts Test Plan Tree Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Adding Test Containers Managing Test Plans Related Reference Test Plan Toolbar Functions


Modifying Test Folders


To modify test folder properties
1. Click Test Plan on the workflow bar.
2. Select the test folder that you want to edit.
3. Select the Properties tab in Test Plan View.
4. Beneath the folder Name/Description details, click Edit to open the Edit Test Folder dialog box. You can also click Edit on the toolbar to open the Edit Test Folder dialog box.
5. Edit the name and description of the folder as required.
6. Click OK to accept your changes.

Related Concepts Test Plan Tree Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Adding Test Folders Managing Test Plans Related Reference Test Plan Toolbar Functions


Set a Test Plan Node as Integration Default for External Agile Planning Tools
To use the Web service calls to create tests in Test Manager through an external agile planning tool, you have to set a folder or container in the test plan tree as the integration default node, where the Web service will create the test. If you do not specify the integration default node, an error message box displays.

To specify the integration default node in the test plan tree:


1. Click Test Plan on the workflow bar.
2. Right-click the folder or container in the test plan tree that you want to set as the integration default node.
3. Choose Set as Integration Default. Note: If an integration default node already exists, the default node is changed to the new node.

The integration default node is set to the selected node, enabling the agile planning tool to create tests at this location. Note: The integration default node is shown in the Properties page of the project, in which the node is located.
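The Web service interface itself is documented separately; the following is only a rough, hedged sketch of how an external agile planning tool might create a test once an integration default node has been set. The endpoint path, operation names (logonUser, addNode), and type name used here are assumptions chosen for illustration, not the documented Test Manager API; consult the Test Manager Web Service documentation for the actual WSDL locations and signatures.

# Illustrative sketch only -- the service endpoint, operation names, and type
# name below are assumptions, not the documented Test Manager Web service API.
from suds.client import Client   # third-party SOAP client library

HOST = "http://tmserver:19120"   # hypothetical Test Manager front-end server
planning = Client(HOST + "/services/tmplanning?wsdl")   # assumed planning service

# Log on and obtain a session token (operation name assumed).
session = planning.service.logonUser("admin", "admin")

# Because an integration default node has been set, the external tool does not
# need to know where in the test plan tree the test should go -- the Web
# service creates the test under the integration default node.
test = planning.factory.create("TestNode")               # assumed type name
test.name = "User story 123 - acceptance test"
test.description = "Created by an external agile planning tool"
planning.service.addNode(session, test)                   # operation name assumed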

Related Concepts Test Plan Tree Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Related Reference Test Plan Toolbar Functions


Working with Attachments


This section explains how to work with attachments in Test Manager.
In This Section
Deleting Attachments from Test Plan Elements: Describes how to delete an attachment from a test plan element.
Attaching Files to Test Plan Elements: Describes how to attach a file to a test plan element.
Attaching Links to Test Plan Elements: Describes how to attach a link to a test plan element.
Editing Attachment Descriptions: Describes how to edit an attachment description.
Viewing Test Plan Attachments: Describes how to view a test plan attachment.


Deleting Attachments from Test Plan Elements


To delete an attachment from a test plan element
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select the element for which you are deleting an attachment.
4. Select the Attachments tab to see a list of all attachments that are associated with the element.
5. Click the delete icon of the attachment you want to delete.
6. Click Yes on the confirmation dialog box to delete the attachment from the project. Note: Only one attachment at a time can be deleted.

Related Concepts Attachments Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Working with Attachments Managing Test Plans Related Reference Test Plan Attachments tab


Attaching Files to Test Plan Elements


To attach a file to a test plan element
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select a container, folder, or test definition.
4. Select the Attachments tab.
5. Click Upload File to open the Upload File dialog box.
6. Using Browse, select a file from your local file system.
7. Enter a meaningful description for the attachment.
8. Click Upload File to upload the attachment to the server and associate it with the selected element.

Note: Attaching files to a test plan element may not work in Mozilla Firefox. Firefox requires the usage of three slashes (for example: "file:///") for a file link, while other browsers require only two (for example: "file://"). Additionally, Firefox includes a security feature that blocks links from remote files to local files and directories. For more information, see http://kb.mozillazine.org/Firefox_:_Issues_:_Links_to_Local_Pages_Don't_Work
Related Concepts Attachments Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Working with Attachments Managing Test Plans Related Reference Test Plan Attachments tab


Attaching Links to Test Plan Elements


To attach a link to a test plan element
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select a container, folder, or test definition.
4. Select the Attachments tab.
5. Click Attach Link to open the Attach URL dialog box.
6. Enter a URL in the URL field.
7. Enter a meaningful description for the attached link.
8. Click Attach URL to associate the link with the selected element.

Related Concepts Attachments Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Working with Attachments Managing Test Plans Related Reference Test Plan Attachments tab


Editing Attachment Descriptions


To edit an attachment description
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select the element for which you want to edit an attachment description.
4. Select the Attachments tab to see a list of all attachments that are associated with the element.
5. Click the edit icon of the attachment for which you want to edit the description.
6. Edit the description on the Edit File Attachment dialog box.
7. Click OK to save your changes.

Related Concepts Attachments Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Working with Attachments Managing Test Plans Related Reference Test Plan Attachments tab


Viewing Test Plan Attachments


Attached files and links are listed on the Attachments tab for the selected test plan element. Attachments are displayed in the order in which they are uploaded, though the list of attachments can be sorted by Name, Created On, and Created by properties. To display attachments that are associated with child elements of the selected element, check the Include Child Attachments check box.

To view a test plan attachment


1. From Test Plan View, select the element for which you want to view an attachment.
2. Select the Attachments tab to see a list of all attachments that are associated with the element. Each attachment name serves as a link. File-attachment links open a Save As dialog box, enabling you to download the attachment to your local file system. Link attachments link directly to the link destinations in a new browser window.

Related Concepts Attachments Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Working with Attachments Managing Test Plans Related Reference Test Plan Attachments tab


Working with Data-Driven Tests


This section explains how to work with data-driven tests in Test Manager.
In This Section
Adding a Data Source Value to a Manual Test Step: Describes how to add a data source value to a manual test step.
Creating Data-Driven Test Definitions: Describes how to create a data-driven test definition.
Downloading CSV Data From a Data Source: Describes how to download CSV data from a data source.
Editing Data-Driven Properties: Describes how to edit data-driven properties.


Adding a Data Source Value to a Manual Test Step


To add a data source value to a manual test step:
1. Click Test Plan on the workflow bar.
2. Create a new data-driven test definition (select Manual as the test type and configure test steps). Note: To view the values included in your data source, click the Data Set tab of your test definition.
3. Select the Steps tab of your test definition.
4. Click the Edit Test Step icon in the Actions column of the test step that is to reference the data source value.
5. In the Action description text box, enter a parameter that references the relevant column in your data source, using the syntax ${<column name>}. For example, if you want a test step to retrieve password parameters from a spreadsheet that has a column called Password, you would write the parameter as ${Password}. When you execute the manual test step, the parameter is replaced by the actual value from the corresponding row of the data-driven data source (see the sketch after these steps).
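To make the substitution behavior concrete, here is a minimal sketch of the idea, assuming a data source row is a simple mapping of column names to values. It only illustrates how a ${column} reference resolves against a row; Test Manager performs the actual substitution internally when the manual test is executed.

# Minimal sketch of ${column} substitution, assuming a data source row is a
# plain mapping of column names to values; Test Manager does this internally.
import re

def resolve_step(description, row):
    """Replace ${column} references in a step description with row values."""
    return re.sub(r"\$\{(.+?)\}", lambda m: str(row[m.group(1)]), description)

row = {"Username": "jsmith", "Password": "s3cret"}   # one example data row
print(resolve_step("Log in as ${Username} using password ${Password}.", row))
# -> Log in as jsmith using password s3cret.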

Related Concepts Manual Tests Manual Test Definitions Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Creating Data-Driven Test Definitions Working with Manual Tests Working with Data-Driven Tests Working with Manual Tests Managing Test Plans Related Reference Test Plan Unit Interface


Creating Data-Driven Test Definitions


To create a data-driven test definition
1. Click Test Plan on the workflow bar. Create a new test definition. See the topic Creating a Test Definition for information about creating a test definition.
2. Select the newly created test definition's Properties tab.
3. Scroll down to the Data-driven Properties section of the Properties tab and select the Edit icon to open the Data-driven Properties dialog box.
4. Select a pre-configured data source from the Data Source list box. See the SilkCentral Administration Module documentation for information on configuring data sources. Click Next to continue.
5. Select a data set from the Data Set list box (in the case of Excel data sources, this is a worksheet name; in the case of database data sources, this is a table name).
6. Check the Each data row is a single test definition check box to have each row in your data set considered to be a separate test definition, or leave this check box unchecked to create a single test definition for all data rows of your data set.
7. (Optional) Enter a SQL query into the Filter query field to filter your data set based on a SQL-syntax query. Note: Only simple WHERE clause queries are supported (see the sketch after these steps).
8. Check the Enable data-driven properties check box to enable data-driven functionality.
9. Click Finish to save your settings.
Note: Data-driven property settings are visible in the lower portion of each test definition's Properties tab. To use Test Manager's data-driven test functionality with SilkPerformer scripts, data sources with column names matching the corresponding SilkPerformer project attributes must be used in conjunction with "AttributeGet" methods.
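As a rough illustration of the kind of simple WHERE-clause filter the Filter query field accepts, the sketch below applies the same filter text to an in-memory copy of an example data set. The column names (Browser, Tier) and the filter string itself are assumptions about an example data set, not required names; Test Manager evaluates the filter against your configured data source.

# Illustration only: the kind of simple WHERE-clause text the Filter query
# field accepts, applied here to an in-memory example table. Column names
# (Browser, Tier) are assumptions about an example data set.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE DataSet (Browser TEXT, Tier INTEGER)")
con.executemany("INSERT INTO DataSet VALUES (?, ?)",
                [("Firefox", 1), ("Internet Explorer", 3), ("Firefox", 2)])

filter_query = "Browser = 'Firefox' AND Tier <= 2"   # a simple WHERE clause
rows = con.execute("SELECT * FROM DataSet WHERE " + filter_query).fetchall()
print(rows)   # only the matching rows would drive test definition runs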

Related Concepts Manual Tests SilkTest Test Definitions Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Working with Data-Driven Tests Managing Test Plans Related Reference Test Plan Data Set tab


Downloading CSV Data From a Data Source


To download CSV data from a data source
1. Click Test Plan on the workflow bar.
2. Select a test definition that relies on the data source from which you want to download data.
3. Select the Properties tab.
4. Click the Download button (in the Actions column) of either the data source or the data set, depending on which entity contains the data you want to download.
5. Specify the location on your local system to where the data is to be downloaded.
6. Click OK to download the data in CSV format.

Related Concepts Manual Tests Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Creating Data-Driven Test Definitions Working with Data-Driven Tests Managing Test Plans Related Reference Test Plan Data Set tab


Editing Data-Driven Properties


Editing data-driven properties
1. Click Test Plan on the workflow bar.
2. Select the test definition that has the property you want to edit.
3. Select the Properties tab.
4. Select the Edit icon that corresponds to the property you are editing (in the Actions column).
5. Edit the property as required.
6. Click OK to save your changes.

Related Concepts SilkTest Test Definitions Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Creating Data-Driven Test Definitions Working with Data-Driven Tests Managing Test Plans Related Reference Test Plan Properties tab


Working with Manual Tests


This section explains how to work with manual tests in Test Manager.
In This Section
Converting Manual Test Definitions to Automated Tests: Describes how to convert a manual test definition to an automated test.
Editing Manual Test Steps From Within Test Manager: Describes how to edit a manual test step through the Test Manager Web interface.


Converting Manual Test Definitions to Automated Tests


To convert a manual test definition to an automated test:
1. Click Test Plan on the workflow bar. Right-click a manual test definition in the Test Plan tree and select Automate with... .
2. Select one of the following test types from the list:
   SilkTest Test
   SilkPerformer Test
   NUnit Test
   Windows Scripting Test
   JUnit Test
   SilkTest Plan
   .NET Explorer Test
   ProcessExecutor Test
   Depending on the test type you select, the appropriate properties dialog box opens.
3. Proceed to the appropriate topic in Help for information on filling out the dialog:
   If you are converting to a SilkTest test, proceed to Configuring a SilkTest Test.
   If you are converting to a SilkPerformer test, proceed to Configuring a SilkPerformer Test.
   If you are converting to an NUnit test, proceed to Configuring an NUnit Test.
   If you are converting to a Windows Scripting test, proceed to Configuring a Windows Scripting Test.
   If you are converting to a JUnit test, proceed to Configuring a JUnit Test.
   If you are converting to a SilkTest plan test, proceed to Configuring a SilkTest Plan Test.
   If you are converting to a .NET Explorer test, proceed to Configuring a .NET Explorer Test.

Related Procedures Managing Test Plans - Quick Start Task Configuring SilkTest Test Properties Configuring SilkPerformer Test Properties Configuring NUnit Test Properties Configuring Windows Scripting Test Properties Configuring JUnit Test Properties Configuring SilkTest Plan Properties Configuring .Net Explorer Test Properties


Editing Manual Test Steps From Within Test Manager


Tip: The Manual Testing Client, an Eclipse-based client tool, is the recommended tool for executing manual tests with Test Manager. The Manual Testing Client is a separate executable from Test Manager. To install the Manual Testing Client, navigate to Help > Tools > Manual Testing Client in SilkCentral Test Manager.

To edit a manual test step:


1. Within Test Manager, click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select the node of the manual test for which you are editing a test step.
4. Select the Steps tab.
5. Do one of the following to open the Edit Manual Test Definition Step dialog box:
   Press F2.
   Press ALT and double-click the step you want to edit.
   In the Actions column of the step you want to edit, click Edit Test Step.
6. Edit step details as required. Note: Values from data sources can be inserted into manual test steps in the form of parameters. Test Manager supports HTML formatting and cutting and pasting of HTML content for Description fields.
7. Click OK to save your changes.

Related Concepts Manual Tests Test Definitions in the Manual Testing Client Test Plan Management Manual Testing Client Related Procedures Managing Test Plans - Quick Start Task Creating Test Definitions Adding a Data Source Value to a Manual Test Step Working with Manual Tests Managing Test Plans Related Reference Current Run Page HTML Support for Description Text Boxes Multi-Select Functionality for Test Plan Elements


Working With Test Definitions in Grid View


This section explains how to work with test definitions in Test Manager Grid View.
In This Section
Creating an Execution Definition in Grid View: Describes how to create an execution definition in Grid View.
Displaying/Hiding Columns in Grid View: Describes how to display/hide columns in Grid View.
Filtering Test Definitions in Grid View: Describes how to filter test definitions in Grid View.
Grouping Test Definitions in Grid View: Describes how to group test definitions for easier viewing in Grid View.
Linking to Test Definitions from Grid View: Describes how to link to a test definition in Test Plan view directly from Grid View.
Removing Grid View Filters: Explains how to remove filters that have been applied to columns in Grid View.
Reordering Columns in Grid View: Describes how to reorder columns in Grid View.
Resizing Columns in Grid View: Describes how to change the width of Grid View columns.
Restoring Default Grid View Settings: Explains how to restore the default Grid View settings for your project.
Sorting Test Definitions in Grid View: Describes how to sort test definitions in Grid View.


Creating an Execution Definition in Grid View


To create an execution definition in Grid View:
1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Select the test definitions you want to assign to your execution definition by using the multi-select feature of the Grid View.
4. Right-click the test definitions and choose Create Execution Definition. The New Execution Definition dialog box displays. Enter the specifications of your new execution definition.
Note: All selected test definitions must be in the same container. If they are not, the execution definition is not created and an error message displays.
Note: The test container is preselected in the New Execution Definition dialog box and cannot be altered.
Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Working With Test Definitions in Grid View Working with Filters Adding Execution Definitions Related Reference Test Plan Grid View Test Plan Unit Interface


Displaying/Hiding Columns in Grid View


To display/hide columns in Grid View:
1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Right-click a column header.
4. Expand the Columns submenu to view all the columns that are available in the project.
5. Check the check boxes of all the columns you want to have displayed in Grid View. Your column-display preferences will be saved and displayed each time you open the active project.

Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Working With Test Definitions in Grid View Working with Filters Related Reference Test Plan Grid View Test Plan Unit Interface


Filtering Test Definitions in Grid View


You can filter the test-definition list based on column values. You can specify filter strings to be applied to text-based data fields, calendar filters (using Before, After, or On operators) for date-based fields, and numerical operators (>, <, and =) for number-based fields.

To filter text-based values in Grid View:


1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Right-click the header of the text-based column that the filter is to be based on.
4. Expand the Filters submenu on the context menu to display the Filters text box.
5. Enter a text string into the text box.
6. Press ENTER. All test definitions that match the filter criteria (for example, in the case of test definition names, all test-definition names that include the specified string) are then dynamically displayed in the filtered list.

To filter date-based values in Grid View:


1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Right-click the header of the date-based column that the filter is to be based on.
4. Hold your cursor over Filter on the context menu to display the Before, After, and On submenu. Hold your cursor over After to define a date before which (and including) all test definitions should be excluded; hold your cursor over Before to define a date after which (and including) all test definitions should be excluded; hold your cursor over On to exclude all test definitions except those that have the specified date. The calendar tool displays.
5. Select a date using the calendar tool (or click Today to specify today's date). Tip: You must explicitly click a date on the calendar tool or press ENTER to activate date-based filtering changes.
All test definitions that match the filter criteria are then dynamically displayed in the filtered list.

To filter number-based values in Grid View:


1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Right-click the header of the number-based column that the filter is to be based on.
4. Expand the Filters submenu on the context menu to display the > (greater than), < (less than), and = (equals) operators.
5. Enter a number in the > text box to define a number less than which (and including) all test definitions should be excluded; enter a number in the < text box to define a number greater than which (and including) all test definitions should be excluded; or enter a number in the = text box to exclude all test definitions except those that have the specified number. Note: Number values are rounded to two decimal places.
6. Press ENTER. All test definitions that match the filter criteria are then dynamically displayed in the filtered list.

To filter Boolean values in Grid View:


1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Right-click the header of the Boolean-based column that the filter is to be based on.
4. Expand the Filters submenu on the context menu to display the available values.
5. Click one of the Yes or No option buttons. All test definitions that match the filter criteria are then dynamically displayed in the filtered list.

To filter values using a predefined list in Grid View:


1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Right-click the header of the column that has a predefined filter value (for example, NodeType) that the filter is to be based on.
4. Expand the Filters submenu on the context menu to display the available values.
5. Check the check boxes of the filter values that you are interested in. All test definitions having one of the selected criteria will be displayed.

Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Working With Test Definitions in Grid View Working with Filters Related Reference Test Plan Grid View Test Plan Unit Interface


Grouping Test Definitions in Grid View


Beyond simply sorting by column, you can chunk test definitions into groups to facilitate viewing. Groups are based on commonly-shared values within the column that grouping is based on.

To group test definitions in Grid View:


1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Right-click the header of the column that the grouping is to be based on.
4. Select Group by This Field. Test definitions are then organized into groups based on commonly-shared values within the column you have selected.

To remove test definition grouping:


1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Right-click any column.
4. Uncheck the Show in Groups check box.

Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Working With Test Definitions in Grid View Working with Filters Related Reference Test Plan Grid View Test Plan Unit Interface


Linking to Test Definitions from Grid View


You can link to a test definition in Test Plan view directly from the Grid View.

To link to a test definition's Properties tab from Grid View:


1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Right-click a test definition row.
4. Select Go to test to advance to the node of the test definition in Test Plan view. Note: Alternatively, you can click a test's ID link in Grid View to advance to the associated test definition.

Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Working With Test Definitions in Grid View Working with Filters Related Reference Test Plan Grid View Test Plan Unit Interface


Removing Grid View Filters


Note: Hiding a column removes all filters that have been applied to the column.

To remove a specific Grid View filter:


1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View. Note: You can identify filtered columns by their titles, which are displayed in bold, italic text.
3. Right-click the header of the column that has the filter you want to remove.
4. Uncheck the Filters check box.

To remove all Grid View filters:


1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Right-click any column header.
4. Select Reset Filters.

Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Working With Test Definitions in Grid View Working with Filters Creating a Filter for a Folder or Container Related Reference Test Plan Grid View Test Plan Unit Interface


Reordering Columns in Grid View


To reorder columns in Grid View:
1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Select the column header of the column you want to move.
4. Drag the column to the desired position and release it. Your column-order preferences will be saved and displayed each time you open the active project.

Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Working With Test Definitions in Grid View Working with Filters Related Reference Test Plan Grid View Test Plan Unit Interface


Resizing Columns in Grid View


To adjust the width of a Grid-View column:
1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Select the vertical column-header divider of the column you want to adjust.
4. Drag the column boundary to the desired position and release it. Your column-width preferences will be saved and displayed each time you open the active project.

Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Working With Test Definitions in Grid View Working with Filters Related Reference Test Plan Grid View Test Plan Unit Interface


Restoring Default Grid View Settings


Restoring default Grid View resets all user-defined settings (column order, column width, shown/hidden columns, applied filters, sorting, and grouping) for the current project.

To restore default Grid View settings:


1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Right-click any column header.
4. Select Reset View.

Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Working With Test Definitions in Grid View Working with Filters Related Reference Test Plan Grid View Test Plan Unit Interface


Sorting Test Definitions in Grid View


To sort test definitions in Grid View:
1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Right-click the header of the column you want the test definitions to be sorted by.
4. Select Sort Ascending to have the test definitions sorted in ascending order (or select Sort Descending to have the test definitions sorted in descending order). Your sort preferences will be saved and displayed each time you open the active project.

Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Working With Test Definitions in Grid View Working with Filters Related Reference Test Plan Grid View Test Plan Unit Interface


Creating a Filter for a Folder or Container


To create a filter for a folder or container:
1. Click Test Plan on the workflow bar.
2. Select Document View or Test Plan View from the toolbar.
3. Right-click the folder or container you want to filter and choose Filter Subtree.

Note: To remove filtering and display all elements, select <No Filter> from the Filter list box on the toolbar. Note: Empty folders are not shown in the filtered subtree. Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Related Reference Test Plan Toolbar Functions Test Plan Unit Interface


Expanding/Collapsing the Test Plan tree


You can consolidate levels of the Test Plan tree or display all levels of the tree based on your viewing needs.

To collapse or expand levels of the tree:


1. Click Test Plan on the Workflow Bar.
2. Right-click within the Test Plan tree.
3. Select a collapse or expand option.

Related Concepts Test Plan Tree Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Creating Test Definitions Managing Test Plans Related Reference Test Plan Unit Interface


Tracking Test Plan History


To view a test plan element's history
1. Click Test Plan on the workflow bar.
2. Select a container, folder, or test definition in the test plan tree.
3. Click Test Plan View in the toolbar.
4. Select the History tab.
5. The properties of all elements are then displayed in tabular format.

Related Concepts Recent Changes Change-Notification Emails Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Managing Test Plans Related Reference Requirement History tab Test Plan Toolbar Functions Test Plan Unit Interface


Updating Execution Definitions


To display current execution-definition content and the latest test-plan filter results
1. Click Test Plan on the workflow bar.
2. Select the project node or a test container node in the Test Plan tree.
3. Click Update Execution on the toolbar. Note: Alternatively, you can right-click a test container node or the project node and select Update Executions.

The Executions list is then updated.

Related Concepts Execution Definitions Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Managing Test Plans Related Reference Test Plan Toolbar Functions Test Plan Assigned Executions tab


Using Upload Manager


Upload Manager is available for download through Test Manager's Tools page (Help > Tools > Downloadable Client Tools). Once downloaded to your local system, Upload Manager can be started directly as a standalone application, independent of Test Manager.

To download and start Upload Manager:


1. Go to Help > Tools > Downloadable Client Tools.
2. Click the Upload Manager link and save SetupUploadManager.exe to your local system using your browser's download dialog.
3. Double-click SetupUploadManager.exe to start the InstallShield Wizard for Upload Manager.
4. Follow the InstallShield Wizard's prompts, entering your name, company name, and target destination for the installation. Click Finish to complete the installation.

To upload a file to the SilkCentral file pool


1. Start Upload Manager by double-clicking the application's executable file (UploadManager.exe). The Select Target Location dialog box displays.
2. Select the option to upload the file to the SilkCentral server file pool. Click Next.
3. Click Add (unless the file you want to upload is already visible in the Filename field) to browse to and select the file that you are uploading. Note: It is not possible to add file descriptions when uploading files to the file pool.
4. When the file that you want to upload displays in the Filename field, click Next.
5. Enter the connection parameters for your SilkCentral installation, beginning with the Hostname of the computer that hosts your Test Manager installation (note that the name should not include a protocol designation). Enter the installation's Port. Check the Secure check box if the connection is a secure connection (such as HTTPS). Enter the Username and Password login credentials that are required for your SilkCentral server. (Optional) Click Set as Default to have these parameters presented to you automatically the next time you run Upload Manager. (Optional) Click Internet Options to configure Internet settings for your connection (for example, proxy server settings).
6. Click Next.
7. Verify all of the information you have entered in the Upload files field. Click Back if you need to make any changes on a previous page. Accept the default setting for the Close this window when the upload is complete check box.
8. Click Finish to begin the upload process. When the upload is complete, Upload Manager will close and the uploaded file will be available in the SilkCentral file pool.


Related Concepts Upload Manager Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Creating New Issues Creating Test Definitions Managing Test Plans


Viewing Assigned Executions


To view a list of executions that are assigned to a test plan
1. Click Test Plan on the workflow bar.
2. Select the test plan for which you want to view the assigned executions.
3. Select the Assigned Executions tab to view the complete list of executions that are assigned to the selected test plan.

Related Concepts Execution Definitions Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Managing Test Plans Related Reference Test Plan Assigned Executions tab


Viewing Recent Changes


To view recent changes to requirements or test definitions
1. Click Test Plan (or Requirements) on the workflow bar.
2. Click Show Changes to filter out all requirements, test definitions, folders, and containers except those that have been changed since your last change acknowledgement (note that the recent changes filter is selected automatically in the Filter list box). Recent-change filtering is active across the Test Plan View tabs and Document View. Note: Once the recent changes filter has been activated, clicking Show Changes toggles Show All mode; click Show Changes again to remove filtering and see all test definitions in the tree view.

When you have reviewed the changes, you can accept them by clicking Acknowledge. The acknowledge function resets the recent changes filter. Note: All test-plan changes generate time-stamped entries in the test plan history.

Related Concepts Recent Changes Change-Notification Emails Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Related Reference Requirement History tab Test Plan Toolbar Functions Test Plan Unit Interface


Executing Test Definitions


This section explains how to execute test definitions with Test Manager.
In This Section
Analyzing Test Runs: This section explains how to analyze test runs with Test Manager.
Assigning Test Definitions to Execution Definitions: This section explains how to assign test definitions to execution definitions.
Configuring Deployment Environments: This section explains how to configure deployment environments with Test Manager.
Configuring Execution Dependencies: This section explains how to configure execution dependencies with Test Manager.
Defining Execution Definition Schedules: This section explains how to schedule execution definitions in Test Manager.
Executing Manual Tests: This section explains how to execute manual tests with Test Manager.
Running Automated Tests: This section explains how to run automated tests with Test Manager.
Working with Execution Definitions: This section explains how to work with execution definitions in Test Manager.
Working with SilkPerformer Projects: This section explains how to work with SilkPerformer projects within Test Manager.
Collapsing or Expanding the Execution Tree: How to collapse and expand the Execution Tree.
Configuring Setup and Cleanup Executions: Describes how to configure setup and cleanup executions.
Creating Data-Driven Execution Definitions: Describes how to create a data-driven execution definition.


Analyzing Test Runs


This section explains how to analyze test runs with Test Manager.
In This Section
Changing the Status of a Test Execution Run: Describes how to manually change the status of a specific test definition run.
Deleting Individual Test Run Results: Describes how to delete the results of a specific test run.
Deleting the Results of an Execution Definition: Describes how to delete some or all of an execution definition's run results.
Viewing Test Execution Details: Describes how to view test execution details.


Changing the Status of a Test Execution Run


You can manually change the status of a specific test definition run.

To manually change the status of a test execution:


1. Click Execution on the workflow bar.
2. Select an execution definition in the Execution tree view.
3. Click the Runs tab.
4. Select the execution definition run. The test definition section of the Runs page lists the test definition runs.
5. Click the Run ID of the test definition. The Test Definition Run Results dialog box displays.
6. On the Details page, click Change Status to open the Change Status dialog box.
7. Select the new status for the test definition run from the New Status list box.
8. Type an explanation for the manual status change in the Comment text box. Note that inserting a comment is mandatory.
9. Click OK to confirm the status change.

Note: Status changes produce history changes. To view the history of all status changes for the test definition execution run, click the Messages tab in the Test Definition Run Results dialog box. Related Concepts Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Creating Test Definitions Working with Data-Driven Tests Executing Test Definitions Related Reference Test Definition Run Results Dialog Execution Runs Tab


Deleting Individual Test Run Results


To delete the results of a specific test run:
1. Click Execution on the workflow bar.
2. Select an execution definition in the Execution tree view.
3. Select the Runs tab.
4. Click Delete (in the Actions column) of the execution run for which you want to delete results.
5. Click Yes on the subsequent confirmation dialog box to complete the deletion.

Note: To delete some or all of the results of an entire execution definition, click Delete Results in the lower pane. Related Concepts Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Creating Test Definitions Deleting the Results of an Execution Definition Working with Data-Driven Tests Executing Test Definitions Related Reference Test Definition Run Results Dialog Execution Runs Tab


Deleting the Results of an Execution Definition


To delete some or all of an execution definition's run results:
1. Click Execution on the workflow bar.
2. Select an execution definition in the Execution tree view.
3. Select the Runs tab.
4. Click Delete Results in the lower pane (note that this button is only available when results are available for deletion). The Delete Results dialog box displays.
5. Specify which results you want to delete:
   All runs except the last run: Deletes the results of all execution definition runs except the results of the most recent run.
   All runs within the time span: Allows you to define a specific time span during which run results are to be deleted. With this option selected, click the calendar tool next to the From and To fields to specify dates.
6. Click OK to confirm the deletion.
Note: In the Actions column of a specific execution run, click the delete icon to delete the results of the run.
Related Concepts Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Creating Test Definitions Deleting Individual Test Run Results Working with Data-Driven Tests Executing Test Definitions Related Reference Test Definition Run Results Dialog Execution Runs Tab


Viewing Test Execution Details


To view the details of a test execution
1. Click Execution on the workflow bar.
2. Select an execution definition in the Execution tree view.
3. Select the Runs tab.
4. Click the Run ID of the execution for which you want to see details.
5. Detailed information about the results of the execution definition is displayed.

Related Concepts Test Definition Execution Execution Definition Run Results Dialog Related Procedures Managing Test Executions - Quick Start Task Creating Test Definitions Working with Data-Driven Tests Executing Test Definitions Related Reference Test Definition Run Results Dialog Execution Runs Tab


Assigning Test Definitions to Execution Definitions


This section explains how to assign test definitions to execution definitions, locate assigned test definitions for a given execution definition, and remove assigned test definitions from an execution definition.
In This Section
Locating Test Definitions Assigned to Execution Definitions: Describes how to locate manually assigned test definitions in the Available Test Definitions tree.
Removing Test Definition Assignments: Describes how to remove test-definition assignments from execution definitions.
Assign Test Definitions from Grid View to Execution Definitions: Describes how to assign test definitions from Grid View to execution definitions.
Creating an Execution Definition in Grid View: Describes how to create an execution definition in Grid View.
Manually Assigning Test Definitions to Execution Definitions: Describes how to manually assign test definitions to execution definitions.
Using a Filter to Assign Test Definitions to Execution Definitions: Describes how to use a filter to assign test definitions to execution definitions.


Locating Test Definitions Assigned to Execution Definitions


To locate manually assigned test definitions in the Available Test Definitions tree
1. Click Execution on the workflow bar.
2. Select an execution definition.
3. Select the Assigned Test Definitions tab.
4. In the Actions column of a test definition, click the locate icon to find out in which test folder or container the test definition is stored.
5. The corresponding parent folder is then expanded and the assigned test definition is highlighted in blue.

Related Concepts Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Executing Test Definitions Related Reference Execution Assigned Test Definitions Tab

471

Removing Test Definition Assignments


Manually assigned test definitions can be removed from an execution definition.

To remove a manually assigned test definition


1. Click Execution on the workflow bar.
2. Select an execution definition.
3. Select the Assigned Test Definitions tab.
4. In the Actions column, click the delete button (resembles an X mark) of the assigned test definition you are deleting. Repeat this step for all assignments that you want to delete.
   Tip: To remove all assigned test definitions, click Remove All.
5. Click Apply to save the assigned test-definition list.
   Note: If you do not click Apply, changes you make to the Assigned Test Definitions will be lost.

Related Concepts Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Executing Test Definitions Related Reference Execution Assigned Test Definitions Tab

472

Assign Test Definitions from Grid View to Execution Definitions


The test definitions that are assigned to the selected execution definitions are listed on the Assigned Test Definitions tab (Execution View only). The properties of the assigned test definitions that are listed in tabular view include:

Test Definition Name
Test Definition Status
Last Execution of the test definition

To assign one or more test definitions from the test plan Grid View to one or more execution definitions:
1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar.
3. Select the test definitions you want to assign to execution definitions. You can use your keyboard's Ctrl and Shift keys to select multiple test definitions using standard browser multi-select functions.
4. Right-click the selected test definitions and choose Save Selection.
5. Click Execution on the workflow bar.
6. Select the execution definition to which you want to assign the selected test definitions.
7. Choose Assigned Test Definitions.
8. Click Assign Saved Selection.

Note: Only test definitions that reside in the execution definition's test container are inserted. You can insert the selected test definitions into more than one execution definition, but you cannot insert them into execution definitions in a different project. The selection persists until you make a different selection or close Test Manager.

Related Concepts Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Executing Test Definitions Using a Filter to Assign Test Definitions to Execution Definitions Manually Assigning Test Definitions to Execution Definitions Related Reference Execution Assigned Test Definitions Tab

473

Creating an Execution Definition in Grid View


To create an execution definition in Grid View:
1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Select the test definitions you want to assign to your execution definition by using the multi-select feature of the Grid View.
4. Right-click the test definitions and choose Create Execution Definition. The New Execution Definition dialog box displays.
5. Enter the specifications of your new execution definition.

Note: All selected test definitions must be in the same container. If not, the execution definition is not created and an error message displays.
Note: The test container is preselected in the New Execution Definition dialog box and cannot be altered.

Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Working With Test Definitions in Grid View Working with Filters Adding Execution Definitions Related Reference Test Plan Grid View Test Plan Unit Interface

474

Manually Assigning Test Definitions to Execution Definitions


The test definitions that are assigned to the selected execution definition are listed on the Assigned Test Definitions tab (Execution View only). The properties of the assigned test definitions that are listed in tabular view include:

Test Definition Name
Test Definition Status
Last Execution of the test definition

To manually assign test definitions to an execution definition


1. Click Execution on the workflow bar.
2. Select the execution definition to which you are assigning test definitions.
3. Select the Assigned Test Definitions tab. All test definitions of the test container which is associated with the selected execution definition are displayed in the Available Test Definitions window.
4. Click the assign arrow of any test definition that you want to assign to the currently selected execution definition. Clicking the assign arrow of a folder or the top-level container assigns all child test definitions of that parent to the selected execution definition.
5. Click Apply to save the assigned test-definition list.
   Note: If you do not click Apply, changes you make to the Assigned Test Definitions will be lost.

Related Concepts Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Executing Test Definitions Assign Test Definitions from Grid View to Execution Definitions Using a Filter to Assign Test Definitions to Execution Definitions Related Reference Execution Assigned Test Definitions Tab

475

Using a Filter to Assign Test Definitions to Execution Definitions


The test definitions that are assigned to the selected execution definition are listed on the Assigned Test Definitions tab (Execution View only). The properties of the assigned test definitions that are listed in tabular view include:

Test Definition Name
Test Definition Status
Last Execution of the test definition

To use a filter to assign test definitions to an execution definition:


1. Create a filter in the Test Plan unit. Refer to the Creating Filters procedure for details. If the filter already exists, skip this step.
2. Click Execution on the workflow bar.
3. Select the execution definition to which you are assigning test definitions.
4. Select the Assigned Test Definitions tab. All test definitions of the test container which is associated with the selected execution definition are displayed in the Available Test Definitions window.
5. Select By Filter from the test definition assignment types.
6. Choose the filter from the list box.
7. Click Apply to save the assigned test-definition list.
   Note: If you do not click Apply, changes you make to the Assigned Test Definitions will be lost.

If you assign test definitions to an execution definition in Test Plan Grid View, the test definition assignment type is automatically set to Manual, but the previously-filtered test definitions remain in the Assigned Test Definitions tab.

Related Concepts Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Executing Test Definitions Creating Filters Assign Test Definitions from Grid View to Execution Definitions Manually Assigning Test Definitions to Execution Definitions Related Reference Execution Assigned Test Definitions Tab

476

Configuring Deployment Environments


This section explains how to configure deployment environments with Test Manager.

In This Section

Adding a SilkTest AUT Host
Describes how to add a SilkTest AUT host.

Removing a Tester Assignment from an Execution Definition
Describes how to remove a tester assignment from an execution definition.

Adding Manual Testers
Describes how to add a manual tester.

Assigning Keywords to Execution Definitions
Describes how to assign keywords to execution definitions.

Creating New Keywords
Describes how to create new keywords.

Removing Keywords from Execution Definitions
Describes how to remove keywords from execution definitions.

477

Adding a SilkTest AUT Host


For execution definitions that run SilkTest tests, you may have a setup where the SilkTest agent is on a different computer than the execution server. In this case, you can define the location of the SilkTest agent (SilkTest AUT (Application Under Test) Hostname).

To add or edit a SilkTest AUT host to the selected execution definition


1. Click Execution on the workflow bar.
2. Select the execution definition for which you want to assign the SilkTest AUT host.
3. Select the Deployment tab.
4. Click Edit in the SilkTest AUT Hostname area of the GUI. The Edit SilkTest AUT Hostname dialog box displays.
5. In the Hostname field, type the name of the computer where the SilkTest agent runs. Proper configuration of option files is required; see the SilkTest documentation regarding the command-line option -m for details.
6. Click OK to add the SilkTest AUT host to the selected execution definition.

Related Concepts Specifying Agent Under Test (AUT) SilkTest Tests Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Executing Test Definitions Related Reference Execution Deployment tab

478

Removing a Tester Assignment from an Execution Definition


To remove a tester assignment from the selected execution definition
1. Click Execution on the workflow bar.
2. Select the execution definition for which you are removing a tester assignment.
3. Select the Deployment tab.
4. Click Edit in the Manual Testers area. The Manual Testers dialog box displays.
5. All testers that have been assigned to the selected execution definition are listed in the Selected column.
6. Select the name of the assigned user that you want to remove and click Remove to remove the user from the Selected list; or click Remove All to remove all tester assignments for the execution definition.

Related Concepts Test Definition Execution Manual Test Definitions Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions Related Reference Execution Deployment tab Current Run Page

479

Adding Manual Testers


For execution definitions that include manual tests, the Deployment tab enables you to assign users who are to act as manual testers for the selected execution definition. Multiple manual testers can be assigned.

To assign a manual tester to the selected execution definition


1. Click Execution on the workflow bar.
2. Select the execution definition for which you are assigning a tester.
3. Select the Deployment tab.
4. Click Edit in the Manual Testers area. The Manual Testers dialog box displays.
5. In the Available column, select the User Group Name of which the tester is a member. The available list is populated with all members of the user group.
6. Select the name of the user you want to assign as a manual tester and click Add to add the user to the Selected list; or click Add All to add all of the group's members and testers.

Related Concepts Manual Test Definitions Test Definition Execution Tour of the Manual Testing Client UI Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions Related Reference Execution Deployment tab Current Run Page

480

Assigning Keywords to Execution Definitions


To assign keywords to the selected execution definition:
1. Click Execution on the workflow bar.
2. Select an execution definition in the Execution tree.
3. Select the Deployment tab.
4. Click Edit in the Execution Environment portion of the page. The Assign Keywords dialog box displays. All keywords that have been defined for your execution environment are listed here.
   Note: For automated execution definitions, the default reserved keywords for each execution server (#<execution name>@<location name>) are included in the list.
5. Select keywords in the Select keywords list that reflect your execution environment requirements. You can use your keyboard's CTRL and SHIFT keys to select multiple keywords using standard browser multi-select functions.
   Tip: The Select keywords field is auto-complete enabled. When you enter alphanumeric characters into this field, the field is dynamically updated with an existing keyword that matches the entered characters. Note that this field is disabled when multiple keywords are selected in the Select keywords or Assigned Keywords lists.
   Tip: For automated execution definitions, if you only have a few execution servers and do not require hardware provisioning, you can likely get by using only the default, reserved keywords that are created for each execution server. In such cases, it is not necessary that you select additional keywords.
6. Click Add (>) to move the keyword into the Assigned Keywords list.
   Note: For automated execution definitions, the execution servers that match the assigned keywords are listed in the dynamically updated Matching execution servers list (see the sketch following this procedure). This list updates each time you add or remove a keyword. Click the name of an execution server in the list to access the execution servers in Administration > Locations.
7. Click OK to save the keywords and close the Assign Keywords dialog box.
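The matching behavior mentioned in step 6 can be pictured as a simple superset check: an execution server qualifies when its keyword set contains every keyword assigned to the execution definition, and keywords are compared case-insensitively. The following Java sketch only illustrates that idea under those assumptions; the class and method names are hypothetical and this is not Test Manager code.

```java
import java.util.List;
import java.util.Locale;
import java.util.Set;
import java.util.stream.Collectors;

// Illustrative sketch of the "Matching execution servers" idea: a server
// matches when its keywords contain every keyword assigned to the
// execution definition. All names here are hypothetical.
public class KeywordMatching {

    record ExecutionServer(String name, Set<String> keywords) {}

    // Keywords are case insensitive, so compare them in lower case.
    static Set<String> normalize(Set<String> keywords) {
        return keywords.stream()
                .map(k -> k.toLowerCase(Locale.ROOT))
                .collect(Collectors.toSet());
    }

    static List<ExecutionServer> matchingServers(Set<String> assigned, List<ExecutionServer> servers) {
        Set<String> wanted = normalize(assigned);
        return servers.stream()
                .filter(s -> normalize(s.keywords()).containsAll(wanted))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<ExecutionServer> servers = List.of(
                new ExecutionServer("#exec01@Local", Set.of("#exec01@Local", "Vista", "Firefox")),
                new ExecutionServer("#exec02@Local", Set.of("#exec02@Local", "XP")));
        // Prints 1: only the first server carries the "Vista" keyword.
        System.out.println(matchingServers(Set.of("vista"), servers).size());
    }
}
```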

481

Related Concepts VMware Lab Manager Virtual Configurations Execution Definitions Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Configuring Deployment Environments Executing Test Definitions Creating New Keywords Removing Keywords from Execution Definitions Related Reference Execution Deployment tab

482

Creating New Keywords


To create new keywords:
1. Click Execution on the workflow bar.
2. Select an execution definition in the Execution tree.
3. Select the Deployment tab.
4. Click Edit in the Execution Environment portion of the page. The Assign Keywords dialog box displays. All keywords that have been defined for your execution environment are listed here.
   Note: For automated execution definitions, the default reserved keywords for each execution server (#<execution name>@<location name>) are included in the list.
5. On the Assign Keywords dialog box, enter an alphanumeric keyword into the Keyword field that describes the required environment for the execution definition (for example, platform, operating system, and pre-installed applications). The following characters cannot be used in keywords: #$?*\,;'" (see the sketch following this procedure).
   Note: Keywords are case insensitive (for example, Vista and vista are handled as the same keyword).
6. Press the ENTER key. The new keyword is now available for assignment.
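The two rules stated in step 5, the forbidden characters and the case-insensitive comparison, are easy to express in code. The following Java sketch is illustrative only; it is not part of any Test Manager API.

```java
import java.util.Locale;

// Illustrative check of the keyword rules described above:
// none of the characters # $ ? * \ , ; ' " may appear in a keyword,
// and keywords are compared case-insensitively.
public class KeywordRules {

    private static final String FORBIDDEN = "#$?*\\,;'\"";

    static boolean isValid(String keyword) {
        if (keyword == null || keyword.isBlank()) {
            return false;
        }
        return keyword.chars().noneMatch(c -> FORBIDDEN.indexOf(c) >= 0);
    }

    static boolean sameKeyword(String a, String b) {
        // "Vista" and "vista" are handled as the same keyword.
        return a.toLowerCase(Locale.ROOT).equals(b.toLowerCase(Locale.ROOT));
    }

    public static void main(String[] args) {
        System.out.println(isValid("Vista_SP2"));          // true
        System.out.println(isValid("Vista;SP2"));          // false: ';' is not allowed
        System.out.println(sameKeyword("Vista", "vista")); // true
    }
}
```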

Related Procedures Assigning Keywords to Execution Definitions Removing Keywords from Execution Definitions

483

Removing Keywords from Execution Definitions


To remove execution-definition keyword assignments:
1. Click Execution on the workflow bar.
2. Select an execution definition in the Execution tree.
3. Select the Deployment tab.
4. Click Edit in the Execution Environment portion of the page. The Assign Keywords dialog box displays. All keywords that have been defined for your execution environment are listed here.
   Note: For automated execution definitions, the default reserved keywords for each execution server (#<execution name>@<location name>) are included in the list.
5. On the Assign Keywords dialog box, select unneeded keywords in the Assigned keywords field. You can use your keyboard's CTRL and SHIFT keys to select multiple keywords using standard browser multi-select functions.
6. Click Remove (<) to remove the keyword assignments.
7. Click OK to close the Assign Keywords dialog box.

Note: Keywords that are not in use anymore are automatically deleted from the system.

Related Procedures Assigning Keywords to Execution Definitions Creating New Keywords

484

Configuring Execution Dependencies


This section explains how to configure execution dependencies with Test Manager.

In This Section

Adding Dependent Execution Definitions
Describes how to add a dependent execution definition.

Deleting a Dependency
Describes how to delete a dependency.

Editing a Dependency
Describes how to edit a dependency.

485

Adding Dependent Execution Definitions


To add a dependent execution definition
1. Click Execution on the workflow bar.
2. Select the execution definition that will act as the master execution definition.
3. Select the Dependencies tab.
4. Click Add dependent Execution Definition to display the Add dependent Execution Definition dialog box.
5. From the Condition selection list, select the condition that is to trigger the dependent execution definition (Passed, Failed, Not Executed, or Any). The Any status means that the dependent test execution will trigger no matter what the status of the previous test execution (see the sketch following this procedure).
6. From the tree menu in the dialog box, select the execution definition that is to be dependent.
7. Select one of the following options to specify where the dependent execution definition is to be deployed:

   As specified in the dependent Execution Definition: Automated test definitions assigned to the dependent execution definition will be executed on the execution server specified for the dependent execution definition on the Deployment tab. Manual test definitions assigned to the dependent execution definition will be assigned to the users specified for the dependent execution definition on the Deployment tab.

   Same as <selected execution definition's execution server>: Automated test definitions assigned to the dependent execution definition will be executed on the execution server specified for the <selected execution definition's execution server> on the Deployment tab. Manual test definitions assigned to the dependent execution definition will be assigned to the users specified for the <selected execution definition's execution server> on the Deployment tab.

   Specific Execution Server/Manual Tester: Select a pre-configured execution server and/or a manual tester from the list boxes. Automated test definitions assigned to the dependent execution definition will be executed on the specified execution server. Manual test definitions assigned to the dependent execution definition will be assigned to the specified manual tester. If only a specific manual tester is defined and no server, only manual test definitions will be executed. If only a specific execution server is defined and no manual tester, only automated test definitions will be executed.

8. Click OK to create the dependency.

Note: Test Manager will not allow you to create cyclical execution dependencies.
Note: You can select conditions to fulfill for manual test definitions. (Example: If the selected condition is Failed and all manual tests passed, but some automated tests failed, only automated test definitions assigned to the dependent execution definition will be executed.)
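The condition selected in step 5 simply compares the outcome of the master execution against the chosen trigger, with Any matching every outcome. The Java sketch below illustrates that evaluation only; the enum and method names are hypothetical, not Test Manager internals.

```java
// Illustrative evaluation of a dependency condition (Passed, Failed,
// Not Executed, Any) against the status of the master execution run.
public class DependencyTrigger {

    enum Status { PASSED, FAILED, NOT_EXECUTED }

    enum Condition {
        PASSED, FAILED, NOT_EXECUTED, ANY;

        boolean isMetBy(Status masterRunStatus) {
            // ANY triggers the dependent execution regardless of the master status.
            return this == ANY || name().equals(masterRunStatus.name());
        }
    }

    public static void main(String[] args) {
        System.out.println(Condition.FAILED.isMetBy(Status.FAILED));    // true: run the dependent definition
        System.out.println(Condition.PASSED.isMetBy(Status.FAILED));    // false: skip the dependent definition
        System.out.println(Condition.ANY.isMetBy(Status.NOT_EXECUTED)); // true
    }
}
```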

486

Related Concepts Execution Dependency Configuration Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Executing Test Definitions Related Reference Execution Dependencies tab

487

Deleting a Dependency
To delete a dependency
1. Click Execution on the workflow bar.
2. Select the master execution definition from which you want to delete a dependency.
3. Select the Dependencies tab.
4. In the Dependent Execution Definitions area, click the Delete icon in the Actions column.
5. Click Yes on the Delete Dependency dialog box to delete the dependency.

Related Concepts Execution Dependency Configuration Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Executing Test Definitions Related Reference Execution Dependencies tab

488

Editing a Dependency
Note: To edit an existing dependency, you must select the master execution definition (the definition for which a specific condition will trigger the execution of another execution definition). You cannot edit dependency settings from an execution definition that is dependent on another execution definition.

To edit a previously configured dependency


1. Click Execution on the workflow bar.
2. Select the master execution definition that you are editing.
3. Select the Dependencies tab.
4. In the Dependent Execution Definitions area, click Edit settings in the Actions column to open the Edit Dependency dialog box.
5. Edit the condition that is to trigger the dependent execution and the execution server settings.

Related Concepts Execution Dependency Configuration Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Adding Dependent Execution Definitions Executing Test Definitions Related Reference Execution Dependencies tab

489

Defining Execution Definition Schedules


This section explains how to schedule execution definitions in Test Manager.

In This Section

Adding Definite Runs
How to add a definite run to a custom schedule.

Adding Exclusions
How to add an exclusion to a custom schedule.

Creating a Custom Schedule for an Execution Definition
How to create a custom schedule for an execution definition.

Deleting Definite Runs
Describes how to delete a definite run.

Editing Definite Runs
Describes how to edit definite runs.

Specifying Global Schedules for Execution Definitions
Describes how to specify a global schedule for an execution definition.

Specifying No Schedule for Execution Definitions
Describes how to specify 'no' schedule for an execution definition.

Deleting Exclusions
How to delete an exclusion.

Editing Exclusions
How to edit an exclusion.

490

Adding Definite Runs


Note: You must have administrator rights to edit global schedules. To define a definite run for a global schedule, navigate to Administration Configuration.

To add a definite run to a custom schedule:


1. Select Execution on the workflow bar.
2. Select an execution definition for which you want to add a definite run.
3. Select the Schedules tab.
4. Click the Custom option button.
5. Click Add Definite Run.
6. On the Configure Definite Run page, select the date and time when the execution definition should definitely be run.
7. Click OK. Your definite run settings are listed on the Configure Schedule page.
8. Click Save to add the definite run to the current schedule, or continue adding definite runs.

Related Concepts Execution Definition Schedules Test Definition Execution Execution Definitions Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Executing Test Definitions Related Reference Execution Schedule tab

491

Adding Exclusions
Note: You must have administrator rights to edit global schedules. To define a scheduling exclusion for a global schedule, navigate to Administration Configuration.

To add an exclusion to a custom schedule:


1. Click Execution on the workflow bar.
2. Select an execution definition for which you want to add a scheduling exclusion.
3. Select the Schedules tab.
4. Click the Custom option button.
5. Click Add Exclusion.
6. On the Configure Schedule Exclusion page, select the weekdays on which test definitions should be suppressed.
7. Define the specific time intervals on those days during which execution should be suppressed.
8. Click OK. Your exclusion settings are now listed on the Configure Schedule page.
9. Click Save to add the exclusion to the current schedule, or continue adding additional exclusions.

Related Concepts Execution Definition Schedules Test Definition Execution Execution Definitions Related Procedures Managing Test Executions - Quick Start Task Editing Exclusions Adding Execution Definitions Executing Test Definitions Related Reference Execution Schedule tab

492

Creating a Custom Schedule for an Execution Definition


To create a custom schedule for a selected execution definition:
1. Click Execution on the workflow bar.
2. Select an execution definition for which you want to configure a custom schedule.
   Note: To schedule a folder for execution, select a folder node.
   Note: To save an edited version of a global schedule as a custom schedule, click Edit while the global schedule is selected in the list box. This enables you to edit the global schedule and save the result as a custom schedule.
3. Select the Schedule tab.
4. Click the Custom option button to enable the scheduling controls.
5. Click the calendar tool next to the From field and specify when the execution schedule is to begin (Month, Day, Year, Hour, Minute).
6. Specify the interval at which the execution's tests are to be executed (Day, Hour, Minute).
7. In the Run portion of the GUI, specify when the execution is to end. Select Forever to define a schedule with no end, or click the calendar tool next to the to field and specify when the execution schedule is to end (Month, Day, Year, Hour, Minute). See the sketch following this procedure.
8. (Optional) Click Add Exclusion to define times when scheduled execution definitions should not be executed, or click Add Definite Run to define times when unscheduled executions should be executed.
9. Click Save to save your custom schedule.
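A custom schedule boils down to a start time, a run interval, and an optional end time. The Java sketch below models only that core, with exclusions and definite runs omitted for brevity; it is a conceptual illustration under those assumptions, not the scheduler Test Manager uses.

```java
import java.time.Duration;
import java.time.LocalDateTime;

// Illustrative model of a custom schedule: start, interval, optional end.
// Exclusions and definite runs are intentionally left out.
public class CustomSchedule {

    private final LocalDateTime from;
    private final LocalDateTime to;   // null means "run forever"
    private final Duration interval;

    CustomSchedule(LocalDateTime from, LocalDateTime to, Duration interval) {
        this.from = from;
        this.to = to;
        this.interval = interval;
    }

    /** Returns the next scheduled run after the given time, or null if the schedule has ended. */
    LocalDateTime nextRunAfter(LocalDateTime now) {
        LocalDateTime next = from;
        while (!next.isAfter(now)) {
            next = next.plus(interval);
        }
        return (to == null || !next.isAfter(to)) ? next : null;
    }

    public static void main(String[] args) {
        CustomSchedule schedule = new CustomSchedule(
                LocalDateTime.of(2009, 6, 1, 8, 0), null, Duration.ofHours(12));
        // Prints 2009-06-01T20:00, the first run after 09:30 on an every-12-hours schedule.
        System.out.println(schedule.nextRunAfter(LocalDateTime.of(2009, 6, 1, 9, 30)));
    }
}
```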

Related Concepts Execution Definition Schedules Test Definition Execution Execution Definitions Related Procedures Managing Test Executions - Quick Start Task Adding Definite Runs Adding Exclusions Adding Execution Definitions Executing Test Definitions Related Reference Execution Schedule tab

493

Deleting Definite Runs


To delete a definite run
1. Click Execution on the workflow bar.
2. Select the execution definition for which you are deleting a previously configured definite run.
3. Select the Schedule tab.
4. In the Actions column, select the Delete icon of the definite run to be deleted.

Related Concepts Execution Definition Schedules Test Definition Execution Execution Definitions Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Executing Test Definitions Related Reference Execution Schedule tab

494

Editing Definite Runs


To edit a definite run
1. Click Execution on the workflow bar.
2. Select the execution definition for which you are editing a previously configured definite run.
3. Select the Schedule tab.
4. In the Actions column, select the Edit Definite Run icon of the definite run you have selected to edit.
5. Edit the definite run criteria as required and then click Save.

Related Concepts Execution Definition Schedules Test Definition Execution Execution Definitions Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Executing Test Definitions Related Reference Execution Schedule tab

495

Specifying Global Schedules for Execution Definitions


SilkCentral offers the possibility of defining global schedules, which can be reused in Test Manager for the scheduling of test definitions. Global schedules can speed up the process of scheduling test definitions, since the need to define individual schedules for each test definition is reduced to only those test definitions that require special scheduling.

To select a pre-defined schedule that is globally available throughout SilkCentral


1. Click Execution on the workflow bar.
2. Select the execution definition for which you are configuring a schedule.
   Note: To schedule a folder for execution, select a folder node.
3. Select the Schedule tab.
4. Click the Global option button.
5. Select the required pre-defined schedule from the Global list box. Details of the pre-defined schedule are displayed in a read-only calendar view.
   Note: To save an edited version of a global schedule as a custom schedule, click Edit.
   Note: Global schedules are configured through the Configurations link on the menu tree (Schedule tab).

Related Concepts Execution Definition Schedules Test Definition Execution Execution Definitions Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Executing Test Definitions Related Reference Execution Schedule tab

496

Specifying No Schedule for Execution Definitions


To specify that no schedule should be defined for an execution definition
1. Click Execution on the workflow bar.
2. Select the execution definition for which there is to be no schedule.
3. Select the Schedule tab.
4. Click the None option button.

Related Concepts Execution Definition Schedules Test Definition Execution Execution Definitions Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Executing Test Definitions Related Reference Execution Schedule tab

497

Deleting Exclusions
To delete an exclusion:
1. Click Execution on the workflow bar.
2. Select the execution definition for which you want to delete a previously configured exclusion.
3. Select the Schedule tab.
4. In the Actions column, select the Delete button of the exclusion you want to delete.

Related Concepts Execution Definition Schedules Test Definition Execution Execution Definitions Related Procedures Managing Test Executions - Quick Start Task Adding Exclusions Adding Execution Definitions Executing Test Definitions Related Reference Execution Schedule tab

498

Editing Exclusions
To edit an exclusion:
1. Click Execution on the workflow bar.
2. Select the execution definition for which you want to edit a previously configured exclusion.
3. Select the Schedule tab.
4. In the Actions column, select the Edit Exclusion button of the exclusion you want to edit.
5. Edit the exclusion as required and click Save.

Related Concepts Execution Definition Schedules Test Definition Execution Execution Definitions Related Procedures Managing Test Executions - Quick Start Task Adding Exclusions Adding Execution Definitions Executing Test Definitions Related Reference Execution Schedule tab

499

Executing Manual Tests


This section explains how to execute manual tests with Test Manager.

In This Section

Using the Manual Testing Client
This section explains how to use Test Manager's Manual Testing Client.

Aborting Manual Test Executions
Describes how to abort manual test executions.

Executing Manual Tests
Describes how to execute manual tests through the Test Manager Web interface.

Executing Manual Tests in the Current Run Page
Describes how to use the Current Run page to execute a manual test.

500

Using the Manual Testing Client


This section explains how to use Test Manager's Manual Testing Client.

In This Section

Configuring the Manual Testing Client
This section explains how to configure Test Manager's Manual Testing Client.

Managing Attachments with the Manual Testing Client
This section explains how to manage attachments using Test Manager's Manual Testing Client.

Adding an Internal Issue with the Manual Testing Client
Map an external issue regarding a test definition into the Manual Testing Client.

Changing a Test Definition's Status
Describes how to change a test definition's status.

Downloading Execution Definition Packages
Describes how to download execution definition packages to the Manual Testing Client.

Editing Package Build Numbers
Describes how to edit package build numbers.

Editing Test Definitions Within the Manual Testing Client
While in Edit mode, the SilkCentral Manual Testing Client offers a full range of test definition editing functionality, including the addition, reordering, and removal of test steps and the insertion of custom step properties (Test Manager project parameters).

Enabling Code Analysis Within the Manual Testing Client
How to enable code analysis for an execution definition from within the Manual Testing Client.

Executing Manual Tests with the Manual Testing Client
Describes how to execute a manual test with the Manual Testing Client.

Exporting and Importing Execution Packages
Downloaded execution packages can be both exported from and imported to the Manual Testing Client. This enables easy exchange of execution packages between testers over email.

Installing SilkCentral Manual Testing Client
Describes how to install and start Manual Testing Client.

Uploading Test Results to Test Manager
Describes how to upload test results to Test Manager.

Viewing and Editing Test Definitions in Test Manager
Describes how to view a test definition in Test Manager.

Working Offline with the Manual Testing Client
Describes how to work with the Manual Testing Client in offline mode.

501

Configuring the Manual Testing Client


This section explains how to configure Test Manager's Manual Testing Client.

In This Section

Configuring Connection Parameters
Describes how to configure connection parameters for the Manual Testing Client.

Configuring Other Settings
Describes how to configure other settings for the Manual Testing Client.

Configuring Package Upload Preferences
Describes how to configure package upload preferences.

502

Configuring Connection Parameters


Connection parameters are configured automatically the first time you start the Manual Testing Client.

To edit login credentials and validate your connection


1. Navigate to Window > Preferences.
2. Enter the URL of your Test Manager installation in the Test Manager Server URL field (see the sketch following this procedure).
3. Select Remember Credentials if you want the Manual Testing Client to insert your login credentials automatically the next time you start the application.
4. Enter your Test Manager Username and Password.
5. Click Validate Connection to test your login settings.
6. Click OK on the confirmation dialog box.
7. Click OK to save your settings.
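The essence of a connection check is verifying that the configured server URL actually answers requests. The Java sketch below shows only that reachability test; the real Validate Connection button also validates the login credentials, which is not reproduced here, and the example hostname is a placeholder.

```java
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

// Illustrative reachability check against the configured Test Manager
// server URL. Not the Manual Testing Client's actual validation logic.
public class ConnectionCheck {

    static boolean isReachable(String serverUrl) {
        try {
            HttpURLConnection connection = (HttpURLConnection) new URL(serverUrl).openConnection();
            connection.setConnectTimeout(5000);
            connection.setRequestMethod("HEAD");
            int code = connection.getResponseCode();
            return code >= 200 && code < 400;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Replace the placeholder with the URL of your own Test Manager installation.
        System.out.println(isReachable("http://tmserver/"));
    }
}
```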

Related Concepts Manual Test Definitions Test Definition Execution Tour of the Manual Testing Client UI Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions

503

Configuring Other Settings


To configure other settings
1. Navigate to Window > Preferences.
2. Select or deselect one or more of the following settings:

   Download attachments automatically - Download attached files automatically when execution definition packages are downloaded from Test Manager. This option must be enabled if you intend to work offline after you download your assigned execution packages.

   Ask for build number when completing packages - Before uploading packages to Test Manager, display a prompt requesting the build number on which the test was performed.

   Show execution dialog always on top - Have the Execute Test dialog box display on top of other open windows on your computer desktop to facilitate manual testing. When enabled, the Execute Test dialog box stays on top even when another window has the focus. When executing manual tests, you may want to keep the Execute Test dialog box on top so that you can easily enter your test results. If your computer monitor is too small to contain both the Execute Test dialog box and the application under test, you should leave this setting disabled.

   Ask for uploading workspace to SilkCentral Test Manager before closing main window - Display a confirmation prompt before closing the Manual Testing Client.

3. Click OK to save your preference settings.

Related Concepts Manual Test Definitions Test Definition Execution Tour of the Manual Testing Client UI Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions

504

Configuring Package Upload Preferences


To configure execution-package upload preferences
1. Navigate to Window > Preferences.
2. In the Packages area of the dialog box, check the Remove uploaded packages check box to define an option for automatic deletion of execution definition packages from the Manual Testing Client after packages are uploaded to Test Manager.
3. Select one of the following deletion options (see the sketch following this procedure):

   Immediately
   After <x> days (enter a value in the days field if you select this option)

4. Check the Upload completed packages immediately check box if you want to have completed test run packages uploaded to Test Manager automatically after test runs are completed.
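The retention choice in step 3 amounts to a simple rule: delete the package right after upload, or once the configured number of days has elapsed. The Java sketch below models only that rule; the helper name and types are hypothetical, not Manual Testing Client code.

```java
import java.time.Duration;
import java.time.Instant;

// Illustrative model of the "Remove uploaded packages" preference:
// either immediately after upload, or after a configured number of days.
public class PackageCleanup {

    static boolean shouldRemove(Instant uploadedAt, boolean removeImmediately, int afterDays, Instant now) {
        if (removeImmediately) {
            return true;
        }
        return Duration.between(uploadedAt, now).toDays() >= afterDays;
    }

    public static void main(String[] args) {
        Instant uploaded = Instant.parse("2009-06-01T10:00:00Z");
        Instant now = Instant.parse("2009-06-10T10:00:00Z");
        // Prints true: with a 7-day setting, a package uploaded 9 days ago is removed.
        System.out.println(shouldRemove(uploaded, false, 7, now));
    }
}
```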

Related Concepts Manual Test Definitions Test Definition Execution Tour of the Manual Testing Client UI Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions

505

Managing Attachments with the Manual Testing Client


This section explains how to manage attachments using Test Manager's Manual Testing Client.

In This Section

Pasting Screen Captures
Describes how to paste screen-captured images using the Manual Testing Client.

Uploading Attachments to the Manual Testing Client
Describes how to upload an attachment to the Manual Testing Client.

Viewing Attached Images Within the Manual Testing Client
Describes how to view an attached image while using the Manual Testing Client.

Viewing Attachments Within the Manual Testing Client
Describes how to view a test definition's attachments from the Manual Testing Client.

506

Pasting Screen Captures


To upload a screen-captured image directly from the clipboard
1. Copy a screen capture to your computer's clipboard (the Paste Image button on the Result Files tab of the Manual Testing Client becomes enabled).
2. Click Paste Image.
3. Specify a File Name for the image on the Paste From Clipboard dialog box.
4. Click OK to save the copied screen capture as an image file attachment.

Related Concepts Attachments Manual Test Definitions Test Definition Execution Manual Testing Client Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions

507

Uploading Attachments to the Manual Testing Client


When during the course of testing you encounter a relevant result file (screen-captured image, error log, or other file), you can upload the file as an attachment to a test definition.

To upload a result file as an attachment


1. From within the Manual Testing Client, select a test definition in the Test Definitions tab.
2. Click the Result Files tab.
3. Click Add File to browse to and select the result file you want to upload.
4. Click Open to attach the file.

Related Concepts Attachments Manual Test Definitions Test Definition Execution Tour of the Manual Testing Client UI Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions

508

Viewing Attached Images Within the Manual Testing Client


To view an attached image file
1. From within the Manual Testing Client, select an image file in the Attachments tab. The image displays in the Image Preview field.
2. Use the following viewing tools next to the Image Preview field to manipulate the image:

   Show Actual Size
   Scale to Fit
   Scale to Fit - Keep Aspect Ratio
   Open as Detached Window

Related Concepts Attachments Manual Test Definitions Test Definition Execution Tour of the Manual Testing Client UI Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions

509

Viewing Attachments Within the Manual Testing Client


To view a test definition's attachments from within the Manual Testing Client
1. From within the Manual Testing Client, select a test definition in the Test Definitions tab.
2. The Attachments tab lists all of the result files that are associated with the selected test definition.
3. Using the Test Container/Folders and Test Steps check boxes, you can filter the list of attachments to include only those attachments that are related to the selected test container/folder or test step.

Related Concepts Attachments Manual Test Definitions Test Definition Execution Tour of the Manual Testing Client UI Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions

510

Adding an Internal Issue with the Manual Testing Client


The manual testing client enables you to create an internal issue and map an external issue to this internal issue. To create the internal issue and map the external issue:
1. In the Manual Testing Client, select the Inbox tab.
2. Select an execution definition package. The test definitions included in the selected package are listed in the Test Definitions tab.
3. In the Test Definitions tab, double-click a test definition.
4. Click New Internal Issue to open the New Issue dialog box.
5. Fill out the text boxes as described in New Issue Dialog.
6. Click OK.

Note: You must be online during this procedure.

Related Concepts Issue Management Manual Test Definitions Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions Related Reference Manual Testing Client HTML Support for Description Text Boxes

511

Changing a Test Definition's Status


To change the status of a test definition
1. From within the Manual Testing Client, right-click a test definition in the Test Definitions tab.
2. Select an alternative status:

   Set as Not Executed
   Set as Passed
   Set as Failed
   Set as Unresolved
   Set as Unsupported

Note: You cannot change the status of test runs that have already been completed. The statuses of execution packages in the Completed Runs tab cannot be edited.

Related Concepts Manual Test Definitions Test Definition Execution Tour of the Manual Testing Client UI Calculating the Test Definition Status Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions

512

Downloading Execution Definition Packages


After the Manual Testing Client has been configured, the first step in running manual tests is downloading copies of the manual execution definitions that are assigned to you. Note: Execution definitions remain online in Test Manager; only copies of the execution definitions are downloaded to the Manual Testing Client.

To download the execution definition packages that are assigned to you


1. From within the Manual Testing Client, click Download on the toolbar.
2. If your connection settings have been correctly configured (and you have execution packages waiting for you), your assigned execution packages will appear in the Inbox.

Related Concepts Manual Test Definitions Test Definition Execution Manual Testing Client Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions

513

Editing Package Build Numbers


To edit the build number of an execution package
1. From within the Manual Testing Client, right-click an execution package in the Inbox.
2. Select Edit Build Number.
3. On the Select Build Number dialog box, select a build number from the Build list box.
   Note: You can refresh the build list by clicking Refresh build list.
   Note: If you want to be prompted to specify a build number each time a test run is completed, check the Ask for build number when completing packages check box.
4. Click OK.

Related Concepts Manual Test Definitions Test Definition Execution Tour of the Manual Testing Client UI Related Procedures Managing Test Executions - Quick Start Task Configuring Package Upload Preferences Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions

514

Editing Test Definitions Within the Manual Testing Client


While in Edit mode, the SilkCentral Manual Testing Client offers a full range of test definition editing functionality, including the addition, reordering, and removal of test steps and the insertion of custom step properties (Test Manager project parameters). Note: Editing of data-driven test definitions is currently not supported.

To enable Edit mode


1. Double-click an execution package on the Inbox tab to open the Execute Test dialog box.
2. Click Edit. You can now edit the following fields on the Details tab: Planned Time, Step Names, any custom step properties that have been created for your project, Step Description, and Expected Result. Manual test steps can also be added, reordered, and removed on the Details tab. On the Description tab, the following fields can be edited: Test Definition Name and Test Definition Description.

To add a new test step


1. While in Edit mode, navigate to Execute Test > Details.
2. Click Add test step on the toolbar to add a new test step to the end of the test step list.
   Tip: To insert a new test step into the test step list, select the test step above which the new test step is to appear. Click Insert test step on the toolbar.

To reorder test steps


1. While in Edit mode, navigate to Execute Test > Details.
2. Select a test step that you want to move.
3. Click Move Up on the toolbar to move the step up one position in the test step list, or click Move Down to move the step down one position in the list.

To insert parameters (Test Manager project parameters) into description fields


1. While in Edit mode, navigate to Execute Test > Description or Execute Test > Details. You can select any preconfigured Test Manager project parameters for insertion into the Test Definition Description, Step Description, and Expected Result fields.
2. Place your cursor into one of the text fields.
3. Click Parameters on the far right end of the toolbar.
4. Select a preconfigured Test Manager project parameter from the list box.

Managing change conflicts upon upload


1. After completing your manual test definition edits, click Upload to upload your results to the server.
2. Click Yes to confirm that you want to have your changes committed to the Test Plan tree on the server.

515

If your changes conflict with recent changes made by another user, the Test Definition Conflicts dialog box will display, listing the test definitions that are in conflict. Tip: You can directly access any conflicting test definition in Test Manager to view what was changed by right-clicking the test definition and selecting Go to Test Definition in Test Manager.

Click Upload Changes to ignore changes made by other users and commit your changes to Test Manager (thereby overwriting any recent changes that conflict with your changes). Or click Revert Changes to not have your changes saved to the test definition. If you opt for Revert Changes, your changes will not be committed to the Test Plan tree; however, your changes will be visible in the execution results you are uploading. Your changes will not be included in future runs of the test definition.

Related Concepts Manual Test Definitions Test Definition Execution Test Definition Parameters Test Definitions in the Manual Testing Client Manual Testing Client Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Uploading Test Results to Test Manager Executing Test Definitions

516

Enabling Code Analysis Within the Manual Testing Client


To enable code analysis for an execution definition from within the Manual Testing Client
1. From within the Manual Testing Client, navigate to Edit > Edit Code Analysis Settings.
2. On the Edit Code Analysis Settings dialog box, proceed with enabling code analysis for the execution definition.
   Note: After code analysis is enabled, you can execute your test definitions in the Manual Testing Client. However, you need to click Code Analysis: Start on the Execute Test dialog box before you actually start testing. This way Test Manager will collect code analysis information while you execute the manual test. When you are done testing, click Stop to halt the collection of code analysis information.

Related Concepts Test Manager Code Analysis Manual Test Definitions Code Coverage Analysis Related Procedures Analyzing Test Results - Quick Start Task Enabling Code Analysis for Execution Definitions Executing Manual Tests Analyzing Code Coverage Related Reference Execution Deployment tab Code Analysis Unit Interface

517

Executing Manual Tests with the Manual Testing Client


To execute a manual test with the Manual Testing Client
1. From within the Manual Testing Client, select the Inbox tab.
2. Select an execution definition package. The test definitions included in the selected package appear in the Test Definitions tab.
3. Click Execute. The Execute Test dialog box of the first test of the selected package opens, at the Details tab. The Details tab enables you to edit the results of each test step as you progress through a test. The following properties for the selected test definition are available:

   Test Definition Status shows the test status (Passed, Failed, Not Executed, Unsupported, or Unresolved). This field can be edited.
   Planned Time shows the estimated time for completion of the test.
   Used Time tracks the elapsed time since the beginning of the test execution.
   Note: The Used Time field can be edited. Use Suspend/Resume to stop and restart the timer if you need to edit the timer setting (or pause the timer) during a test execution.

   The Test Steps portion of the dialog box lists all of the steps that comprise the selected test definition. The following properties are included for each test step:

   Step Description includes the description that has been defined for the test step. This field can be edited.
   Note: Test Manager supports HTML formatting and cutting/pasting of HTML content for description fields.
   Expected Result is the expected result of each test step (the success condition).
   Result includes the result of each test step as observed by the tester. Edit this field after you complete each step.
   Status includes the status of each step. Edit this field after you complete each step.

4. Once you have completed the first test step and edited the fields as required, select and complete any remaining test steps listed in the Test Steps field.
   Note: Click Next Test to open the next test in the selected execution definition (this button is displayed only if multiple test definitions exist), or click Previous Test to open the previous test in the execution definition.
5. Click Go to Issues to enter an issue (bug) for the selected manual test definition in Test Manager.
6. When you have completed all steps in the test, click Finish Run to close the Execute Test dialog box.

To finish a test package before all test definitions have been executed:
1. If you attempt to complete testing of a test package by clicking Finish while any of the package's test definitions have a status of Not Executed, the Finish Run dialog box will display.
2. Select a value from the list box to specify how the unexecuted test definitions should be handled (see the sketch following this procedure):

   Remove test definitions from this run (results of unexecuted tests will be removed from the package's results)
   Set status to passed
   Set status to failed
   Set status to unresolved
   Set status to unsupported

3. Click OK.
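The Finish Run options simply decide what happens to test definitions still marked Not Executed: they are either dropped from the run or rewritten with the chosen status. The Java sketch below illustrates that mapping only; the names and types are hypothetical, and the Manual Testing Client performs this internally.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative handling of unexecuted test definitions when a run is finished.
public class FinishRun {

    enum Status { NOT_EXECUTED, PASSED, FAILED, UNRESOLVED, UNSUPPORTED }

    enum UnexecutedPolicy { REMOVE_FROM_RUN, SET_PASSED, SET_FAILED, SET_UNRESOLVED, SET_UNSUPPORTED }

    record TestResult(String name, Status status) {}

    static List<TestResult> finishRun(List<TestResult> results, UnexecutedPolicy policy) {
        List<TestResult> finished = new ArrayList<>();
        for (TestResult result : results) {
            if (result.status() != Status.NOT_EXECUTED) {
                finished.add(result);
            } else if (policy != UnexecutedPolicy.REMOVE_FROM_RUN) {
                Status mapped = switch (policy) {
                    case SET_PASSED -> Status.PASSED;
                    case SET_FAILED -> Status.FAILED;
                    case SET_UNRESOLVED -> Status.UNRESOLVED;
                    default -> Status.UNSUPPORTED;
                };
                finished.add(new TestResult(result.name(), mapped));
            }
            // REMOVE_FROM_RUN: unexecuted results are simply dropped from the package.
        }
        return finished;
    }

    public static void main(String[] args) {
        List<TestResult> results = List.of(
                new TestResult("Login test", Status.PASSED),
                new TestResult("Checkout test", Status.NOT_EXECUTED));
        // With SET_FAILED, the unexecuted test is reported as failed.
        System.out.println(finishRun(results, UnexecutedPolicy.SET_FAILED));
    }
}
```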

Related Concepts Issue Management Manual Test Definitions Test Definition Execution Manual Testing Client Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions Related Reference Execute Test Dialog Box HTML Support for Description Text Boxes

519

Exporting and Importing Execution Packages


Downloaded execution packages can be both exported from and imported to the Manual Testing Client. This allows for easy exchange of execution packages between testers over email. Downloaded attachments are automatically included in exported packages. Execution packages carry the .zpkg extension.

To export an execution package


1. Right-click an execution package in the Manual Testing Client's Inbox.
2. Select Export Package.
3. On the Export to dialog box, browse to the location where the package (.zpkg file) is to be saved and click Save.
4. Click OK on the confirmation dialog box notifying you that the export was successful.

To import an execution package


1. Navigate to File > Import Package.
2. On the Import from dialog box, browse to the package (.zpkg file) that is to be imported and click Open.
3. Click OK on the confirmation dialog box notifying you that the import was successful.

Related Concepts Manual Test Definitions Test Definition Execution Test Definitions in the Manual Testing Client Manual Testing Client Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions

520

Installing SilkCentral Manual Testing Client


The Manual Testing Client is a separate executable from Test Manager. Using Java Web Start technology, Manual Testing Client can be deployed with a single click over the network. Java Web Start ensures the most current version of Manual Testing Client will be deployed, as well as the correct version of the Java Runtime Environment (JRE).

Prerequisite: Java Runtime Environment (JRE) version 1.5 or higher must be installed on your computer to use Manual Testing Client with Java Web Start technology. You can download the JRE from java.sun.com. Alternately, you can install Manual Testing Client on your computer by navigating to Help > Tools > Manual Testing Client in SilkCentral Test Manager.

Tip: To start SilkCentral Manual Testing Client if it is already installed, navigate to Start > Programs > Borland SilkCentral Test Manager > Manual Testing Client.

To install and start SilkCentral Manual Testing Client


1. When clicking a link to the Manual Testing Client Web Start URL (http://<Test Manager host>/webstart/mtc/), for example in a manual testing notification email, a File Download dialog box opens.
2. Click Open. If Manual Testing Client is not installed on your computer, the Java Web Start dialog box opens and immediately starts downloading Manual Testing Client. The download can take up to several minutes. If Manual Testing Client is already installed on your computer, Manual Testing Client opens and the following steps are not applicable.
3. When the download has completed, a Warning - Security dialog box opens, asking you if you want to run the digitally signed application.
4. Check the Always trust content from this publisher check box, then click Run. Manual Testing Client opens.

To uninstall SilkCentral Manual Testing Client


1. Start a Windows command line session (Start > Run, type cmd in the Open field, then click OK).
2. In the Windows command line window, type javaws -viewer and press ENTER. The Java Cache Viewer opens.
3. In the Show list box, select Applications, if not already selected.
4. The table should now list Manual Testing Client Web Start. Select this application and click the delete button (X) on the toolbar.
5. Manual Testing Client is now removed from your computer and you can close the Java Cache Viewer and Java Control Panel dialog boxes.

521

Related Concepts Manual Test Definitions Test Definition Execution Manual Testing Client Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions

522

Uploading Test Results to Test Manager


Use these procedures to upload test results to Test Manager.

To upload test results from finished packages:


1. From within the Manual Testing Client, complete a manual test by clicking Finish Run on the Execute Test dialog box or Finish on the toolbar. Note that if you attempt to complete testing of a test package while any of the package's test definitions have a status of Not Executed, the Finish Run dialog box displays, on which you can define how unexecuted test definitions should be handled.
2. Select the Completed Runs tab.
3. Right-click a completed test run and select Upload to Test Manager, or select Upload from the File menu, to upload your test results to Test Manager.
   Note: Execution definition statuses are updated automatically in the SilkCentral database when you are working online.

To store test results:


1. Select Store to SilkCentral from the File menu to store your test results to Test Manager.
2. Alternatively, when closing Manual Testing Client, you can upload your entire workspace to Test Manager.

Related Concepts Manual Test Definitions Test Definition Execution Manual Testing Client Related Procedures Managing Test Executions - Quick Start Task Configuring Package Upload Preferences Executing Manual Tests Using the Manual Testing Client Executing Manual Tests with the Manual Testing Client Working with Manual Tests Executing Test Definitions

523

Viewing and Editing Test Definitions in Test Manager


To view a test definition in Test Manager
1. From within the Manual Testing Client, right-click a test definition on the Test Definitions tab.
2. Select Go to Test Definition in Test Manager.
3. If prompted, enter your Test Manager login credentials. You will be directed to Test Manager's Test Plan unit, where the corresponding test definition will be selected in the Test Plan tree.

Related Concepts Manual Test Definitions Test Definition Execution Manual Testing Client Related Procedures Managing Test Executions - Quick Start Task Configuring Package Upload Preferences Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions


Working Offline with the Manual Testing Client


If you plan to work without access to an Internet connection, select offline work mode so that the Manual Testing Client will not attempt to connect to Test Manager automatically.

To work in offline mode


1. From within the Manual Testing Client, select File → Work Offline, or click Online in the bottom right corner.
2. Once you have completed your tests and have access to an Internet connection, proceed with uploading your test results.

Related Concepts Manual Test Definitions Test Definition Execution Tour of the Manual Testing Client UI Related Procedures Managing Test Executions - Quick Start Task Uploading Test Results to Test Manager Configuring Package Upload Preferences Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions


Aborting Manual Test Executions


Tip: The Manual Testing Client, an Eclipse-based client tool, is the recommended tool for executing manual tests with Test Manager.

To abort a manual test execution


1. Navigate to Test Manager → Execution.
2. Click Continue Manual Test on the toolbar. The Manual Tests in Progress dialog box displays a list of all pending manual tests.
3. Click Finish as not executed in the Action column of the manual test definition you want to remove. You can also abort all pending manual test definitions by clicking Remove All Tests.

Related Concepts Manual Test Definitions Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Working with Manual Tests Executing Test Definitions Related Reference Current Run Page


Executing Manual Tests


Tip: The Manual Testing Client, an Eclipse-based client tool, is the recommended tool for executing manual tests with Test Manager.

To execute a manual test using the Test Manager Web interface


1. Navigate to Test Manager → Execution.
2. Select the manual execution definition that you intend to execute.
3. Click Run on the toolbar. The Run dialog box displays.
4. Define which test definitions you want to execute and click OK.
Tip: To go directly to the Current Run page, uncheck Go to Activities page.
Note: When you choose the Run command on a manual test execution node, you must perform the manual test yourself. When you choose the Run command on a folder however, all included manual tests within the folder must be executed by the testers who have been assigned to the folder on the Deployment page, not the testers who have been assigned to the individual execution definitions.
Note: Unless you have an automated test definition incorporated into the selected execution definition, you will be presented with a dialog informing you that no execution server has been specified for this execution definition. Manual tests do not use execution servers, so you can ignore this message and close the dialog.
5. If there are already pending manual tests in the selected execution definition, the Manual Tests In Progress dialog box displays. Click Start New to create a new execution of the manual tests. Click Remove All Tests to finish all pending manual tests and set their status to Not Executed. Select a pending execution and click Continue to continue the selected execution.
6. On the Current Run page, proceed with the manual test execution.


Related Concepts Manual Test Definitions Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Aborting Manual Test Executions Working with Manual Tests Executing Test Definitions Related Reference Current Run Page Execution Deployment tab Run Dialog


Executing Manual Tests in the Current Run Page


The Current Run page allows you to finish a manual test run easily. All required information is displayed in the two grid views, and the status of each test and test step can be changed with two clicks. You can see whether someone is working on the test, in which case the test's status is In Progress, who is working on the test, and which test steps are already finished. Tip: Borland recommends using the Manual Testing Client, an Eclipse-based client tool, for executing manual tests with Test Manager.

To execute a manual test using the current run page:


1. Navigate to Test Manager → Execution.
2. Select the execution definition with the assigned manual test that you want to execute.
3. Click Run on the toolbar. The Run dialog box displays.
4. Define which test definitions you want to execute and click OK. If the selected test is already in progress, a new test run starts. Click Cancel to close the Manual Tests In Progress dialog box.
5. The Current Run page opens. You are provided with detailed information on every test step.
6. Click the Status of a test step and change it to the appropriate status.
7. Repeat the previous step for all test steps.
8. Optional: Use your keyboard's CTRL and SHIFT keys to select multiple test steps using standard browser multiselect functions. Right-click your selection and set the status of the selected test steps to the selected status.
9. Optional: Click Finish Run to finish a run without finishing every test definition. The Finish Run dialog box opens. Choose the appropriate build and the action to perform on the unfinished test definitions. If one test step fails, the whole test is marked as failed.
10. The Status of the manual test is changed to the cumulative status of the test steps when the test run finishes.

Related Concepts Manual Test Definitions Test Definition Execution Calculating the Test Definition Status Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Aborting Manual Test Executions Working with Manual Tests Executing Test Definitions Related Reference Run Dialog Current Run Page


Running Automated Tests


This section explains how to run automated tests with Test Manager. You can run a test immediately (in which case the execution definition is queued on the specified execution server), re-run any test definitions that failed during the most recently executed test, or proceed with manual testing. Changes that are made to test-plan properties are automatically deployed to execution servers and used for executions. The Test Plan unit has been designed so that you can modify and extend test plans without having to worry about whether properties remain valid and consistent for execution.
In This Section
Executing Individual Tests - Describes how to execute individual execution definitions independent of a schedule.


Executing Individual Tests


To run an execution definition independent of a schedule
1. Click Execution on the workflow bar.
2. Select the execution definition that is to be run.
3. Click Run on the toolbar. The Run dialog box displays.
4. Define which test definitions you want to execute. The execution definition is then queued on the specified execution server. Test definitions are executed in the order in which they are listed on the Assigned Test Definitions tab (Execution View). Details of executions can be viewed in the Projects unit, Activities tab.
Note: If the execution definition contains manual tests that are still in progress, you will be presented with a list of these tests.
5. If the execution definition does not contain pending manual tests, the Go To Activities dialog box displays. Click Yes to view the Activities page, or click No if you want to remain on the current Web page.
Note: Check the Don't show this dialog again (during this login session) check box if you do not want to be asked about switching to the Activities page again in the future. This setting will be discarded when you log out of Test Manager.

Related Concepts Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Updating Execution Definitions Assigning Keywords to Execution Definitions SilkTest Tests Working with Manual Tests Executing Test Definitions Related Reference Execution Assigned Test Definitions Tab Activities Page Run Dialog


Working with Execution Definitions


This section explains how to work with execution definitions in Test Manager. In This Section Adding Execution Definitions Describes how to add an execution definition. Copying Execution Definitions Describes how to copy and paste an execution definition. Deleting Execution Definitions Describes how to delete an execution definition. Editing Execution Definitions Describes how to edit an execution definition.


Adding Execution Definitions


To add an execution definition
1. Click Execution on the workflow bar.
2. Select an existing folder in the Execution tree, or select the project node.
3. Click New Execution Definition on the toolbar (or right-click within the Execution tree and choose New Child Execution Definition). The New Execution Definition dialog box displays.
4. Enter a name and meaningful description for the execution definition.
Note: Test Manager supports HTML formatting and cutting and pasting of HTML content for Description fields.
5. Select a test container from the Test Container list box. The Version and Build that are associated with the product that the container is associated with are then populated automatically in the Version and Build fields. You may only associate one test container with a test execution.
6. Select a product Version and Build from the list boxes. If a build information file is available on the execution server, you have the option to check the Read from Build Information file check box, in which case build information will be read from the build information file for the test run, overriding any manual settings that have been selected on the New Execution Definition dialog box.
7. Specify a Priority for the execution definition from the list box (Low, Normal, or High).
8. In the Source Control Label field you can optionally specify that the execution definition be of an earlier version than the latest version.
9. Click OK to update the Execution tree with the newly created execution definition.

Related Concepts Test Definition Execution Execution Definition Schedules Build Information Related Procedures Managing Test Executions - Quick Start Task Execution Definitions Working with Execution Definitions Executing Test Definitions Creating an Execution Definition in Grid View Related Reference Execution Unit Interface HTML Support for Description Text Boxes


Copying Execution Definitions


To copy and paste an execution definition:
1. Click Execution on the workflow bar.
2. Select an execution definition in the Execution tree.
3. Click Copy on the toolbar (or right-click the execution-definition node and select Copy).
4. Select the target folder where the execution definition is to be pasted.
5. Click Paste on the toolbar (or right-click the execution-definition node and select Paste). The Execution tree is updated with a copy of the pasted execution definition. All assigned test definitions, filters, and scheduling parameters are copied along with the execution definition.

Related Concepts Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Execution Definitions Working with Execution Definitions Executing Test Definitions Related Reference Execution Unit Interface


Deleting Execution Definitions


To delete an execution definition
1. Click Execution on the workflow bar.
2. Select an execution definition in the Execution tree.
3. Click Delete on the toolbar (or right-click the execution-definition node and select Delete).
4. Click Yes on the deletion confirmation dialog to remove the execution definition from the Execution tree.

When deleting an execution definition, the run results of assigned test definitions are also deleted. The test definition run results may still appear in reports, because they are stored in the database, which is not immediately updated after the deletion of the execution definition.

Related Concepts Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Execution Definitions Working with Execution Definitions Executing Test Definitions Related Reference Execution Unit Interface


Editing Execution Definitions


To edit an existing execution definition
1. Click Execution on the workflow bar.
2. Select the node of the execution definition you are editing.
3. Click Edit on the toolbar (or right-click the execution-definition node and select Edit). The Edit Execution Definition dialog box displays.
4. Edit the execution definition by modifying the criteria, such as the description and values, defined in the Edit Execution Definition dialog box. If there are no runs and no test definitions assigned to the execution definition, you can choose an alternative test container for the execution definition from the Test Container list box.
5. Click OK to save the edited execution definition.

Related Concepts Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Execution Definitions Working with Execution Definitions Executing Test Definitions Related Reference Execution Unit Interface


Working with SilkPerformer Projects


This section explains how to work with SilkPerformer projects within Test Manager.
In This Section
Analyzing SilkPerformer Test Results - Describes how to analyze SilkPerformer test results.
Downloading SilkPerformer Test Result Packages - Describes how to download SilkPerformer test result packages.
Downloading SilkPerformer Projects - Describes how to download a SilkPerformer project.
Editing SilkPerformer Test Properties - Describes how to edit SilkPerformer test properties.
Executing Attended SilkPerformer Tests - Describes how to execute an attended SilkPerformer test.
Opening SilkPerformer Projects - Describes how to open a SilkPerformer project from Test Manager.
Uploading SilkPerformer Test Results - Describes how to upload SilkPerformer test results.


Analyzing SilkPerformer Test Results


Performance Manager enables in-depth analysis of SilkPerformer test results. The Analyze Results option downloads only selected results, in contrast to downloading result packages. To assist you in analyzing the results of your optimization efforts, Performance Explorer even allows you to compare statistics from multiple test runs side-by-side in cross load-test reports. The results of tests that are run using Test Manager can be automatically loaded into Performance Manager through commands on the Runs tab in the Test Plan unit. See the Performance Manager documentation for full details regarding use of Performance Manager and its integration with Test Manager.

To open SilkPerformer test results in Performance Manager


1. Select Test Plan on the workflow bar.
2. Select the test definition you are interested in viewing.
3. Select the Runs tab.
4. Click the Analyze Results icon of the test execution for which you want to download results. A File Download dialog box displays, showing you the name of the Performance Manager command (.sppecmd) file that you are about to download.
5. Click Open to open the results in Performance Manager (alternatively, you can click Save to save the results locally). If not already open in the background, Performance Manager now opens, connected directly to your Test Manager installation, and fetches the results of the selected execution run.
Note: To prepare for a cross load-test report that compares the results of multiple executions in a single report, you may download the results of additional executions from the Runs tab. Additional execution results are displayed in the existing instance of Performance Manager, on its Test Manager tab. See the Performance Manager documentation for more details regarding cross load-test reports.

Related Concepts Working With SilkPerformer Projects Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Working with SilkPerformer Projects Executing Test Definitions Related Reference Execution Unit Interface


Downloading SilkPerformer Test Result Packages


Downloading result packages is the ideal option if you want to analyze the complete results set of a test run, or if you want to download the complete results set for offline analysis. Because result packages often include large TrueLog On Error files, result packages can be compressed and downloaded to your local hard drive as .lrz files. Downloading results locally can also be useful when you are working from a slow Internet connection.

To download SilkPerformer test results


1. Click Test Plan on the workflow bar.
2. Select a SilkPerformer test definition.
3. Select the Runs tab.
4. Click the Download Results icon of the test execution for which you want to download results. A File Download dialog box displays, showing you the name of the compressed results package (.ltz) file that you are about to download.
5. Click Open to open the results in Performance Manager (alternatively, you can click Save to save the results locally). If not already open in the background, Performance Manager now opens. You are presented with an Import Project dialog box that indicates the target directory to which the results will be saved. Click OK to accept the default path, or click Browse to select an alternate path. The downloaded results are then displayed in Performance Manager.
Note: If you accept the default Projects directory where result packages are typically stored (generally recommended), then the results will be stored with all other SilkPerformer results and will be readily accessible through the Performance Manager Add Loadtest Results command.

Related Concepts Working With SilkPerformer Projects Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Working with SilkPerformer Projects Executing Test Definitions Related Reference Execution Unit Interface


Downloading SilkPerformer Projects


Whereas opening a SilkPerformer project may involve checking out a SilkPerformer project from a source-control tool, editing the project in SilkPerformer, and checking the project back into Test Manager, downloading a project involves downloading a copy of a project and working with it independently of Test Manager. Changes you make to downloaded projects are not automatically migrated back to Test Manager.

To download a SilkPerformer project


1. Click Test Plan on the workflow bar.
2. Select a SilkPerformer test definition in the Test Plan tree view.
3. On the Properties tab, scroll down to the SilkPerformer Test Properties section.
4. Click the Download SilkPerformer Project icon. A file download dialog box displays, asking you to confirm that you wish to download the specified SilkPerformer project to your local system.
5. Click Save to open the file in SilkPerformer. If not already open in the background, SilkPerformer will be invoked. The Select Target Directory dialog box displays, loaded with the default directory path to which the specified SilkPerformer project will be saved. If you approve of the specified pathname, click OK, otherwise click Browse to specify an alternate path.
Note: Even if you have configured source-control integration, you will not be prompted to check out the SilkPerformer project from your source-control system, because you are working with this file independently of Test Manager.
Note: SilkPerformer projects utilized by Test Manager can also be downloaded directly from the SilkPerformer user interface. See SilkPerformer documentation for details.

Related Concepts Working With SilkPerformer Projects Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Working with SilkPerformer Projects Executing Test Definitions Related Reference Execution Unit Interface


Editing SilkPerformer Test Properties


Note: To use Test Manager's data-driven test functionality with SilkPerformer scripts, data sources with column names matching the corresponding SilkPerformer project attributes must be used in conjunction with AttributeGet methods.
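The following minimal sketch illustrates the note above in SilkPerformer's BDL scripting language. The attribute name "username" and the printed message are placeholders; AttributeGetString is used here as one of the AttributeGet methods mentioned above. In a data-driven run, the value is supplied from the data-source column whose name matches the project attribute.

    // Minimal sketch (placeholder attribute name): reads the project attribute
    // "username", which a data-driven execution fills from the matching data-source column.
    dcltrans
      transaction TMain
      var
        sUser : string;
      begin
        sUser := AttributeGetString("username");
        Print("Running test step for user: " + sUser);
      end TMain;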

To edit SilkPerformer test properties


1. Click Test Plan on the workflow bar.
2. Select a SilkPerformer test definition.
3. Select the Properties tab.
4. Scroll down to the SilkPerformer Test Properties section.
5. Click Edit SilkPerformer Test Properties.
6. Proceed with the configuration of your SilkPerformer test.

Related Concepts Working With SilkPerformer Projects Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Working with SilkPerformer Projects Executing Test Definitions Related Reference Execution Unit Interface


Executing Attended SilkPerformer Tests


Attended tests are SilkPerformer tests that are executed manually in SilkPerformer and are not executed automatically based on a pre-defined schedule in Test Manager. Note: To use Test Manager's data-driven test functionality with SilkPerformer scripts, data sources with column names matching the corresponding SilkPerformer project attributes must be used in conjunction with AttributeGet methods.

To execute an attended test run in SilkPerformer


1. Click Test Plan on the workflow bar.
2. Select a test definition.
3. Select the Properties tab.
4. Scroll down to the SilkPerformer Test Properties section.
5. Click Run Attended Test. A File Download dialog box displays, asking you to confirm that you wish to run the specified SilkPerformer command file (.spwbcmd).
6. Click Open to open the project in SilkPerformer. If not already open in the background, SilkPerformer will be invoked. The Select Target Directory dialog box opens, loaded with the default directory path to which the specified SilkPerformer project will be saved. If you approve of the specified pathname, click OK, otherwise click Browse to specify an alternate path. The SilkPerformer Workload Configuration dialog box opens with all of the workload settings that are associated with the SilkPerformer project. Edit the workload settings as required and click Run to begin the test and monitor the test results with SilkPerformer.
Note: Clicking Run without editing any workload settings executes the SilkPerformer test in exactly the same way as if the test had been executed directly from Test Manager as an unattended test.

Related Concepts Working With SilkPerformer Projects Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Working with SilkPerformer Projects Executing Test Definitions Related Reference Execution Unit Interface


Opening SilkPerformer Projects


To open a SilkPerformer project from Test Manager
1. Select Test Plan on the workflow bar.
2. Select a SilkPerformer test definition.
3. Select the Properties tab.
4. Scroll down to the SilkPerformer Test Properties section.
5. Click the Open SilkPerformer Project icon. A file download dialog box displays, asking you to confirm that you wish to open the specified SilkPerformer command file (.spwbcmd) in SilkPerformer.
6. Click Open to open the file in SilkPerformer. If not already open in the background, SilkPerformer will be invoked. The Select Target Directory dialog opens, loaded with the default directory path to which the specified SilkPerformer project will be saved. If you approve of the specified pathname, click OK, otherwise click Browse to specify an alternate path.
7. If you have configured source-control integration for Test Manager (for example, Visual SourceSafe), you will now be presented with a login screen for your source-control client. Enter valid user connection settings and click OK to continue.
Note: SilkPerformer projects utilized by Test Manager can also be opened directly from SilkPerformer. See SilkPerformer documentation for details.

Related Concepts Working With SilkPerformer Projects Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Working with SilkPerformer Projects Executing Test Definitions Related Reference Execution Unit Interface


Uploading SilkPerformer Test Results


Once you have completed running an attended test in SilkPerformer, you can upload the test results to Test Manager and associate the results with a test definition.

To upload results from an attended SilkPerformer test


1. Run an attended SilkPerformer test.
2. When the test is complete, select the Upload Results to Test Manager command from the Results menu. The Login screen of the Upload Results to Test Manager wizard displays.
3. Enter your password and click Next.
Note: Because this is an attended test, the wizard already knows the appropriate hostname and username of the test definition to which these results are to be uploaded.
4. If not already selected by default in the project list, select the SilkCentral Test Manager project to which you want to upload the SilkPerformer results.
5. If not already selected by default in the tree list, select the test definition to which you want to upload the results. Click Next.
Note: You can right-click in the tree and use the commands on the context menu to create a new test definition, child test definition, test folder, and/or child test folder to which the results can be saved.
6. On the subsequent screen you can specify Version and Build numbers for the assigned Product to which the uploaded results belong. Also specify the SilkPerformer Test result status (for example, Passed or Failed).
Note: If any errors occurred during the test run, the Test Result status will be set to Failed by default.
7. Click Finish to upload the results. Uploaded results appear in Test Manager on the Runs tab (Test Plan unit) in the Test Definition Runs column.

Related Concepts Working With SilkPerformer Projects Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Executing Attended SilkPerformer Tests Working with SilkPerformer Projects Executing Test Definitions Related Reference Execution Unit Interface


Collapsing or Expanding the Execution Tree


You can consolidate levels of the hierarchy or display all levels of the hierarchy based on your viewing needs.

To collapse or expand the Execution tree


1. Click Execution on the workflow bar.
2. Right-click a node within the Execution tree and select a collapse or expand option.

Related Concepts Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Executing Test Definitions Related Reference Execution Unit Interface


Configuring Setup and Cleanup Executions


To define a test definition as a setup or cleanup test definition
1. Click Execution on the workflow bar.
2. Click the execution definition for which you are assigning a setup or cleanup test definition.
3. Click the Setup/Cleanup tab. To define a setup test definition, proceed with the following step. To define a cleanup test definition, proceed with step 7.
4. Click Edit in the Setup Test Definition portion of the tab. The Edit Setup Test Definition dialog box displays.
5. Browse through your project's test planning tree and select the test definition that is to serve as this execution definition's setup test definition.
6. Click OK. The assigned test definition then displays in the Setup Test Definition list.
7. Click Edit in the Cleanup Test Definition portion of the tab. The Edit Cleanup Test Definition dialog box displays.
8. Browse through your project's test planning tree and select the test definition that is to serve as this execution definition's cleanup test definition.
9. Click OK. The assigned test definition now displays in the Cleanup Test Definition list.

Related Concepts Setup and Cleanup Test Definitions Execution Definitions Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Executing Test Definitions Related Reference Execution Setup/Cleanup tab


Creating Data-Driven Execution Definitions


Note: For details on configuring data sources for data-driven tests, see the SilkCentral Administration Module documentation. For details on configuring test definitions for data-driven tests, see Working with Data-Driven Tests.

To create an execution definition for a data-driven test


1. Click Execution on the workflow bar.
2. Create an execution definition using a data-driven test definition.
Note: When a test definition is specified as having each data row as a single test definition, the execution definition includes a separate test definition for each data row (a short illustration follows this procedure). To create an execution definition with only a selection of data-driven test definitions, you need to assign test definitions with the filter option. See the related concept for details.
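To illustrate the note above, assume a simple comma-separated data source (the file name and column names below are hypothetical) assigned to a test definition that is configured so that each data row is a single test definition:

    username,password
    alice,secret1
    bob,secret2

An execution definition created from this test definition would then include two test definitions, one for the "alice" row and one for the "bob" row.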

To view execution activities for data-driven tests


1. In the Execution unit, select an execution definition that is based on a data-driven test definition.
2. Click Activities in the workflow bar.
3. Click the Run ID of the relevant execution definition.
4. In the Assigned Test Definitions table, click the name of a data-driven test definition. The results page for that particular test definition opens.
Note: If you are running a multiple data-driven test, you will see one test definition for each data row in your data source.
5. Select the Data Driven tab. Here you can view all instances of the test definition that were executed.
Note: The test definition's data-driven properties are listed on the Details tab in the Data Driven Properties table.
6. Click an instance name to view test-definition run details for that specific instance.
Note: If you are working with multiple data-driven test definition instances, a separate instance will be created for each data row in your data source.
7. Click the Parameters tab to view the data source values that were used during this specific test run.


Related Concepts Data-Driven Tests Execution Definitions Test Definition Execution SilkTest Test Definitions Automated Execution of SilkTest Test Definitions Related Procedures Managing Test Executions - Quick Start Task Creating Test Definitions Working with Data-Driven Tests Executing Test Definitions Related Reference Execution Unit Interface Execution Assigned Test Definitions Tab


Managing Issues
This section explains how to manage issues with SilkCentral Issue Manager. In This Section Tracking Issues This section explains how to track issues with SilkCentral Issue Manager. Working with Issues This section explains how to work with issues in Test Manager.


Tracking Issues
This section explains how to track issues with SilkCentral Issue Manager. In This Section Viewing Issue Statistics in Details View Describes how to view issue statistics in the Details View. Viewing Issue Statistics in Document View Describes how to view issue statistics in Document View.


Viewing Issue Statistics in Details View


To view issue statistics in Details View
1. Click Issues on the navigation tree.
2. Click Details View on the toolbar.
3. Select the tree node (project, issue-tracking system, or product) for which you want to view statistics.
4. The calendar feature enables you to specify the time period over which you want to view issue statistics. Click the time-frame dates link to expand the calendar.
5. Using the calendar's From and To list boxes, specify start and end times for issue statistics.
6. Click Update to update the chart view based on the specified time range.

Related Concepts Issue Management SilkCentral Issue Manager Test Definition Execution Upload Manager Related Procedures Managing Test Executions - Quick Start Task Tracking Issues Working with Issues Executing Test Definitions Related Reference Issues Unit Interface Execution Unit Interface Calendar Tool


Viewing Issue Statistics in Document View


To view issue statistics in Document View
1. Click the Issues link on the menu tree.
2. Click Document on the toolbar.
3. Select the tree node (project, issue-tracking system, or product) for which you want to view statistics.

Related Concepts Issue Management SilkCentral Issue Manager Test Definition Execution Upload Manager Related Procedures Managing Test Executions - Quick Start Task Tracking Issues Working with Issues Executing Test Definitions Related Reference Issues Unit Interface Execution Unit Interface


Working with Issues


This section explains how to work with issues in Test Manager.
In This Section
Assigning External Issues - Describes how to assign an existing external issue to a test definition.
Creating New Issues - Describes how to create a new issue.
Deleting Issues - Describes how to delete an issue.
Specifying a Calendar Range - Select a calendar range to view issues in a certain time frame.
Synchronizing Internal/External Issue States - Describes how to synchronize issue states between Test Manager and an external issue-tracking system.


Assigning External Issues


The Issues tab enables you to assign issues in externally configured issue tracking systems to a selected test definition.

To assign an existing external issue to a test definition


1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select the node of the test definition for which you want to assign an external issue.
4. Select the Issues tab.
5. Click Assign External Issue to open the Assign External Issue dialog box.
6. Select the profile of the pre-configured, external issue-tracking system where the issue is tracked.
7. In the External ID field, manually enter the unique alpha-numeric ID of an existing issue in the external issue-tracking system.
8. Click OK.

Related Concepts Issue Management Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Working with Issues Managing Test Plans Related Reference Test Plan Issues Page Issues Unit Interface


Creating New Issues


The Issues tab enables you to easily enter issues related to the selected test definition.

To create a new issue


1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select the node of the test definition for which you want to create a new issue.
4. Select the Issues tab.
5. Click New Issue to open the New Issue dialog box.
6. Select the profile of the issue-tracking system you are submitting the issue to. The Profile list box shows the internal (always available) and any external issue-tracking profiles that you may have created. Select Internal to save the issue only to the Test Manager database. Select an external profile to have the new issue saved to both the external tool and Test Manager.
Note: The profile you select here becomes the default selection for when you enter new issues in the future.
Note: When adding a new issue to an external tracking system, you will be prompted to provide login credentials for the external system. The credentials that you provide will be automatically preselected for you in the future. If you do not provide credentials, the default credentials stored in the profile will be used.
7. Enter a brief Synopsis of the issue.
8. Enter a meaningful Description of the issue.
9. Specify the Status of the issue (Open, Fixed, Verified, Deferred, Closed). When using an external profile, the status is set by the external tool.
10. Specify the ExternID of the corresponding issue in the external issue-tracking profile.
Note: The ExternID is the corresponding issue ID in the external tool. This option is disabled when you have specified an external issue-tracking profile because the external tool sets this value. When the Internal profile is selected, this value can be set manually.
11. Specify the ExternLink of the issue-tracking profile.
Note: The ExternLink is the HTTP link to the issue in the external tool. This option is disabled when you have specified an external issue-tracking profile because the external tool sets this value (when the tool offers direct HTTP links to issues, as is the case with Issue Manager). When a link is specified, the ExternID is shown as a link in the issue list.
Note: Depending on the issue-tracking profile you are working with, the New Issue dialog box may include other tracking fields that are specific to the external issue-tracking tool.
12. Click Save to save the new issue.
Note: Issue Manager determines the ID numbers of newly created issues.


Related Concepts Issue Management Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Working with Issues Managing Test Plans Related Reference Test Plan Issues Page Issues Unit Interface


Deleting Issues
To delete an issue from the issue-tracking system
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select the node of the test definition for which you want to delete an issue.
4. Select the Issues tab.
5. Click the delete icon of the issue you want to delete.
6. Click Yes on the Delete Issue dialog box to confirm the deletion. External issues are not affected when internal issues are deleted.

Related Concepts Issue Management Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Working with Issues Managing Test Plans Related Reference Test Plan Issues Page


Specifying a Calendar Range


To view issues within a specific calendar time frame:
1. Click Issues on the navigation tree.
2. Select an issue in the menu tree.
3. Select the Details view.
4. Click in the top-left corner of the tab view to open the calendar.
5. Specify the From and To date/time for which you want to view issues.
6. Click Update to refresh the tab view with the issue listings that fall within the time frame you have specified.

Related Reference Calendar Tool


Synchronizing Internal/External Issue States


To synchronize issue states between Test Manager and an external issue-tracking system:
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select the node of the test definition for which you are updating a corresponding external issue.
4. Select the Issues tab.
5. Click Update states of external Issues to synchronize the state of the issues listed in Test Manager with the corresponding issues in the external tool.

Related Concepts Issue Management Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Working with Issues Managing Test Plans Related Reference Test Plan Issues Page Issues Unit Interface


Managing Projects
This section explains how to manage projects in Test Manager. In This Section Managing Folders This section explains how to manage folders in Test Manager. Creating Build Information Files How to create a dedicated file that contains appropriate build information for a Test Manager project. Selecting Projects How to select projects in Test Manager.


Managing Folders
This section explains how to manage folders in Test Manager.
In This Section
Copying Folders - How to copy a folder.
Cutting Folders - How to cut a folder.
Deleting Folders - How to delete a folder.
Editing Folders - How to edit a folder.
Pasting Folders - How to paste folders.
Pasting Folders as Child Folders - How to paste folders as child folders.
Sorting Folders - How to sort folders.
Adding Folders - How to add a new folder.


Copying Folders
To copy a folder:
1. To copy a test executions folder, click Execution on the workflow bar. To copy a reports folder, click Reports on the workflow bar.
2. Select a folder in the Reports/Execution tree.
3. Click Copy on the toolbar to add a copy of the folder and its contents to the clipboard.

Related Concepts Successful Test Management Related Procedures Managing Folders Managing Projects Related Reference Projects Unit Interface


Cutting Folders
Cutting a folder differs from deleting a folder in that the folder and its contents are saved to the clipboard for subsequent pasting.

To cut a folder:
1. To cut a test executions folder, click Execution on the workflow bar. To cut a reports folder, click Reports on the workflow bar.
2. Select a folder in the Reports/Execution tree.
3. Click Cut on the toolbar to move the folder and its contents to the clipboard.
Note: When you cut a folder, all sub-folders and reports contained within the folder are displayed in blue italics. Elements remain in this state until you select a new location and click Paste on the toolbar (or until you right-click an element within the cut group and select Undo Cut).

Related Concepts Successful Test Management Related Procedures Managing Folders Managing Projects Related Reference Projects Unit Interface


Deleting Folders
To delete a folder:
1. To delete a test executions folder, click Execution on the workflow bar. To delete a reports folder, click Reports on the workflow bar.
2. Select a folder in the Reports/Execution tree.
3. Click Delete on the toolbar.
4. On the confirmation dialog box, click OK to permanently delete the folder.

Related Concepts Successful Test Management Related Procedures Managing Folders Managing Projects Related Reference Projects Unit Interface


Editing Folders
To edit an existing folder:
1. To edit a test executions folder, click Execution on the workflow bar. To edit a reports folder, click Reports on the workflow bar.
2. Select a folder in the Reports/Execution tree.
3. Click Edit Folder on the toolbar.
4. On the Edit Folder dialog box, edit the Name and Description of the folder.
5. Check the Share this folder with other users check box if you want to make this folder available to other users.
6. Click OK to accept your changes.

Related Concepts Successful Test Management Related Procedures Managing Folders Managing Projects Related Reference Projects Unit Interface


Pasting Folders
To paste a folder:
1. To paste a test executions folder, click Execution on the workflow bar. To paste a reports folder, click Reports on the workflow bar.
2. Select an existing node (report, execution definition, or other folder) in the Reports/Execution tree where you want the copied folder to appear.
3. Click Paste on the toolbar.
4. The folder will appear on the same node level as the destination node you select.

Related Concepts Successful Test Management Related Procedures Managing Folders Managing Projects Related Reference Projects Unit Interface


Pasting Folders as Child Folders


To paste a folder as a child folder:
1. To paste a test execution folder as a child folder, click Execution on the workflow bar. To paste a reports folder as a child folder, click Reports on the workflow bar.
2. Select an existing node (report, execution definition, or other folder) in the Reports/Execution tree where you want the copied folder to appear.
3. Click Paste as child on the toolbar.
4. The folder will appear as a sub-node of the selected node.

Related Concepts Successful Test Management Related Procedures Managing Folders Managing Projects Related Reference Projects Unit Interface


Sorting Folders
To move a folder up or down within the Reports and Execution trees
1. To move a test executions folder, click Execution on the workflow bar. To move a reports folder, click Reports on the workflow bar.
2. Select the folder you want to move.
3. Click either Move Up or Move Down on the toolbar.

Related Concepts Successful Test Management Related Procedures Managing Folders Managing Projects Related Reference Projects Unit Interface


Adding Folders
To add a new folder:
1. To add a test executions folder, click Execution on the workflow bar. To add a reports folder, click Reports on the workflow bar.
2. Select an existing node (report, execution definition, or other folder) in the tree where you want the new folder to display. The folder will display as a sub-node of the selected folder level.
3. Click Add Folder on the toolbar. The New Folder dialog box displays.
4. Specify a Name and Description for the folder.
5. Check the Share this folder with other users check box if you want to make this folder available to other users.
6. Click OK to create the folder.

Related Concepts Successful Test Management Related Procedures Managing Folders Managing Projects Related Reference Projects Unit Interface


Creating Build Information Files


To create a build information file
1. On both the application and execution servers, navigate to: C:\Documents and Settings\All Users\Application Data\Borland\SCC32\BuildInfos
2. Create a build information file for your project based on the template file BuildInfoExample.xml (shown in the following and available at the previous path):

    <?xml version="1.0" encoding="utf-8"?>
    <ProjectBuildInfo>
      <BuildEntryList>
        <BuildEntry name="Demo Product">
          <Version>3.1</Version>
          <BuildNr>350</BuildNr>
        </BuildEntry>
        <BuildEntry name="Product2">
          <Version>4.2</Version>
          <BuildNr>613</BuildNr>
        </BuildEntry>
      </BuildEntryList>
    </ProjectBuildInfo>

Note: To improve the structure of build information files, an element called BuildEntryList, which contains a list of BuildEntry elements, has been created. BuildEntry tags refer to specific products that are defined by the name attribute of BuildEntry elements.
3. Modify the file content to fit your environment (a customized example follows this procedure):
   Version (used on both application and execution servers): version number currently available for testing (not necessarily the same for each execution server).
   BuildNr (used on both application and execution servers): build number currently available for testing (not necessarily the same for each execution server).
4. Distribute the build information file to the execution servers: C:\Documents and Settings\All Users\Application Data\Borland\SCC32\BuildInfos
Note: When stored on both the application server and execution servers, build information files must have the exact same name.
5. Once you have created the build information files on the application server and each execution server, you must specify the file name in the settings of the corresponding project. Select the Projects unit to view the list of projects assigned to you, then select the project to which you want to link the build information.
Note: This must be done before the scheduling of any test definitions for the project. Otherwise previously scheduled test definitions will not be updated.
6. Select the Project Settings tab.
7. Click Edit to edit the project settings of the selected project on the Edit Project Settings dialog box.
8. Specify the name of the previously created XML file in the Build information file name field.
9. Click OK to update the information. With all future test executions, Test Manager will read build information from the corresponding file and match test results with that information.
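As a customized example, a build information file adapted for a single hypothetical product named "Calculator" (the product name, version, and build number below are placeholders) follows the same structure as the template:

    <?xml version="1.0" encoding="utf-8"?>
    <ProjectBuildInfo>
      <BuildEntryList>
        <BuildEntry name="Calculator">
          <Version>2.0</Version>
          <BuildNr>128</BuildNr>
        </BuildEntry>
      </BuildEntryList>
    </ProjectBuildInfo>

The name attribute identifies the product the entry applies to (as described in the note above), and the same file must be stored under the same name on the application server and on each execution server.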


Related Concepts Build Information Build Information Updates Successful Test Management Related Procedures Managing Projects Managing a Successful Test Related Reference Projects Unit Interface


Selecting Projects
To select a project
1. Navigate to Projects → Projects.
2. Click the name of any project to select it.

Related Concepts Test Definition Execution Related Procedures Managing Projects Executing Test Definitions Related Reference Projects tab


Managing Activities
This section explains how to manage upcoming, current, and recently-executed test runs.
In This Section
Deleting Last Executions Runs - How to delete a run from the Last Executions list.
Displaying/Hiding Columns on the Activities Page - Describes how to display/hide columns on the Activities page.
Entering Issues From the Activities Tab - Explains how to enter issues from the Activities tab.
Filtering Test Runs on the Activities Page - Describes how to filter test results and execution definitions on the Activities page.
Grouping Test Runs on the Activities Page - Describes how to group execution definitions or test results for easier viewing on the Activities page.
Removing Activities Filters - Explains how to remove filters that have been applied to columns on the Activities page.
Reordering Columns on the Activities Page - Describes how to reorder columns on the Activities page.
Resizing Columns on the Activities Page - Describes how to change the width of columns on the Activities page.
Restoring Default Activities Page View Settings - Explains how to restore the default view settings on the Activities page.
Sorting Test Runs on the Activities Page - Describes how to sort test runs on the Activities page.


Deleting Last Executions Runs


To delete a run from the Last Executions list
1. Click Activities on the workflow bar.
2. In the Last Executions area of the Activities tab, right-click the test run you want to delete and select Delete Run Results. A dialog box displays, asking you to confirm the deletion.
3. Click OK.

Related Concepts Project Management Test Definition Execution Related Procedures Managing Projects Executing Test Definitions Related Reference Activities Page


Displaying/Hiding Columns on the Activities Page


To display/hide columns on the Activities page:
1. Click Activities on the workflow bar.
2. Right-click a column header.
3. Expand the Columns submenu to view all the columns that are available in the project.
4. Select the check boxes of all the columns that you want to have displayed. Your column-display preferences will be saved and displayed each time you open the active project.

Related Concepts Project Management Test Definition Execution Related Procedures Managing Projects Executing Test Definitions Related Reference Activities Page


Entering Issues From the Activities Tab


New issues can be entered directly on the Activities tab.

To enter an issue from the Activities tab


1. Click Activities on the workflow bar.
2. In the Last Executions area, click the Run ID of the relevant execution definition to view test-execution results. Each test definition associated with the execution run is listed in the Assigned Test Definitions table at the bottom of the view.
3. Click Create a new issue for this test definition (in the Actions column) of the test definition to which you want to associate the issue. Proceed with defining the issue.

Related Concepts Project Management Test Definition Execution Related Procedures Managing Projects Executing Test Definitions Related Reference Activities Page


Filtering Test Runs on the Activities Page


You can filter the views on the Activities page based on column values. You can specify filter strings to be applied to text-based data fields, calendar filters (using Before, After, or On operators) for date-based fields, and numerical operators (>, <, and =) for number-based fields.

To filter text-based values on the Activities page:


1. Click Activities on the workflow bar.
2. Right-click the header of the text-based column that the filter is to be based on.
3. Expand the Filter submenu on the context menu to display the Filter text box.
4. Enter a text string into the text box.
5. Press ENTER. All entries that match the filter criteria (for example, in the case of execution-definition names, all execution-definition names that include the specified string) are then dynamically displayed in the filtered list.

To filter date-based values on the Activities page:


1. Click Activities on the workflow bar.
2. Right-click the header of the date-based column that the filter is to be based on.
3. Hold your cursor over Filter on the context menu to display the Before, After, or On submenu.
4. Hold your cursor over After to define a date before which (and including) all entries should be excluded. Hold your cursor over Before to define a date after which (and including) all entries should be excluded. Hold your cursor over On to exclude all entries except those that have the specified date. The calendar tool displays.
5. Select a date using the calendar tool (or click Today to specify today's date).
Tip: You must explicitly click a date on the calendar tool or press ENTER to activate date-based filtering changes.

All entries that match the filter criteria are then dynamically displayed in the filtered list.

To filter number-based values on the Activities page:


1. Click Activities on the workflow bar.
2. Right-click the header of the number-based column that the filter is to be based on.
3. Expand the Filter submenu on the context menu to display the > (greater than), < (less than), and = (equals) operators.
4. Enter a number in the > text box to define a number less than which (and including) all entries should be excluded. Enter a number in the < text box to define a number greater than which (and including) all entries should be excluded. Enter a number in the = text box to exclude all entries except those that have the specified number.
Note: Number values are rounded to two decimal places.
5. Press ENTER.


All entries that match the filter criteria are then dynamically displayed in the filtered list.

To filter Boolean values on the Activities page:


1. Click Activities on the workflow bar.
2. Right-click the header of the Boolean-based column that the filter is to be based on.
3. Expand the Filter submenu on the context menu to display the available values.
4. Click one of the Yes or No option buttons. All entries that match the filter criteria are then dynamically displayed in the filtered list.

To filter values using a predefined list on the Activities page:


1. Click Activities on the workflow bar.
2. Right-click the header of the column that has a predefined filter value that the filter is to be based on.
3. Expand the Filter submenu on the context menu to display the available values.
4. Check the check boxes of the filter values that you are interested in. All entries that have one of the selected criteria will be displayed.

Related Concepts Project Management Test Definition Execution Related Procedures Managing Projects Executing Test Definitions Related Reference Activities Page


Grouping Test Runs on the Activities Page


Beyond simply sorting by column, you can chunk entries into groups to facilitate viewing. Groups are based on commonly-shared values within the column that grouping is based on.

To group entries on the Activities page:


1. Click Activities on the workflow bar.
2. Right-click the header of the column that the grouping is to be based on.
3. Select Group by This Field. Entries are then organized into groups based on commonly-shared values within the column you have selected.

To remove grouping:
1. Click Activities on the workflow bar.
2. Right-click any column.
3. Uncheck the Show in Groups check box.

Related Concepts Project Management Test Definition Execution Related Procedures Managing Projects Executing Test Definitions Related Reference Activities Page


Removing Activities Filters


Note: Hiding a column removes all filters that have been applied to the column.

To remove a specific filter:


1. Click Activities on the workflow bar.
Note: You can identify filtered columns by their titles, which are displayed in bold, italic text.
2. Right-click the header of the column that has the filter you want to remove.
3. Uncheck the Filter check box.

To remove all filters:


1. Click Activities on the workflow bar.
2. Right-click any column header.
3. Select Reset Filters.

Related Concepts Project Management Test Definition Execution Related Procedures Managing Projects Executing Test Definitions Working with Filters Related Reference Activities Page


Reordering Columns on the Activities Page


To reorder columns on the Activities page:
1. Click Activities on the workflow bar.
2. Select the column header of the column you want to move.
3. Drag the column to the desired position and release it. Your column-order preferences will be saved and displayed each time you open the active project.

Related Concepts Project Management Test Definition Execution Related Procedures Managing Projects Executing Test Definitions Related Reference Activities Page


Resizing Columns on the Activities Page


To adjust the width of columns on the Activities page:
1. Click Activities on the workflow bar.
2. Select the vertical column-header divider of the column you want to adjust.
3. Drag the column boundary to the desired position and release it. Your column-width preferences will be saved and displayed each time you open the active project.

Related Concepts Project Management Test Definition Execution Related Procedures Managing Projects Executing Test Definitions Related Reference Activities Page


Restoring Default Activities Page View Settings


Restoring default view settings resets all user-defined settings (column order, column width, shown/hidden columns, applied filters, sorting, and grouping) for the current project.

To restore default view settings:


1. Click Activities on the workflow bar.
2. Right-click any column header.
3. Select Reset View.

Related Concepts Project Management Test Definition Execution Related Procedures Managing Projects Executing Test Definitions Related Reference Activities Page

583

Sorting Test Runs on the Activities Page


To sort test runs on the Activities page:
1. Click Activities on the workflow bar.
2. Right-click the header of the column you want the test runs to be sorted by.
3. Select Sort Ascending to have the test runs sorted in ascending order (or select Sort Descending to have the test runs sorted in descending order). Your sort preferences will be saved and displayed each time you open the active project.

Related Concepts: Project Management, Test Definition Execution. Related Procedures: Managing Projects, Executing Test Definitions. Related Reference: Activities Page.

584

Managing Reports
This section explains how to work with reports in Test Manager.

In This Section
Creating Reports - This section explains how to create reports in Test Manager.
Customizing Reports with BIRT - This section explains how to customize Test Manager reports using BIRT.
Generating Reports - This section explains how to generate reports in Test Manager.
Adding Subreports - Describes how to add subreports.
Deleting Subreports - Describes how to delete a subreport.
Displaying Charts - Describes how to display a chart.
Accessing MRU (Most Recently Used) Reports - Explains how to access a recently-viewed report.
Editing Report Parameters - Describes how to edit report parameters.
Editing Report Properties - Describes how to edit report properties.
Printing Charts - Describes how to print charts.
Removing Charts - Describes how to remove charts.

585

Creating Reports
This section explains how to create reports in Test Manager.

In This Section
Creating New Reports - How to create a new report.
Writing Advanced Queries with SQL - How to write advanced SQL queries for reporting.

586

Creating New Reports


To create a new report:
1. Click Reports on the workflow bar.
2. In the Reports directory tree, select the folder in which you want the new report to appear. This determines where the report will be stored in the directory tree.
3. Click New Child Report on the toolbar.
4. On the Create New Report dialog box, enter the name of the new report. This is the name that will appear in the Reports tree.
5. Check the Share this report with other users check box if you want to make this report available to other users.
6. In the Timeout [s] field, specify the maximum time period in seconds that Test Manager should wait for SQL queries to complete.
7. From the Default tab list box, select the tab that you want to be directed to when you select this report from one of the context-sensitive report lists.
8. Select the corresponding result type from the Result Category list box. This setting specifies the database table and view that is to be filtered for the report. The following result types are available:
   - Requirement - Returns requirements available in the requirements module that meet the query criteria.
   - Test Definition - Returns test definitions available in the Test Plan module that meet the query criteria.
   - Test Definition Execution - Returns executed test definition results from the Executions module that meet the query criteria.
   - Execution Definition - Returns execution definitions from the execution module.
   - Issue - Returns issues (including imported issues).
   - Requirement Progress Builds - Contains information on requirements progress per build so that you can see how requirements develop across builds.
   - Requirement Progress Days - The same as Requirement Progress Builds, but shows development on a daily basis.
   - Test Definition Progress Builds - Shows how test definitions develop across builds.
   - Test Definition Progress Days - Same as above, but shows development on a daily basis.
   Each result type offers a set of selection criteria. Based on the Result Type you have selected, specify an appropriate Selection Criteria for your report. These criteria typically group properties based on a view or some other intuitive grouping (for example, custom properties).
9. From the Property list box, select the property that is to be filtered on. For some selection criteria, properties are dynamic.
10. Select an Operator for the query. The available operators depend on the property. Example operators are =, not, like, not like. Strings are always compared lowercase. Allowed wildcards for strings are * and ? (where * matches any characters and ? matches exactly one character).
11. Select or specify the Value that the query is to be filtered on. For date-based properties, the Value field is replaced with a calendar tool that you can use to select a specific date.
12. (optional) To add an additional query string to this report, click More. An existing query string can be deleted by clicking the string's Delete button. When multiple query strings are defined, AND and OR option buttons appear next to the More button. Use these option buttons to define whether the query strings should be considered cumulatively (AND), or whether only one query string's criteria needs to be met (OR).

587

13. Click Next to configure report columns on the New Report dialog box.

To create columns:

1. Click Add Columns to display the Add Columns dialog box. All available report columns are listed. Select those that you want to have included in the report and click OK (multiple columns can be selected by holding down the CTRL key).
   Note: For test-planning reports, the list of available column names is enhanced with the column names from the LQM_v_tests table. See the SilkCentral database documentation for full details.
   The selected columns appear in tabular format on the New Report dialog box. From here you can configure how each report column is to be displayed.
2. For each column, specify a sort direction (ascending, descending, or unsorted) using the up/down arrows in the Sorting column. When a column is selected for sorting, a list box is displayed in the Sort Order column that allows you to more easily edit the column-sort order. Set these numbers as required.
3. Give each column an Alias. This is the name by which each column will be labeled in the generated report.
4. With grouping, you can take advantage of SQL aggregation features (for example, selecting a number of elements or querying a total sum of values). Check the Group by check box on the column selection dialog to specify that SQL group by functions are to be applied. Columns that are not selected for SQL group by functions are set to aggregation by default (meaning, a single aggregate value will be calculated). From the Aggregation list box, select the appropriate aggregation type (Count, Sum, Average, Minimum, or Maximum).
5. The Actions column enables you to move column listings up and down in the view. The Move Up and Move Down functions do not affect the outcome of the report.
   Note: Any report column can be deleted by clicking the column's Delete button.
6. Click Finish to complete your new report.
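To make the simple-mode options concrete: a query built with Result Category "Test Definition", Property "Name", Operator "like", and Value "Login*" corresponds roughly to the SQL sketched below. Only TestDef_ID_pk_fk and the LQM_v_tests view are mentioned in this documentation; the Name column, and the exact statement Test Manager generates (including how the * wildcard is translated), are assumptions, so click Advanced or Check SQL to see the real statement for your report.

  SELECT TestDef_ID_pk_fk, Name      -- Name is an assumed column name
  FROM LQM_v_tests                   -- view cited above for test-planning reports
  WHERE LOWER(Name) LIKE 'login%'    -- strings are compared lowercase; % is the raw-SQL counterpart of the simple-mode * wildcard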

Related Concepts: New Reports, Report Generation. Related Procedures: Analyzing Test Results - Quick Start Task, Creating Reports, Customizing Reports with BIRT, Generating Reports, Using Context-Sensitive Reports, Managing Reports. Related Reference: Reports Unit Interface.

588

Writing Advanced Queries with SQL


Advanced reports can be created through manual SQL coding. Virtually any reporting option is available if you know the database schema. Clicking Advanced hides the query string list boxes explained in the section above and opens a Report data query field in which you can insert existing code or write new SQL code. One approach is to begin query-string construction using the list boxes as outlined above (if the report criteria are valid, the equivalent SQL statement will be generated and displayed), and then move to advanced mode for further modifications. Note: You cannot move from advanced mode back to simple mode.

To write an advanced query directly in SQL:

1. Click Reports on the workflow bar.
2. In the Reports directory tree, select the folder in which you want the new report to appear (Requirements, Test Plan, Issues, and so on). This determines where the report will be stored in the directory tree.
3. Click Create New Report on the toolbar.
4. On the Create New Report dialog box, enter the name of the new report. This is the name that will appear in the Reports tree.
5. Check the Share this report with other users check box if you want to make this report available to other users.
6. Enter a description of the report in the Description field.
7. Click Advanced to open the Report data query field.
8. Insert previously written code as necessary, or write new code directly in the field. To assist you in writing SQL queries, a list box of Test Manager function placeholders is available. See the following section for details regarding available placeholders. To insert one of the available pre-defined functions, select the corresponding placeholder from the Insert placeholder list box.
   Note: If you manually edit SQL code for the query, click Check SQL to confirm your work.
9. Once you have completed editing the report's properties, click Finish to save your settings.
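As an illustration of the kind of statement you might paste into the Report data query field, the following sketch counts test definitions per folder. The LQM_v_tests view is mentioned above for test-planning reports, but the TestFolder column is a hypothetical name used only for this example; verify the actual column names in the SilkCentral database documentation before reusing the query.

  SELECT TestFolder, COUNT(*) AS TestCount   -- TestFolder is a hypothetical column name
  FROM LQM_v_tests                           -- view cited for test-planning reports
  GROUP BY TestFolder                        -- one result row per folder, as with the Group by option in simple mode
  ORDER BY TestCount DESC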

Related Concepts: New Reports, Report Generation. Related Procedures: Analyzing Test Results - Quick Start Task, Creating Reports, Customizing Reports with BIRT, Generating Reports, Using Context-Sensitive Reports, Managing Reports. Related Reference: Reports Unit Interface.

589

Customizing Reports with BIRT


This section explains how to customize Test Manager reports using BIRT.

In This Section
Customizing BIRT Report Templates - How to customize reports using BIRT RCP Designer.
Downloading Report Templates - How to download report templates to edit them.

590

Customizing BIRT Report Templates


With BIRT RCP Designer, you can customize Test Manager's pre-installed report templates and create custom report templates; see the SilkCentral Administration Module documentation and the BIRT RCP Designer help for details. Modified report templates can be uploaded using the Upload link on the Report tab.

To download an existing template for editing:


1. Select a report that utilizes the BIRT Report Template from Test Manager > Reports in the menu tree.
2. Select the Properties tab.
3. Click Download BIRT Report Template. You receive the report data as a generic BIRT report template (empty). The datasource is already configured.
4. Once you have saved the template to your local system, modify it as required.
5. Once complete, upload it using the Upload link on the Report tab. For detailed information on configuring BIRT report templates, please refer to the SilkCentral Administration Module Help.

Related Concepts: New Reports, Report Generation. Related Procedures: Analyzing Test Results - Quick Start Task, Creating Reports, Customizing Reports with BIRT, Generating Reports, Managing Reports. Related Reference: Report Properties tab.

591

Downloading Report Templates


Downloading Test Manager report templates to your local system enables you to edit them through BIRT Report Designer or Microsoft Excel. After you download and edit a report, you can upload it to make it available to other users. For details see the related Uploading Report Templates procedure.

To download an existing report template for editing:


1. Select a report that utilizes the template you want to modify from Test Manager > Reports in the menu tree.
2. Select the Properties tab.
3. Click the download link of the template you want to download. The available download links are:
   - Download Excel Report Template
   - Download BIRT Report Template
   - Download as CSV
   - Download as XML
   Here are details about the available template formats:
   - MS Excel - You receive an MS Excel file with a sheet named DATA that contains the data (for example, in CSV format). This is the only affected sheet in the template, so you can specify information in adjoining sheets (for example, diagrams).
   - BIRT report template - You receive the report data as a generic BIRT report template (empty). The datasource is already configured.
   - CSV (Comma Separated Values) - You receive the report data as a CSV file. Depending on your local settings, you will receive , or ; as the delimiter character. The date is also formatted based on user settings.
   - XML - You receive the report data as XML. The advantage of this approach over CSV is that you retain all subreport data.
   - Accessing data outside of Test Manager - You can call a specific URL that offers the report data using the following format (see the example after this procedure): http://server/servicesExchange?hid=reportData&userName=<username>&passWord=<password>&reportFilterID=<ID of the report>&type=<csv|xml>

4. The File Download dialog box displays. Click Save and download the report file to your local system as a .rptdesign or .xls file, depending on the report type that you are downloading.
5. Now edit the report based on your needs using either BIRT RCP Designer (for .rptdesign files) or Excel (for .xls files).
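For example, a call using the "Accessing data outside of Test Manager" URL format described above might look like the following; the server name, credentials, and report ID 23 are placeholder values, not defaults shipped with the product:

  http://testmanager.example.com/servicesExchange?hid=reportData&userName=admin&passWord=secret&reportFilterID=23&type=csv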

592

Related Concepts: New Reports, Report Generation. Related Procedures: Analyzing Test Results - Quick Start Task, Uploading Report Templates, Creating Reports, Customizing Reports with BIRT, Generating Reports, Managing Reports. Related Reference: Report tab.

593

Generating Reports
This section explains how to generate reports in Test Manager.

In This Section
Using Context-Sensitive Reports - This section explains how to enable and access context-sensitive reports.
Removing Report Templates - Describes how to remove the current report's template.
Saving Reports - Describes how to save a report.
Uploading Report Templates - Describes how to upload a template from your local system.
Viewing a Report as a PDF - Describes how to view a report in PDF format.
Viewing Reports - Describes how to generate a report.

594

Using Context-Sensitive Reports


This section explains how to enable and access context-sensitive reports.

In This Section
Accessing Context-Sensitive Reports - This section explains how to access each unit's context-sensitive reports.
Enabling Context-Sensitive Reports - This section explains how to enable each unit's context-sensitive reports.

595

Accessing Context-Sensitive Reports


This section explains how to access each unit's context-sensitive reports.

In This Section
Accessing Context-Sensitive Execution Reports - Explains how to access context-sensitive execution-definition and execution-definition-run reports.
Accessing Context-Sensitive Requirements Reports - Explains how to access context-sensitive requirements reports.
Accessing Context-Sensitive Test-Definition Reports - Explains how to access context-sensitive test-definition reports.

596

Accessing Context-Sensitive Execution Reports


Explains how to access context-sensitive execution-definition and execution-definition-run reports. Note: Reports must be enabled as context-sensitive reports to make them available in the Executions unit.

To access a context-sensitive execution-definition report:


1. Click Executions on the workflow bar to go to the Executions unit.
2. Right-click an execution definition in the Executions menu tree and choose Reports.
3. Select a report from the Reports sub-menu. You are then taken to the report's Parameters tab in the Reports unit where the execution-definition's ID is pre-populated as a value.
   Note: You can configure this destination-tab linking behavior using each report's Edit Report dialog box.
4. Edit the report's parameters as required.
5. Advance to the report's Data, Report, or Chart tab to complete configuration of the report.

To access a context-sensitive execution-definition-run report:

1. Click Executions on the workflow bar to go to the Executions unit.
2. Click the Runs tab.
3. Right-click a run and choose Reports.
4. Select a report from the Reports sub-menu. You are then taken to the report's Parameters tab in the Reports unit where the run's ID is pre-populated as a value.
   Note: You can configure this destination-tab linking behavior using each report's Edit Report dialog box.
5. Edit the report's parameters as required.
6. Advance to the report's Data, Report, or Chart tab to complete configuration of the report.

Related Concepts: Context-Sensitive Reports. Related Procedures: Accessing Context-Sensitive Reports, Writing Advanced Queries with SQL, Enabling Context-Sensitive Reports, Creating New Reports.

597

Accessing Context-Sensitive Requirements Reports


Explains how to access context-sensitive requirements reports. Note: Context-sensitive reports are available in the Requirements unit only for those reports that accept a requirement ID as an input parameter.

To access a context-sensitive requirements report:


1. Click Requirements on the workflow bar to go to the Requirements unit.
2. Right-click a requirement in the Requirements menu tree and choose Reports.
3. Select a report from the Reports sub-menu. You are then taken to the report's Parameters tab in the Reports unit where the requirement's ID is pre-populated as a value.
   Note: You can configure this destination-tab linking behavior using each report's Edit Report dialog box.
4. Edit the report's parameters as required.
5. Advance to the report's Data, Report, or Chart tab to complete configuration of the report.

Related Concepts: Context-Sensitive Reports. Related Procedures: Accessing Context-Sensitive Reports, Enabling Context-Sensitive Reports, Writing Advanced Queries with SQL, Creating New Reports.

598

Accessing Context-Sensitive Test-Definition Reports


Explains how to access context-sensitive test-definition reports. Note: Context-sensitive reports are available in the Test Plan unit only for those reports that accept a test-definition ID as an input parameter.

To access a context-sensitive test-definition report:


1. Click Test Plan on the workflow bar to go to the Test Plan unit.
2. Right-click a test definition in the Test Plan menu tree or the Test Plan Grid View and choose Reports.
   Note: When multi-selecting test definitions in the test plan Grid View, the context-sensitive reporting is disabled.
3. Select a report from the Reports sub-menu. You are then taken to the report's Parameters tab in the Reports unit where the test-definition's ID is pre-populated as a value.
   Note: You can configure this destination-tab linking behavior using each report's Edit Report dialog box.
4. Edit the report's parameters as required.
5. Advance to the report's Data, Report, or Chart tab to complete configuration of the report.

Related Concepts: Context-Sensitive Reports. Related Procedures: Accessing Context-Sensitive Reports, Writing Advanced Queries with SQL, Enabling Context-Sensitive Reports, Creating New Reports.

599

Enabling Context-Sensitive Reports


This section explains how to enable each unit's context-sensitive reports.

In This Section
Enabling Context-Sensitive Execution Reports - Explains how to enable execution-definition and execution-definition-run reports to appear in context-sensitive report lists.
Enabling Context-Sensitive Requirements Reports - Explains how to enable requirements reports to appear in the context-sensitive report list.
Enabling Context-Sensitive Test-Plan Reports - Explains how to enable test-plan reports to appear in the context-sensitive report list.

600

Enabling Context-Sensitive Execution Reports


Explains how to enable execution-definition and execution-definition-run reports to appear in context-sensitive report lists.

To enable a simple report to appear in context-sensitive report lists in the execution tree or the runs tab:
1. Complete the steps involved in creating a simple report (see the Creating New Reports procedure), but alter that procedure with the following steps.
2. Select Execution Definition from the Result category list box.
3. Select the selection criteria for the context-sensitive report:
   - Select Execution Definition Property from the Selection criteria list box, or
   - Select Execution Definition Run from the Selection criteria list box.
4. Select ID from the Property list box.
5. Enter a value in the Value field (for example, the ID number of an existing execution definition or an existing execution definition run).
6. Click Finish.

To enable an advanced report to appear in context-sensitive report lists in the execution tree or on the runs tab:

1. Create a report that includes:
   - An execution-definition ID as an input parameter for the report to appear in the execution tree.
   - An execution-definition-run ID as an input parameter for the report to appear on the runs tab.
   To do this, complete the steps involved in creating an advanced query (see Writing Advanced Queries with SQL), but alter that procedure with the following step.
2. To make an advanced query available in the Executions context menu, insert the parameter name execProp_Id_0 as input for ExecDef_ID_pk_fk. For example, your report's SQL statement may have defined a hard-coded database-column value, such as ExecDef_ID_pk_fk = 68. To edit this report so that it receives column-name values dynamically, replace the static value of 68 with the following notation: ${execProp_Id_0 | 68}

Note: Consult SilkCentral Test Manager database documentation to find out additional information about tables and column-name definitions.
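To illustrate the notation, here is a minimal before-and-after sketch. Only ExecDef_ID_pk_fk and the ${execProp_Id_0 | 68} placeholder come from the steps above; the RES_ExecutionRuns view name is hypothetical, so check the actual tables and views in the SilkCentral database documentation.

  -- Before: the execution-definition ID is hard-coded
  SELECT *
  FROM RES_ExecutionRuns             -- hypothetical view name
  WHERE ExecDef_ID_pk_fk = 68

  -- After: the ID is supplied by the Executions context menu; 68 remains only as the fallback value
  SELECT *
  FROM RES_ExecutionRuns             -- hypothetical view name
  WHERE ExecDef_ID_pk_fk = ${execProp_Id_0 | 68}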

601

Related Concepts: Context-Sensitive Reports. Related Procedures: Accessing Context-Sensitive Reports, Enabling Context-Sensitive Reports.

602

Enabling Context-Sensitive Requirements Reports


Explains how to enable requirements reports to appear in the context-sensitive report list.

To have a simple report appear in the Requirements context-sensitive report list:


1. Complete the steps involved in creating a simple report (see the Creating New Reports procedure), but alter that procedure with the following two steps.
2. Select Requirement from the Selection criteria list box.
3. Enter a value in the Value field (for example, the ID number of an existing requirement).

To make an advanced report appear in the Requirements context-sensitive report list:

1. Create a report that includes a requirement ID as an input parameter. To do this, complete the steps involved in creating an advanced query (see Writing Advanced Queries with SQL), but alter that procedure with the following step.
2. To make an advanced query available in the Requirements context menu, insert the parameter name reqProp_Id_0 as input for Req_ID_pk_fk. For example, your report's SQL statement may have defined a hard-coded database-column value, such as Req_ID_pk_fk = 68. To edit this report so that it receives column-name values dynamically, replace the static value of 68 with the following notation: ${reqProp_Id_0 | 68}

Note: Consult SilkCentral Test Manager database documentation to find out additional information about tables and column-name definitions.

Related Concepts: Context-Sensitive Reports. Related Procedures: Accessing Context-Sensitive Reports, Enabling Context-Sensitive Reports, Writing Advanced Queries with SQL, Creating New Reports.

606

Enabling Context-Sensitive Test-Plan Reports


Explains how to enable test-plan reports to appear in the context-sensitive report list.

To have a simple report appear in the Test Plan context-sensitive report list:
1. Complete the steps involved in creating a simple report (see the Creating New Reports procedure), but alter that procedure with the following two steps.
2. Select Test Definition from the Selection criteria list box.
3. Enter a value in the Value field (for example, the ID number of an existing test definition).

To make an advanced report appear in the Test Plan context-sensitive report list:

1. Create a report that includes a test definition ID as an input parameter. To do this, complete the steps involved in creating an advanced query (see Writing Advanced Queries with SQL), but alter that procedure with the following step.
2. To make an advanced query available in the Test Plan context menu, insert the parameter name tdProp_Id_0 as input for TestDef_ID_pk_fk. For example, your report's SQL statement may have defined a hard-coded database-column value, such as TestDef_ID_pk_fk = 68. To edit this report so that it receives column-name values dynamically, replace the static value of 68 with the following notation: ${tdProp_Id_0 | 68}
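A corresponding sketch for a test-plan report is shown below. TestDef_ID_pk_fk, the ${tdProp_Id_0 | 68} placeholder, and the LQM_v_tests view are all mentioned in this documentation, but whether the view exposes that key column, and the TestName column, are assumptions to verify against the SilkCentral database documentation.

  SELECT TestDef_ID_pk_fk, TestName            -- TestName is a hypothetical column name
  FROM LQM_v_tests                             -- view cited for test-planning reports
  WHERE TestDef_ID_pk_fk = ${tdProp_Id_0 | 68} -- the Test Plan context menu supplies the selected test definition's ID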

Note: Consult SilkCentral Test Manager database documentation to find out additional information about tables and column-name definitions.

Related Concepts: Context-Sensitive Reports. Related Procedures: Accessing Context-Sensitive Reports, Enabling Context-Sensitive Reports, Writing Advanced Queries with SQL, Creating New Reports.

610

Removing Report Templates


To remove the current report's template:

1. Click Reports on the workflow bar.
2. In the Reports tree, select the report from which you want to delete a template.
3. Click the Report tab.
4. Click the report's Delete icon.
5. Select Yes on the subsequent confirmation dialog box.

Related Concepts: New Reports, Report Generation. Related Procedures: Analyzing Test Results - Quick Start Task, Creating Reports, Customizing Reports with BIRT, Generating Reports, Managing Reports. Related Reference: Report Properties tab.

614

Saving Reports
How you save a report locally depends on whether you have selected a BIRT report template or an Excel template. If you have selected an Excel template, simply click the Download link on the Report tab. This will invoke Microsoft Excel on your local computer and the report will be loaded automatically. If you have selected a BIRT report template, use the following procedure to save the report.

To export the current BIRT report as PDF:


1. Click Reports on the workflow bar.
2. In the Reports tree, select the report that you want to save.
3. Click the Report tab.
4. Click PDF on the Report view toolbar.
5. On the File Download dialog box, click Save to save the PDF document to a location of your choice.

Related Concepts: Report Generation. Related Procedures: Analyzing Test Results - Quick Start Task, Customizing Reports with BIRT, Generating Reports, Managing Reports. Related Reference: Report tab.

615

Uploading Report Templates


To upload a template from your local system:

1. Click Reports on the workflow bar.
2. Select the report to which you want to have the template associated.
3. Select the Report tab.
4. Click the Click here to upload a new report template link to open the Upload Template dialog box.
5. Give the template a meaningful Name and Description.
6. In the Projects field, select the project to which you would like to make the template available; or, select All Projects to have the template associated with all projects.
7. Click Browse. Then browse to and select the template on your local system.
8. Click OK to upload the template.

Related Concepts: New Reports, Report Generation. Related Procedures: Analyzing Test Results - Quick Start Task, Downloading Report Templates, Creating Reports, Customizing Reports with BIRT, Generating Reports, Managing Reports. Related Reference: Report tab.

616

Viewing a Report as a PDF


To view the current report in PDF format within the report browser frame:
1. Click Reports on the workflow bar.
2. In the Reports tree, select the report that you want to view.
3. Click the Report tab.
4. Click PDF on the report view toolbar. The report then displays in PDF format.

Related Concepts: Report Generation. Related Procedures: Analyzing Test Results - Quick Start Task, Customizing Reports with BIRT, Generating Reports, Managing Reports. Related Reference: Report tab.

617

Viewing Reports
Because each template expects a certain data format to produce a useful graph, not all templates can be applied to all report queries. You will receive an error message if you attempt to generate a report through an incompatible report template. For example, selecting the Four Values Per Row As Horizontal Bar template to display the Requirements Status Overview report works because this particular Microsoft Excel template requires exactly the four values (failed, passed, not executed, and not covered) that the report query delivers.

To generate a report:

1. Click Reports on the workflow bar.
2. In the Reports tree, select the report you want to generate.
3. Select the Report tab.
4. Click the Select Report Template icon.
5. From the Select Report Template dialog box, select the template you wish to use.
6. Click OK to display the report.
7. (optional) If necessary, select an alternate view magnification for the report from the list box. 100% is the default magnification. Other options are 50%, 75%, 150%, and 200%.

Related Concepts: Report Generation. Related Procedures: Analyzing Test Results - Quick Start Task, Context-Sensitive Reports, Generating Reports, Managing Reports. Related Reference: Report tab.

618

Adding Subreports
To aggregate the results from multiple reports into the currently selected report, you can add subreports. When adding a report as a subreport, the result columns and rows of the subreport are concatenated to the results of the selected report.

To add a report as a subreport:

1. Click Reports on the workflow bar.
2. Select a report in the Reports tree.
3. On the Properties tab, click Add Subreport. The Add Subreport dialog box displays.
4. Select the subreport you want to have appended to the current report by selecting it from the Reports tree-list.
5. Click OK to complete the addition of the subreport. Subreports appear on the associated report's Properties tab in a section called Subreports.

Related Concepts: Report Generation. Related Procedures: Creating New Reports, Customizing BIRT Report Templates, Creating Reports, Generating Reports, Managing Reports. Related Reference: Report tab.

619

Deleting Subreports
To delete a subreport:

1. Click Reports on the workflow bar.
2. Select the report in the Reports tree that has the associated subreport that you want to delete.
3. On the Properties tab, click the Delete icon (in the Action column of the Subreports table) of the subreport you want to delete.
4. Click Yes on the confirmation dialog box to confirm the deletion.

Related Concepts: Report Generation. Related Procedures: Creating New Reports, Customizing BIRT Report Templates, Creating Reports, Generating Reports, Managing Reports. Related Reference: Report tab.

620

Displaying Charts
To display a chart:

1. Click Reports on the workflow bar.
2. Select a report in the Reports tree for which you want to view a chart.
3. Select the Chart tab to display the default chart.
4. To select a required chart type, click the Select Chart Type icon.
5. On the Select Chart Type dialog box, select a chart type.
6. Select the view properties that you want to apply to the chart (3D view, Show horizontal grid lines, Show vertical grid lines, and Show legend).
7. Specify how these chart options are to be saved:
   - Select For current user only to have these chart settings override the report's standard settings whenever the current user views this chart.
   - Select As report standard to have these chart settings presented to all users who don't have overriding user settings defined. This setting does not affect individual user settings.
8. Click OK to display the new chart type.

Note: The chart configurations you define here become the defaults for this report.
Note: When standard charts and graphs are not able to deliver the specific data that you require, or when they cannot display data in a required format, you can customize the appearance of queried data using the Test Manager reporting functionality.
Note: To open the current chart in a separate browser window, click the Open in new window icon at the top of the Chart tab.

Related Concepts: Report Generation. Related Procedures: Customizing BIRT Report Templates, Displaying Charts, Creating Reports, Generating Reports, Managing Reports. Related Reference: Report Chart tab.

621

Accessing MRU (Most Recently Used) Reports


To select a recently-viewed report:
1. Click Reports on the workflow bar.
2. Expand the Last Used Reports list box on the Reports toolbar.
3. Select the report that you want to view.

Related Concepts: Report Generation. Related Procedures: Managing Reports. Related Reference: Report tab, Reports Toolbar Functions.

622

Editing Report Parameters


To edit report parameters:

1. Click Reports on the workflow bar.
2. Select a report in the Reports tree.
3. Click the Parameters tab. If the report has parameters defined for it, the parameters will be listed there.
4. Click Edit Parameters. The Edit Parameters dialog box displays.
5. Edit the Label or Value of the listed parameters as required.
6. From the Usage field, select the usage type of the parameter (constant value, start time, end time).
7. Click OK to save your changes.

Related Concepts: Report Generation. Related Procedures: Creating New Reports, Creating Reports, Generating Reports, Managing Reports. Related Reference: Report Parameters tab.

623

Editing Report Properties


To edit the properties of a report:

1. Click Reports on the workflow bar.
2. Select the report in the Reports tree.
3. On the Properties tab, click Edit. The Edit Report dialog box displays.
4. Modify the Name and Description of the report as required.
5. Ensure that the Share this report with other users check box is checked if you intend to have this report shared with other users.
6. From the Default tab list box, select the tab that you want to be directed to when you select this report from one of the context-sensitive report lists.
7. Specify one of the following options to indicate how the report can be edited:
   - Simple report: You can modify the Selection criteria, thus changing the results of the selected report, or you can click Advanced Query to modify the SQL query code.
   - Advanced report: If you have familiarity with SQL, you may edit the query code in the Report data query field. To assist you in editing SQL queries, a list box of function placeholders (for example, variables) is available. To insert one of the available pre-defined functions, select the corresponding placeholder from the Insert placeholder list box.
     Note: If you manually edit the SQL code for the query, upon finishing, click Check SQL to confirm your work.
8. Click Finish to save your changes.

Related Concepts: Report Generation. Related Procedures: Creating New Reports, Customizing BIRT Report Templates, Creating Reports, Generating Reports, Managing Reports. Related Reference: Report Properties tab.

624

Printing Charts
To print the current chart
1 Click Reports on the workflow bar.
2 Select a report in the Reports tree.
3 Click the Chart tab.
4 Click Print at the top of the Chart tab. The chart data then displays in a new window in printable format, and your system's print dialog box is also displayed.
5 Configure print settings as necessary and click OK to print the chart.

Related Concepts Report Generation Related Procedures Displaying Charts Creating Reports Generating Reports Managing Reports Related Reference Report Chart tab

625

Removing Charts
Removing a chart only removes the currently selected chart template from the selected report; it does not remove the chart template entirely.

To remove the current chart's template

1 Click Reports on the workflow bar.
2 Select a report in the Reports tree.
3 Click the Chart tab.
4 Click the chart's Remove chart type button.
5 On the Remove Chart dialog box, do the following:

Select Remove user settings (and revert to report standard) to have the current user's chart settings deleted along with the chart. The chart will subsequently be displayed according to the report's standard settings. If no standard settings have been defined, the chart cannot be displayed. Note that this option is only available when the current user has defined specific chart settings.

Select Remove standard chart settings of report to have any standard settings deleted along with the chart. User-specific settings are not affected by this option. Note that this option is only available when standard chart settings have been defined for a report.

Click OK to delete the chart template. If required, you can click the <Click here to choose a chart type> link to assign a new chart template to the selected report.

Related Concepts Report Generation Related Procedures Displaying Charts Creating Reports Generating Reports Managing Reports Related Reference Report Chart tab

626

Working with Filters


This section explains how to work with custom filters in Test Manager. Filters enable you to quickly sort through test plan elements and execution definitions, highlighting only those elements that are relevant to your needs. You can create new custom filters, edit existing filters, or turn filtering off at the project level. Projects do not contain default filters. Filters can be accessed and edited from the Test Manager toolbar and through the Settings link on the Test Manager menu tree. In This Section Applying Filters Describes how to apply a custom filter to the selected tree. Creating Advanced Filters Describes how to create an advanced custom filter. Creating Filters How to create a new custom filter. Deleting Filters Describes how to delete an existing custom filter. Editing Filters Describes how to edit an existing custom filter.

627

Applying Filters
After you have created and stored a custom filter, you can apply that filter to the selected tree. Custom filters can be applied for requirements, test definitions, and execution definitions. Only elements that meet the applied filter criteria are displayed in the tree. Note: Filtered requirements are returned in read-only form and cannot be edited. The Edit Properties button is disabled for filtered requirements.

To apply a stored filter


1 Click the appropriate button (Execution, Requirements, or Test Plan) on the Workflow Bar.
2 Select the desired filter from the Filter list box on the toolbar. All elements that meet the filter's criteria will then be displayed.

Note: To remove filtering and display all elements, select <No Filter> from the Filter list box on the toolbar.

Related Concepts Filtering Related Procedures Working with Filters

628

Creating Advanced Filters


Advanced custom filters enable you to combine simple filters to create complex filters that apply multiple filter criteria simultaneously.

To create an advanced custom filter


1 Click the appropriate button (Execution, Requirements, or Test Plan) on the workflow bar.
2 Create a new custom filter.
3 After you have defined your first filtering rule, click Advanced to open the Edit Filter dialog box.
4 Enter a name for the filter in the Name field.
5 Give the filter a meaningful Description.
6 Click More to display a second set of filter-parameter fields with which you can define a second set of filter parameters.
7 Select a logical operator for the application of the filtering queries. For example, filtered elements must meet both sets of criteria (and), or filtered elements must meet one, but not both, of the criteria sets (or).
8 To delete a filter-parameter string, click the corresponding Delete button.
9 To display additional filter-parameter fields and create additional filter queries, click More. To remove excess filter-parameter sets, click Fewer.

Related Concepts Filtering Related Procedures Creating Filters Creating Global Filters Working with Filters

629

Creating Filters
To create a new custom filter:
1 Click the appropriate button (Execution, Requirements, or Test Plan) on the workflow bar.
2 Click New Filter on the toolbar to display the New Filter dialog box.
3 From the Property list box, select the property on which you wish to base the new filter (for example, Name, Description, Priority, Version, and Build).
4 From the Operator list box, select a logical operator to be applied to the specified property (for example, =, not, >, >=, <, <=, contains, and does not contain).
Note: The contents of the Operator and Value list boxes vary based on the attribute selected in the Property field.
5 In the Value field, enter the value that the specified property is to be compared against.
Note: For date-based properties, the Value field is replaced with a calendar tool that you can use to select a specific date.
6 Click Save and apply to open the Edit Filter dialog box. To apply the filter to the current view without saving the filter settings, click Apply.
7 On the Edit Filter dialog box, enter a name for the filter in the Name field.
8 Enter a meaningful description for the filter in the Filter field.
9 Click OK to save the filter with your project.

Related Concepts Filtering Related Procedures Creating Advanced Filters Creating Global Filters Working with Filters

630

Deleting Filters
To delete an existing custom filter
1 Click the appropriate button (Execution, Requirements, or Test Plan) on the workflow bar.
2 Select the filter from the list box on the toolbar.
3 Click Delete Filter.

Related Concepts Filtering Related Procedures Working with Filters Deleting Global Filters

631

Editing Filters
Existing filters are edited using the Edit Filter dialog box. The Edit Filter dialog box can be accessed both directly from the toolbar (by clicking Edit Filter) and by clicking the Settings link on the menu tree.

To edit an existing custom filter


1 Navigate to Test Manager > Settings.
2 Click the Filters tab.
3 Click the name of the filter you want to edit.
4 Edit the filter by changing the selections defined for the filter.

Note: To remove custom filtering, click the appropriate button (Execution, Requirements, or Test Plan) on the workflow bar and select <No Filter> from the Filter list box on the toolbar.

Related Concepts Filtering Related Procedures Working with Filters Editing Global Filters

632

Analyzing Code Coverage


This section explains how to perform code coverage analysis with Test Manager. In This Section Enabling Code Analysis for Execution Definitions How to enable code analysis for an execution definition. Generating Code-Change Impact Reports How to generate code-change impact reports. Viewing Code-Coverage Information for Packages How to view code-coverage information for packages. Enabling Code Analysis Within the Manual Testing Client How to enable code analysis for an execution definition from within the Manual Testing Client.

633

Enabling Code Analysis for Execution Definitions


To enable code analysis for an execution definition:
1 Click Execution on the workflow bar.
2 Select an execution definition from the navigation tree.
3 Select the Deployment tab.
4 Click Edit in the Code Analysis Settings section of the tab. The Edit Code Analysis Settings dialog box displays.
5 Check the Enable code analysis check box.
6 In the Hostnames text box, enter a comma-separated list of hostnames (with port, if the default port 19129 is not used) where code analysis information is to be gathered (for example, labmachine1, labmachine2:8000, 198.68.0.1). For each execution definition, you need to define the host names of the machine resources where the AUT is running. For example, with a client/server system, you must not only gather code coverage information on the client (which probably runs directly on an execution server), you must also gather data from the server (which likely runs on a different machine). This applies to all multi-tiered applications.
Note: For JUnit code analysis runs, you do not need to specify a hostname.

Click OK to save your settings.

Note: Once code analysis has been defined for an execution definition, each future run of that execution definition will gather code coverage information from the defined hostnames. While monitoring an execution from Test Manager's Activities page, you will see that after gathering the sources for test definitions, Test Manager gathers full code coverage information before beginning test runs. The Code Coverage Controller, which is integrated into each Test Manager execution server, controls all defined hosts during execution runs. For each test definition of an execution definition, the controller starts and stops all associated instances, collects XML-based code coverage files for the test definition, and merges the results into a single file. The test definition then saves the merged code coverage file to its execution results.

Related Concepts Test Manager Code Analysis Code Coverage Analysis Related Procedures Analyzing Test Results - Quick Start Task Analyzing Code Coverage Related Reference Execution Deployment tab Code Analysis Unit Interface

634

Generating Code-Change Impact Reports


To generate a code-change impact report
1 Click Projects on the workflow bar.
2 Select the project for which you want to analyze code-coverage data.
3 Click Code Analysis on the workflow bar.
4 Click Create Code Change Impact Report on the main toolbar. The Select Classes for Report dialog box displays. Select a Product and Version if you want to change the pre-selected values.
5 In the Filter field, enter criteria to filter the packages. For example, entering the string published will only list packages that contain the string published in their names.
6 Select a package from the Packages pick list. You can select multiple packages by holding down the CTRL key while clicking listed packages. The classes that are available in the selected package appear in the Classes pick list.
7 Select a class file that you want to have included as a source in your report. You can select multiple classes by holding down the CTRL key while clicking listed classes.
8 Click Add to add the class file(s) to the Selected classes pick list. You can remove classes from the Selected classes pick list by selecting entries and clicking Remove. Click Remove All to remove all selected classes from the Selected classes pick list.
9 Repeat the preceding steps, from Select a package from the Packages pick list through Click Add to add the class file(s) to the Selected classes pick list, until you have added all required classes to the Selected classes list.
10 Select a report from the Select report list box.

Related Concepts Code-Change Impact Reports Report Generation Test Manager Code Analysis Code Coverage Analysis Related Procedures Analyzing Test Results - Quick Start Task Enabling Code Analysis for Execution Definitions Analyzing Code Coverage Related Reference Code Analysis Unit Interface

635

Viewing Code-Coverage Information for Packages


To view code-coverage information for a package:
1 Click Projects on the workflow bar.
2 Select the project for which you want to view code-coverage information.
3 Click Code Analysis to go to the Code Analysis unit.
4 Expand the project node in the navigation tree to display the products that are available for the selected project.
5 Expand a product node to display the versions that are available for that product.
6 Expand a version node to display the builds that are available for that version.
7 Select a specific build. Code coverage information for that build then displays on the Details tab.

Note: To view code-analysis information for all products, including products that were not created for the selected project, click Show all products on the main toolbar. Products of other projects are then listed under the Other Projects node.

Related Concepts Latest Builds and Build Versions Test Manager Code Analysis Code Coverage Analysis Related Procedures Analyzing Test Results - Quick Start Task Enabling Code Analysis for Execution Definitions Analyzing Code Coverage Related Reference Code Analysis Details tab

636

Enabling Code Analysis Within the Manual Testing Client


To enable code analysis for an execution definition from within the Manual Testing Client
1 From within the Manual Testing Client, navigate to Edit > Edit Code Analysis Settings.
2 On the Edit Code Analysis Settings dialog box, proceed with enabling code analysis for the execution definition.

Note: After code analysis is enabled, you can execute your test definitions in the Manual Testing Client. However, you need to click Code Analysis: Start on the Execute Test dialog box before you actually start testing. This way Test Manager will collect code analysis information while you execute the manual test. When you are done testing, click Stop to halt the collection of code analysis information.

Related Concepts Test Manager Code Analysis Manual Test Definitions Code Coverage Analysis Related Procedures Analyzing Test Results - Quick Start Task Enabling Code Analysis for Execution Definitions Executing Manual Tests Analyzing Code Coverage Related Reference Execution Deployment tab Code Analysis Unit Interface

637

638

639

Reference
This section contains all of the reference topics provided with SilkCentral Test Manager. In This Section User Interface Reference This section contains information about Test Manager's user interface elements. General Reference This section contains general reference topics provided with SilkCentral Test Manager. APIs Refer to the Test Manager API Help for full details regarding Test Manager's APIs. Database Schemas Refer to the Test Manager Database Model for full details regarding Test Manager's database schema.

640

User Interface Reference


This section contains information about Test Manager's user interface elements. In This Section Projects Unit Interface This section contains information about the user interface elements in Test Manager's Projects unit. Settings Unit Interface This section contains information about the user interface elements in Test Manager's Settings unit. Requirements Unit Interface This section contains information about the user interface elements in Test Manager's Requirements unit. Test Plan Unit Interface This section contains information about the user interface elements in Test Manager's Test Plan unit. Execution Unit Interface This section contains information about the user interface elements in Test Manager's Execution unit. Code Analysis Unit Interface This section contains information about the user interface elements in Test Manager's Code Analysis unit. Issues Unit Interface This section contains information about the user interface elements in Test Manager's Issues unit. Reports Unit Interface This section contains information about the user interface elements in Test Manager's Reports unit.

641

Projects Unit Interface


This section contains information about the user interface elements in Test Manager's Projects unit. In This Section Projects tab The Projects tab lists all the projects associated with yourTest Manager installation and vital statistics for each project. Overview tab The Overview tab displays the Project Overview Report, which offers a high-level overview of the status of the selected project. Activities Page The Activities page offers a centralized location from which you can manage upcoming, current, and recently executed test runs on a per-project basis. Cross-Project Activities Page The Cross-Project Activities page allows a user with SuperUser privileges to see all execution related activities across projects. Test Definition Run Results Dialog The Test Definition Run Results dialog lists run details of a test definition.

642

Projects tab
Test Manager Projects Projects
The Projects tab lists all the projects associated with your Test Manager installation and vital statistics for each project, including Description and Created By/Created On details. The Projects tab enables you to select the projects that you want to use. Note: Only your system administrator has rights to set up new projects.

Project: Name of the project.
Description: Project description.
Created On: Date the project was created.
Created By: User who created the project.

Related Concepts Managing Projects Related Procedures Project Management

643

Overview tab
Test Manager Projects Overview
The Overview tab displays the Project Overview Report, which offers a high-level overview of the status of the selected project. The Project Overview Report shows the following information:

General Report Information: General information like the name of the current project, the report description, and the planned release date.

Requirements: Test coverage status for the Requirements unit. Shows the percentage of test coverage in tabular and graph format for the following: Total (all requirements) and High priority (requirements with high priority).

Test Plan: Test-type distribution and test-execution results in chart and tabular form for the Test Plan unit.

Issues: The following numbers are shown in tabular and graph format for all issues that are linked to test definitions in the test plan tree:
Find: Number of found issues. The number of found issues in a period is the number of all fixed, all opened, and all deferred issues.
Fixed: Number of issues that are fixed. An issue is counted as fixed in a period each time its status changes from "open" or "deferred" to "fixed", "verified", or "closed". Issues that are defined as "no longer an issue" are not counted as fixed. If an issue is fixed, opened again, and fixed again in the same period, it is counted twice as fixed.
Deferred: Number of issues that are defined not to be fixed in the current release, but in a future release.
Open Backlog: All issues that are currently open. If an issue is opened, fixed, and opened again in the same period, it is counted twice as opened.

Related Concepts Project Overview Report Code Coverage Analysis Report Generation Related Procedures Analyzing Code Coverage Managing Reports

644

Activities Page
Test Manager Projects Activities The Activities page offers a centralized location from which you can manage upcoming, current, and recently executed test runs on a per-project basis. The grid views on the Activities page offer view settings (resizing and reordering of columns), filtering, sorting, and grouping options that are configurable on a per-user basis. You can display/hide columns, adjust the width of columns, and move columns around using drag-and-drop. The Activities page is split into three sections: Next Executions, Current Executions, and Last Executions. The grid views can be resized by dragging and dropping the separators between the views. Context-sensitive menu commands are available for each test run. These commands enable you to link directly to listed execution definitions, continue manual tests, manage test-run results, and more. The Activities page makes it easier to identify match points between execution definitions and to find specific execution-definition information. Standard Windows keyboard shortcuts can be used to select test run entries, making it easy to select and manipulate specific sets of execution definitions and test results. Sorting, grouping, and filtering functions are available through context-menu commands to help you better organize and group test runs. All of your view-customization preferences are saved along with your project and will be available to you each time you visit the Activities page. Note: Data on the Activities page is not automatically refreshed. Click Reload near the paging buttons at the bottom of each view to refresh the entire page's contents. Note: You can use your keyboard's CTRL and SHIFT keys to select multiple queued executions and abort them all with one click.
Item Description

Next Executions: To enhance performance when you have numerous execution definitions, only the upcoming 50 execution definitions that are scheduled to run are displayed in the Next Executions view (future execution definitions can, however, be accessed using the available filtering features). To edit an execution definition listed in the Next Executions section, right-click the execution definition and choose Go to Execution Definition or click on the name of the execution definition; this takes you to the Executions unit where you can view and edit the details of the execution definition. By default, all execution definitions are sorted by Start Time. Columns in the Next Executions view cannot be sorted or grouped. Next Executions view can be collapsed/expanded by clicking the double-arrow button on the right-hand side of the view's title bar.

ID: ID of the scheduled execution definition. This column is hidden by default.
Execution Definition/Folder: Name of the scheduled execution definition or folder.
Keywords: Keywords that are assigned to the scheduled execution definition.
Manual Testers (manual tests only): The user names of the people who are assigned to perform the manual test. This field is blank when no manual testers are assigned to the test.
Priority: The priority that has been assigned to the execution definition.
Start Time: Scheduled start time of the test run.

Current Executions: The Current Executions view lists the execution definitions that are currently running (both automated and manual test runs). To abort an execution definition that is currently in progress, click Abort in the Actions column of the execution definition. Right-click an execution definition and choose Go to Execution Definition or click on the name of the execution definition to view or edit the execution definition. Right-click an automated execution definition and choose View Details (or click the execution definition's Run ID/Task ID link) to view the execution's progress. As long as a manual test remains open, the corresponding execution definition remains in the list of Current Executions with a status of Pending. Right-click a manual test execution definition and choose Continue Manual Test (or click the execution definition's Continue Manual Test button in the Actions column) to continue a manual test in Manual Test Execution view. Right-click a manual execution definition and choose View Details (or click the execution definition's Run ID/Task ID link) to go to the Results for Execution Definition page for that execution definition. From there, you can click the name of a manual test definition in the Assigned Test Definitions portion of the dialog box to open the Results dialog box for the manual test definition; detailed results of the manual test definition are displayed there. Back on the Results for Execution Definition page, click Manual Test Results to go to Manual Test Execution view, where read-only information about the status of the assigned manual test definition is available. Page views of current executions are broken into views of 20 execution definitions each. You can advance through pages using First, Last, Next, and Previous located in the lower part of the Current Executions view. Or you can enter a page number into the Page text box and hit the ENTER key.

ID: ID of the current execution definition. This column is hidden by default.
Execution Definition: Name of the current execution definition.
Run ID/Task ID: Manual-test executions receive a run ID when they are executed. Upon test completion, this run ID carries over to Last Executions view. Automated-test executions receive a task ID when they are executed. Task IDs are, however, not carried over to Last Executions view. Completed automated tests receive a run ID in Last Executions view.
Status: Status of the active execution definition or manual test. For automated tests, status is indicated with a text-based value (Pending or Active). For manual tests, status is indicated with a colored histogram. Automated-test statuses are described textually and can be filtered. Manual tests can be filtered by checking relevant properties on the Filter submenu (Pending manual execution, Pending manual setup execution, and Pending manual cleanup execution).
Keywords: Keywords that are assigned to the current execution definition.
Executed By (manual tests only): The users who are assigned to perform the manual test. This field is blank when no manual testers are assigned to the test.
Priority: Priority that has been assigned to the current execution definition or manual test (Low, Normal, or High).

Start Time: Time when the execution definition or manual test was executed.
Time Left: Amount of time remaining until the test is complete. For manual tests that do not have an estimated time, this column has a value of unknown.
Start Type: Shows how the test run was started. Manually, through a Web Service, or from a schedule.
Starter Name: Name of the schedule, tester, or Web Service user.
Start Scope: The scope specified in the Run dialog box.
Actions: Actions that you can perform on the execution definition.
Abort: Click to cancel the current execution. Alternatively, use the DELETE key on your keyboard to abort test runs. When you abort executions, these executions are grayed out until the background process completes the deletion.
Continue Manual Test: Click to go to Manual Test view and execute the test.
Manual Testing Client: Click to open the Manual Testing Client. This button is available to users that are assigned as testers to the selected test.

Last Executions: The Last Executions view lists all past execution definition runs, except deleted runs, for which results were collected from the execution server. You can filter the listed execution definition runs, for example by the start time. To view or edit an execution definition, right-click the execution definition and choose Go to Execution Definition, or click on the name of the execution definition. Right-click an execution definition run and choose View Details, or click the execution definition's Run ID link, to display the run's Results for Execution Definition page. This page shows details for the selected execution definition run and includes any files and messages, for example LiveLink VMware configuration captures, that were generated during the execution. Click on the Run ID of a test definition in the Assigned Test Definitions portion of the Results for Execution Definition page to access the test definition's Results dialog box. To compare two execution definition runs, use your keyboard's CTRL and SHIFT keys to select the two runs. Right-click your selection and click Reports > Execution Definition Run Comparison .... For execution definitions that are deployed to virtual servers: To open VMware Lab Manager and restore a captured LiveLink configuration, expand the Messages link on an execution definition run's Results for Execution Definition page and select LiveLink. To delete a test run, right-click a run entry and choose Delete Results (or click the run's Delete button in the Actions column). Test-result page views are broken into views of 20 test results each. You can advance through pages using First, Last, Next, and Previous at the bottom of the Last Executions view. Or you can enter a page number into the Page text box and hit the ENTER key. Last Executions view can be collapsed/expanded by clicking the double-arrow button on the right-hand side of the view's title bar.

ID: ID number assigned to the executed execution definition. Note that unassigned test definitions have an ID value of N/A. This column is hidden by default.
Execution Definition: Name of the executed execution definition. Click on the name to view or edit the execution definition.
Run ID: ID assigned to the test run. Click the link to view details of the test run.
Status: Result status of the test run (Passed, Failed, or Not Executed). Note that filtering, sorting, and grouping are not available for the Status column in Last Executions view.
Keywords: Keywords that were assigned to the execution definition at execution time.
Executed By: The information given in this column corresponds to the type of the test. For a manual test, the names of the testers who executed the test; for an automated test, the name of the execution server that ran the test. Note that sorting and grouping are not available on the Executed By column in the Last Executions view; this view is sorted by Start Time.
Start Time: When the test run began.
Duration: The duration of the test run.
Product: The product under test. This column is hidden by default.
Version: The version of the product under test. This column is hidden by default.
Build: The build number of the product under test.
Actions: Actions that you can perform on the execution definition.
Delete: Click to delete the test run results. When you delete executions, these executions are grayed out until the background process completes the deletion. Alternatively, use the DELETE key on your keyboard to delete test definitions.
View Manual Test Results: Click to view the Current Run page in read-only mode; or right-click a manual execution definition run and click View Manual Test Results.
Start Type: Shows how the test run was started. Manually, through a Web Service, or from a schedule.
Starter Name: Name of the schedule, tester, or Web Service user.
Start Scope: The scope specified in the Run dialog box.

648

Related Concepts Project Management Test Definition Execution VMware Lab Manager Virtual Configurations Execution Definition Run Comparison Reports Related Procedures Managing Activities Managing Projects Executing Test Definitions Assigning Keywords to Execution Definitions Related Reference Cross-Project Activities Page Test Definition Run Results Dialog Execution Definition Run Results Dialog

649

Cross-Project Activities Page


Test Manager Projects X-Project Activities The Cross-Project Activities (X-Project Activities) page allows a user with SuperUser privileges to see all execution-related activities across projects. It provides all the options that the Activities page offers, plus additional options that span projects, giving an overview of the execution queue. The SuperUser can remove executions from the queue to resolve bottlenecks. The X-Project Activities tab is visible to the SuperUser only. The tab is split into the same three sections as the Activities tab: Next Executions, Current Executions, and Last Executions. An additional column, named Project ID, exists in all three sections.
Item Description

Project ID The ID of the project the execution definition belongs to. Related Concepts Project Management Test Definition Execution VMware Lab Manager Virtual Configurations Related Procedures Managing Activities Managing Projects Executing Test Definitions Assigning Keywords to Execution Definitions Related Reference Activities Page Test Definition Run Results Dialog

650

Test Definition Run Results Dialog


Test Manager Execution Runs The Test Definition Run Results dialog lists run details of a test definition. The Test Definition Run Results dialog can be accessed from the following locations within Test Manager:

Test Manager > Test Plan > Runs > Run ID
Test Manager > Execution > Runs > <assigned test definition>
Test Manager > Projects > Activities > <execution definition in Last Executions> > <assigned test definition>

Tab Description

Details: Shows the details of the test definition run, including its Duration, Execution Path, the Execution Definition Run ID of the execution definition run that included the test definition run, and any Warnings/Errors. This tab also allows you to change the status of the test definition run. This option is useful if you need to manually overrule the status of a test run. Check the Hide Passed check box below the Assigned Test Definitions in the Execution Definition Run Results dialog to show all test definitions. The default setting shows only the test definitions that did not pass; this enhances performance, because only a part of the total test definitions has to be displayed, and the information presented is of more use to the viewer. All parent nodes are displayed with the full status information. When a manual status change is performed, the details of the change are reflected in this tab's Status, Status Changed On, Status Changed By, Previous Status, and Status Change Comment fields.

Specific: Only displayed for SilkTest, SilkPerformer, and manual test definitions. This tab includes details that are specific to the selected test definition type. For example, when a SilkTest test definition is selected, this view includes the selected test case, test data, and any warnings that were displayed during the test run.

Files: Lists all files that were generated by this test run, along with file sizes. The names of SilkTest .rex files act as download links. Once downloaded, these files can be viewed directly in a text editor. The upper table lists files that are associated with the test definition, such as result files or manually uploaded files for manual test definitions. The lower table lists files that are associated with the execution definition, for example execution log files or code analysis results. This tab also contains a button to download all result files: Download All Files downloads all result files generated by the test definition run as a zipped package.

Messages: Lists all messages that were generated by this test run, along with the severity of the messages. Messages that are associated with an execution definition as a whole, and not with one of the individual test definitions, can be viewed in the Projects unit (Activities tab/Messages tab).

Success Conditions: Only displayed for automated test definitions. This tab shows all the success conditions that were defined for the test during the test planning process (Test Plan unit, Properties tab) and the result values from the execution run. Success conditions are used to determine if a test is successful or if it has failed.

Data Driven: Only displayed for data-driven test definitions using the option of having a single test definition for all data rows of the data set. This tab lists the status of each instance (data row) run of the test definition. Clicking an instance brings up another instance of the Test Definition Run Results dialog with run details of the selected instance.

Attributes: Any attributes that have been configured for the test definition.

Parameters: Any parameters that have been configured for the test definition.

The following table lists the UI elements that are used to step through the test definition results of an execution run. These elements are only visible when accessing the Test Definition Run Results dialog from an execution definition.
Item Description

Skip Passed: Used to determine which test definition run results should be displayed when browsing using the Previous Result and Next Result buttons. Checking this option only displays test definitions with a status other than Passed.
< Previous Result: Jumps to the result details of the previous test definition in the selected execution definition run.
Next Result >: Jumps to the result details of the next test definition in the selected execution definition run.

Related Concepts Execution Dependency Configuration Related Procedures Viewing Test Execution Details Configuring Execution Dependencies Related Reference Activities Page Execution Runs Tab Test Plan Runs tab Execution Definition Run Results Dialog

652

Settings Unit Interface


This section contains information about the user interface elements in Test Manager's Settings unit. In This Section Project Settings tab The Project Settings tab lists high-level details about the active project. Filters tab The Filters tab lists the filters that are available to the active project. Attributes tab The Attributes tab lists the attributes that have been created for the current project. Requirement Properties Page The Requirement Properties page lists the custom requirement properties that are available to the active project. Step Properties Page The Step Properties page lists all custom properties that can be populated into manual test steps across the active project. Notifications Page The Notifications page lists the notification event types that have been configured for the active project. Integrations Configuration tab The Integrations Configuration tab lists the requirements-management integrations that have been configured for the current project. Data Sources Configuration Page Use this page to configure data sources for data-driven tests in SilkCentral Test Manager. Issue Tracking Profiles Page Use this page to configure profiles for the integration of external issue tracking systems into SilkCentral Test Manager. Source Control Profiles Page Use this page to configure profiles to integrate external source control systems with SilkCentral Test Manager.

653

Project Settings tab


Test Manager Settings Project Settings The Project Settings tab lists high-level details about the active project.
Item Description

Build information file name: Build information files contain project information, including build number, build log location, error log location, and build location. Enter the name of the active project's build information file in this field. All test executions will read the build information from this specified file.
Project release date: Scheduled release date of the active project (MM/DD/YYYY).
File extensions to ignore in results: Result file types or other file types that should not be saved as results for test executions.

654

Filters tab
Test Manager Settings Filters The Filters tab lists the filters that are available to the active project.
Item Description

Name: Name of the filter.
Type: Filter category (requirement, test plan, or execution).
Created On: When the filter was created.
Created By: User who created the filter.
Changed On: When the filter was most recently modified.
Changed By: User who most recently modified the filter.
Actions: Actions that can be performed on the filter (Delete).

Related Concepts Global Filters Related Procedures Configuring Global Filters Settings Configuration

655

Attributes tab
Test Manager Settings Attributes The Attributes tab lists the attributes that have been created for the current project.
Item Description

Name: Name of the attribute. This name is displayed in the following list boxes:
Filters: Attributes can be used in global filters for filtering by test definition attributes (see Global Filters).
Test Plan unit: Attributes can be applied to test definitions (see Understanding Test Definition Attributes).
Type: Attribute type. The following attribute types are available:
Edit: An attribute of type edit is an alphanumeric field, in which you can enter any string (e.g., Comment).
Normal: Attributes of type normal require you to define a list of values. When applying an attribute of type normal to a test definition, you can only select one value from the list (e.g., Priority, with defined values high, medium, and low).
Set: Attributes of type set require you to define a list of values. When applying an attribute of type set to a test definition, you can select zero, one, or multiple values from the list (e.g., Test Scenario, with defined values load test, regression test, smoke test).
Status: Status of the attribute (Active or Inactive).
Column: The column name of the attribute in the LQM Reporting table. Use this column name to query the selected attribute within the LQM Reporting table. See the database model documentation for detailed information.
Created On: When the attribute was created.
Created By: User who created the attribute.
Changed On: When the attribute was last modified.
Changed By: User who most recently modified the attribute.
Actions: Available actions that can be performed on the attribute (Delete).

Related Concepts Attributes Related Procedures Configuring Custom Attributes
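The Column value can be referenced when writing advanced report queries against the LQM Reporting table. The following is a minimal, purely illustrative sketch; the table name LQM_TestDefinitions and the column name WF_CUSTOM_PRIORITY are hypothetical placeholders, so check the database model documentation for the actual table and generated column names in your installation:

    -- Hypothetical table and column names; see the database model documentation.
    SELECT Name, WF_CUSTOM_PRIORITY
    FROM LQM_TestDefinitions
    WHERE WF_CUSTOM_PRIORITY = 'high'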

656

Requirement Properties Page


Test Manager Settings Requirement Properties The Requirement Properties page lists the custom requirement properties that are available to the active project.
Item Description

Name: Name of the custom requirement property.
Type: Property type. The following types are available: String, Integer, Boolean, and Date.
Status: Status of the property (Active or Inactive).
Created On: When the property was created.
Created By: User who created the property.
Changed On: When the property was last modified.
Changed By: User who last modified the property.
Actions: Available actions that can be performed on the property (Delete).

Related Concepts Custom Requirement Properties External Requirements Management Tools Related Procedures Customizing Requirement Properties Integrating External RM Tools Requirements Management

657

Step Properties Page


Test Manager Settings Step Properties The Step Properties page lists all custom properties that can be populated into manual test steps across the active project.
Item Description

Name Name of the custom step property. Actions The actions that can be performed on the property are Delete, Move Up, and Move Down. Related Concepts Custom Step Properties Manual Test Definitions Related Procedures Configuring Custom Step Properties Executing Manual Tests

658

Notifications Page
Test Manager Settings Notification The Notifications page lists the notification event types that have been configured for the active project.
Item Description

Notification Events Name of notification event that has been set up for the active project. Status Status of the notification event (Active or Inactive). When a notification event is activated, a notification email is sent to the user that activated the event, the first time one of the specified settings is changed. See Change-Notification Emails for a list of the changes that trigger a notification email. Note: The user must have specified an email address to be able to receive email notifications. If you want to specify an email address for a user, refer to the SilkCentral Administration Module documentation for help. Related Concepts Change Notification Change-Notification Emails Related Procedures Configuring Change Notification

659

Integrations Configuration tab


Test Manager Settings Integrations Configuration The Integrations Configuration tab lists the requirements-management integrations that have been configured for the current project.
Item Description

Borland CaliberRM Integration

This section lists details related to the integration of the Borland CaliberRM requirements management system. Note that if integration has not been enabled, you will only see the Status property. Status Status of integration (Enabled or Disabled). Hostname Machine where the external server is installed Username Credential for the requirements management server. Password Credential for the requirements management server. Project External project with which the Test Manager project is integrated. Requirement Types Requirement types within the project that are integrated. Create Requirements Indicates whether or not the Enable creation of unassigned requirements option is active. Enables creation and editing of unmapped requirements in Test Manager projects that are configured for integration with CaliberRM. Upload Requirements Indicates whether or not the Enable upload of requirements to Borland CaliberRM option is active. Enables the upload of unmapped/unassigned requirements from Test Manager to CaliberRM. This allows you to upload additional previously unmapped requirement trees to CaliberRM and then have those requirements mapped within Test Manager. When this option is enabled, the Map Requirement button becomes enabled (Requirements Properties), enabling configuration of top level requirements for external requirement types, which is required when uploading unmapped requirements. Property Mappings Lists any external/internal property mappings that have been defined between the internal and external requirements management systems. IBM Rational RequisitePro Integration This section lists details related to the integration of the IBM Rational RequisitePro requirements management system. Note that if integration has not been enabled, you will only see the Status property. Status Status of integration (Enabled or Disabled). UNC Project Path Machine where the external server is installed UNC Username Credential for the requirements management server. UNC Password Credential for the requirements management server. User name Credential for the requirements management server. Password Credential for the requirements management server. Packages The requirement packages from the external project that are integrated with the Test Manager project Requirement Types Requirement types within the packages that are integrated. Create Requirements Indicates whether or not the Enable creation of unassigned requirements option is active. Enables creation and editing of unmapped requirements in Test Manager projects that are configured for integration with Rational RequisitePro. Upload Requirements Indicates whether or not the Enable upload of requirements to RequisitePro option is active. Enables the upload of unmapped/

660

Property Mappings

Telelogic DOORS Integration

Status RM Service URL

Username Password DOORS Installation Path Project Name Requirement Types Schedule Create Requirements

Upload Requirements

Property Mappings

unassigned requirements from Test Manager to RequisitePro. This allows you to upload additional previously unmapped requirement trees to RequisitePro and then have those requirements mapped within Test Manager. When this option is enabled, the Map Requirement button becomes enabled (Requirements Properties), enabling configuration of top level requirements for external requirement types, which is required when uploading unmapped requirements. Lists any external/internal property mappings that have been defined between the internal and external requirements management systems. This section lists details related to the integration of the Telelogic DOORS requirements management system. Note that if integration has not been enabled, you will only see the Status property. Status of integration (Enabled or Disabled). The URL of Test Manager's Telelogic DOORS requirement Web Service. The default value should point to the correct location already. Credential for the requirements management server. Credential for the requirements management server. Client installation path within the front-end server directory structure. External project with which the Test Manager project is synchronized. Requirement types within the project that are synchronized. Any defined synchronization schedule. Indicates whether or not the Enable creation of unassigned requirements option is active. Enables creation and editing of unmapped requirements in Test Manager projects that are configured for integration with DOORS. Indicates whether or not the Enable upload of requirements to Telelogic DOORS option is active. Enables the upload of unmapped/unassigned requirements from Test Manager to DOORS. This allows you to upload additional previously unmapped requirement trees to DOORS and then have those requirements mapped within Test Manager. When this option is enabled, the Map Requirement button becomes enabled (Requirements Properties), enabling configuration of top level requirements for external requirement types, which is required when uploading unmapped requirements. Lists any external/internal property mappings that have been defined between the internal and external requirements management systems.

Related Concepts Requirements Integration Configuration Related Procedures Integrating External RM Tools

661

Data Sources Configuration Page


Test Manager Settings Data Sources Use this page to configure data sources for data-driven tests in SilkCentral Test Manager.
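For a CSV data source, the uploaded file is typically a plain-text table whose first row contains the column names that data-driven test definitions reference as parameters. The following is a purely illustrative sketch; the column names and values are hypothetical examples, not names required by Test Manager:

    Username,Password,ExpectedResult
    admin,admin123,success
    guest,guest,denied
    unknown,wrongpw,error

Each subsequent row then corresponds to one instance (data row) of a data-driven test definition.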
Item Description

Name Type Created On Created By Changed On Changed By Actions

The name of the data source as it displays in the SilkCentral GUI and in reports. Click the name of a data source to modify the data source settings. Data source type (CSV, JDBC, MS Excel). Date when the data source was created. User who created the data source. Date when the data source was last modified. User who last modified the data source. This column contains action icons which allow the user to perform the following actions on a data source: Delete

Deletes the data source permanently. Deletion is not allowed if a data source is already associated with test definitions. Download Downloads the data source to your local computer. Upload Replaces the currently uploaded data source with the newly uploaded data source. Synchronize Updates all test definitions which are associated with the data source with the latest data. New Data Source Click this button to create a new data source. Related Concepts Data Sources for Data-Driven Tests Related Procedures Configuring Data Sources for Data-Driven Tests

662

Issue Tracking Profiles Page


Test Manager Settings Issue Tracking Use this page to configure profiles for the integration of external issue tracking systems into SilkCentral Test Manager.
Item Description

Name: The name of the profile as it displays in the SilkCentral GUI and in reports. Click the name to edit a profile.
Type: The external issue tracking system. See the related Issue Tracking Profiles topic for detailed information on available issue tracking system integrations.
Login: The login name with which SilkCentral connects to the issue tracking system.
Repository Info: Physical location of the issue tracking system (hostname or URL).
Created On: Date when the issue tracking profile was created.
Created By: The user who created the issue tracking profile.
Actions: This column contains action icons which allow the user to perform the following actions on an issue tracking profile:
Edit Mapping: Edit the mapping of issue states between Test Manager and the external issue tracking system.
Delete: Deletes the issue tracking profile permanently. Profiles cannot be deleted if external issues have been entered for test definitions.
New Profile: Click this button to create a new issue tracking profile.

Related Concepts Issue Tracking Profiles Related Procedures Configuring Issue Tracking Profiles Managing SilkCentral Issue Manager Issue Tracking Profiles Managing Borland StarTeam Issue Tracking Profiles Managing IBM Rational ClearQuest Issue Tracking Profiles Managing Bugzilla Issue Tracking Profiles Mapping Issue States

663

Source Control Profiles Page


Test Manager Settings Source Control Use this page to configure profiles to integrate external source control systems with SilkCentral Test Manager.
Item Description

Name: The name of the profile as it displays in the SilkCentral GUI and in reports. Click the name to edit a profile.
Type: The external source control system. See the related Source Control Profiles topic for detailed information on available source control system integrations.
Working Folder: Local or mapped working folder to which temporary sources are checked out.
Created On: Date when the source control profile was created.
Created By: The user who created the source control profile.
Changed On: Date when the source control profile was last modified.
Changed By: The user who last modified the source control profile.
Actions: This column contains action icons which allow the user to perform the following actions on a source control profile:
Delete: Deletes a source control profile permanently.
New Profile: Click this button to create a new source control profile.

Related Concepts Source Control Profiles Related Procedures Configuring Source Control Profiles

664

Requirements Unit Interface


This section contains information about the user interface elements in Test Manager's Requirements unit. In This Section Requirements Document View Document View displays the status of all tests that have been assigned to the active project in a heat field chart. Requirements Toolbar Functions The Requirements toolbar provides important commands for managing your requirements. Requirement Properties tab The Properties tab offers high-level information about the selected requirement or project. Requirement Attachments tab Attached files and links for selected requirements are listed on the Attachments tab. Assigned Test Definitions tab The Assigned Test Definitions tab lists all test definitions that have been assigned to the selected requirement. Requirement Coverage tab The Coverage tab shows the status of all tests that have been assigned to the requirement. Requirement History tab The History tab details the revision history of the selected requirement or project.

665

Requirements Document View


Test Manager Requirements Click Document View to access Document View. Use Document View to display the status of all tests that have been assigned to the active project in a heat field chart. Document View shows the number and percentage of Passed, Failed, Not Executed, and Not Covered tests for the active project. This coverage status information is displayed in a heat field chart, with green indicating passed tests; red indicating failed tests; brown indicating tests that have not yet been executed; and gray indicating tests of other status. Requirements that are not covered by test definitions are listed as Not covered. Note: Test definition totals accumulate to the parent level (for example, requirement totals include test definitions from child requirements; project totals include test definitions from all requirements). Related Concepts Requirements Management Requirement Coverage tab Full Coverage and Direct Coverage Modes Requirements Reports Related Procedures Assigning Test Definitions to Requirements Manually Switching Between Full and Direct Coverage Modes

666

Requirements Toolbar Functions


Test Manager Requirements The Requirements toolbar provides important commands for managing your requirements. Note: These commands are also available through context menus in the Requirements tree.
Item Description

Document View, Requirements View: Toggles between Document View, which lets you view select properties of all requirements in a single view, and Requirements View, which enables you to drill deeply into the properties of a single requirement.
New Requirement: Enables you to add a new requirement to the active project.
New Child Requirement: Enables you to add a new requirement to the active project that will be a child of the selected requirement.
Edit: Enables you to open the selected requirement for editing.
Delete: Identifies the selected requirement in the tree as obsolete. The requirement will subsequently be displayed in italics if you right-click the project node and select the Show Obsolete Requirements command. Check the Destroy permanently check box on the Delete Requirement dialog box to permanently delete a requirement from the system.
Cut, Copy, Paste: Cut, copy, and paste of requirement elements between the Requirements Tree and the clipboard.
Paste as Child: Pastes a copy of the requirement held on the clipboard to the child level beneath the currently selected requirement.
Move Up, Move Down: Move requirements up or down within the Requirements tree view.
Find/Replace: Find enables you to search through all requirements in the active project based on configurable parameters. Replace enables you to optionally replace instances of found values with a new value.
Filtering commands: Requirements view filtering options.
Show changes/Acknowledge changes: Show recent changes and acknowledge changes to the Requirements tree.
Show Direct/Full Coverage: Toggles between full and direct coverage modes. Full coverage mode offers a cumulative view of test-definition-to-requirement coverage that considers the status of all child requirements of parent requirements. In direct coverage mode, requirement status is calculated only by considering the test definitions that are assigned directly to requirements.

Related Concepts Requirements Unit Interface Requirements Management Related Procedures Managing Requirements


Requirement Properties tab


Test Manager > Requirements > Properties

The Properties tab offers high-level information about the selected requirement or project. When requirements are synchronized with an external requirements management system, items are sometimes marked with an exclamation mark (!). This means that the marked field is not mapped to the external RMS. Item values that are marked with an asterisk (*) are values that are inherited from the parent requirement.

Requirement Name: Name of the requirement.
Requirement ID: Identifier of the requirement.
Description: Meaningful description of the requirement.
Priority: Priority that has been configured for this requirement.
Risk: Risk that has been configured for this requirement.
Reviewed: Review status that has been configured for this requirement.
Custom Properties: Custom properties that have been configured for this requirement.
Document: Source document (if any) from which this requirement was derived.
Created On: Date on which this requirement was created.
Created By: Name of the user who created this requirement.
Changed On: Date on which this requirement was last updated.
Changed By: Name of the user who last updated this requirement.

Related Concepts
Requirements Management
Requirements Reports

Related Procedures
Creating Requirements
Managing Requirements


Requirement Attachments tab


Test Manager > Requirements > Attachments

Attached files and links for selected requirements are listed on the Attachments tab. Attachments are displayed in the order in which they are uploaded, though the list of attachments can be sorted by the Name, Created On, and Created By properties.

Note: To display any attachments that are associated with child requirements of the selected requirement, check the Include Child Attachments check box.

Note: File icons indicate whether documents are directly attached to the selected requirement, or whether they are attached to a child requirement of the selected requirement (only displayed if Include Child Attachments is checked).
Single icon: File is directly attached to the selected requirement.
Double icon: File is attached to a child requirement of the selected requirement.

Name: Name of the attachment.
Size: Size of the attachment.
Description: Description of the attachment.
Created On: When the attachment was created.
Created By: User who created the attachment.
Actions: Actions that can be performed on the attachment (Edit and Delete).

Related Concepts
Attachments

Related Procedures
Managing Requirement Attachments


Assigned Test Definitions tab


Test Manager > Requirements > Assigned Test Definitions

The Assigned Test Definitions tab lists all test definitions that have been assigned to the selected requirement. Each test type is indicated with a different icon (manual, SilkTest, SilkPerformer, 3rd party). All additional test definitions that are available for assignment are displayed in the right-hand Available Test Definitions window.

Note: Newly generated test definitions can automatically be assigned to the requirements from which they are generated by selecting the Assign newly generated test definitions to requirements option on the Generate Test Plans from Requirements dialog box (the default behavior).

Note: The Assigned Test Definitions tab offers two viewing options, controlled by the Full Coverage check box. When checked, the tab additionally displays all test definitions that have been assigned to child requirements of the selected requirement; the default view displays only those test definitions that are assigned directly to the selected requirement.

Assign Saved Selection: Click to assign a selection of test definitions from the Grid View.
Test Definition: Name of the assigned test definition. Click to view and edit the test definition.
Status: Status of the assigned test definition (Passed, Failed, Not Executed).
Last Execution: When the test definition was last executed.
Issues: Issues (if any) that are associated with this test definition.
Actions: Actions that can be taken against this test definition (Delete and Locate).

Related Concepts
Full Coverage and Direct Coverage Modes
Test Plan Generation
Requirements Management

Related Procedures
Assigning Test Definitions to Requirements Manually
Assigning Test Definitions from Grid View to Requirements
Removing Test Definition Assignments
Locating Assigned Test Definitions in the Test Plan Tree
Sorting the Assigned Test Definitions Tab


Requirement Coverage tab


Test Manager > Requirements > Coverage

The Coverage tab (available in Requirements View only) displays basic properties of the selected requirement (or project node) in addition to the status of all tests that have been assigned to the requirement (number and percentage of Passed, Failed, Not Executed, and Not Covered tests). A summary of all assigned tests is listed under Total. To view the status of all tests assigned to child requirements of the selected requirement in addition to all tests directly assigned to the requirement, check the Full coverage check box.

Note: Document View displays the same coverage status information in a heat field chart, with green indicating passed tests, red indicating failed tests, brown indicating tests that have not yet been executed, and gray indicating tests of other status. Requirements that are not covered by test definitions are listed as Not covered.

Note: Test definition totals accumulate to the parent level (for example, requirement totals include test definitions from child requirements; project totals include test definitions from all requirements).

Requirement Name / Project Name: Name of the selected requirement or project.
Priority: Priority that has been assigned to the selected requirement.
Risk: Risk that has been assigned to the selected requirement.
# Requirements (Calc.): Total number of all covered requirements. Not included in this number are uncovered requirements and folders that do not have a test definition assigned to them (for example, folders that inherit the coverage of their child requirements but are not actually a requirement themselves).
# Requirements (Total): Total number of all requirements beneath the selected entity, including folders.
Requirement Status / Project Status: Status of the selected requirement or project.
# Requirements Passed: Total and percentage of requirements in the project that have test definitions that have passed.
# Requirements Failed: Total and percentage of requirements in the project that have test definitions that have failed.
# Requirements Not Executed: Total and percentage of requirements in the project that have test definitions that have not been executed.
# Requirements Not Covered: Total and percentage of requirements in the project that are not covered by test definitions.
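The roll-up of test status to parent requirements can be pictured with a short sketch. The following Python snippet is purely illustrative; the class and field names are hypothetical and are not part of Test Manager. It shows how counts from child requirements accumulate to the parent and how the percentages shown in the heat field chart could be derived.

```python
# Illustrative sketch only: hypothetical data model, not Test Manager code.
from dataclasses import dataclass, field

STATUSES = ("Passed", "Failed", "Not Executed", "Not Covered")

@dataclass
class Requirement:
    name: str
    test_statuses: list = field(default_factory=list)   # statuses of directly assigned tests
    children: list = field(default_factory=list)        # child requirements

def coverage_counts(req):
    """Accumulate status counts from this requirement and all of its child requirements."""
    counts = {status: 0 for status in STATUSES}
    if req.test_statuses:
        for status in req.test_statuses:
            counts[status] += 1
    else:
        counts["Not Covered"] += 1          # a requirement without assigned tests is Not Covered
    for child in req.children:
        for status, n in coverage_counts(child).items():
            counts[status] += n             # totals accumulate to the parent level
    return counts

def coverage_percentages(counts):
    total = sum(counts.values()) or 1
    return {status: 100.0 * n / total for status, n in counts.items()}

# Example: a parent requirement whose totals include its child's tests.
child = Requirement("Login form validation", ["Passed", "Failed"])
parent = Requirement("Authentication", ["Passed"], [child])
print(coverage_percentages(coverage_counts(parent)))
```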

Related Concepts
Requirements Management
Requirements Document View
Full Coverage and Direct Coverage Modes
Requirements Reports

Related Procedures
Assigning Test Definitions to Requirements Manually
Switching Between Full and Direct Coverage Modes


Requirement History tab


Test Manager > Requirements > History

The History tab details the revision history of the selected requirement or project.

Rev: Revision number.
Changed On: Date the revision occurred.
Changed By: User who performed the revision.
Notes: Auto-generated description of the nature of the revision (for example, deleted or created).

Note: When the History tab includes more elements than can be displayed at once without impacting response time, elements are displayed in increments. Page number links at the bottom of the tab allow you to browse through the elements included on the tab one page at a time. To display all elements as a single list, select the [All] link.

Related Concepts
Requirement History

Related Procedures
Tracking the History of a Requirement
Viewing Recent Changes


Test Plan Unit Interface


This section contains information about the user interface elements in Test Manager's Test Plan unit.

In This Section

Test Plan Document View
Document View provides a high-level view of the most important information, the status, and the last executions of all test plans in the selected project.

Test Plan Grid View
Grid View facilitates the filtering, sorting, and grouping of large numbers of test definitions.

Test Plan Properties tab
The Properties tab offers detail on all properties and relevant information for the selected test definition, test folder, test package, or test container.

Test Plan Steps Page
The Steps page lists all manual test steps that are associated with the selected test definition.

Test Plan Contents Tab
The Contents tab (Test Plan Contents) enables you to view, cut, copy, and paste the child elements of any selected test plan element (test container, folder, or test definitions).

Test Plan Attributes tab
The Attributes tab in Test Plan View allows you to see all project attributes that have been assigned to the selected test definition.

Test Plan Parameters tab
The test parameters that have been assigned to the selected test definition can be viewed on the Parameters tab (Test Plan View only).

Test Plan Assigned Requirements tab
The Assigned Requirements tab lists the requirements that have been assigned to the selected test definition or project.

Test Plan Attachments tab
The Attachments tab lists all files and links that have been uploaded as attachments to test containers, folders, and test definitions.

Test Plan Assigned Executions tab
The Assigned Executions tab lists all execution definitions that are assigned to the selected test definition or project.

Test Plan Runs tab
The Runs tab offers a listing of test execution results for the selected test definition.

Test Plan Issues Page
The Issues page enables you to enter and track issues related to the selected test definition.

Test Plan History tab
The History tab details the revision history of the selected test definition, test container, or folder.

Test Plan Data Set tab
The Data Set tab lists all data that have been defined for data-driven testing with this test definition.

Test Plan Toolbar Functions
The Test Plan toolbar provides important commands for managing test plans.

Test Definition Run Results Dialog
The Test Definition Run Results dialog lists run details of a test definition.


Test Plan Document View


Test Manager > Test Plan

Click Document View to access Document View. Document View provides a high-level view of the most important information, the status, and the last executions of all test plans in the selected project.

Status: Displays the status of the last test execution (Passed, Failed, Not Executed, Not Scheduled). For test containers and folders, a number in brackets shows the total of test definitions within the respective container/folder.
Last Execution: Last execution related to the selected test definition or project.
Last Build: Build associated with the last execution.
Changed On: Last time the selected test definition or test plan was changed.
Changed By: User who last changed the selected test definition or test plan.

Related Concepts
Test Plan Management

Related Procedures
Managing Test Plans


Test Plan Grid View


Test Manager > Test Plan > Grid View

The Test Plan unit's Grid View complements the unit's Document View and Test Plan View by facilitating the filtering, sorting, and grouping of large numbers of test definitions. Grid View makes it easier to identify match points between test definitions and find specific test-definition information. Standard Windows keyboard shortcuts can be used, making it easy to select and manipulate specific sets of test definitions within Grid View. You can execute trial runs of tests and link directly to test definitions through context-menu commands that are available on rows within Grid View. You can even create execution definitions by multi-selecting test definitions within Grid View.

Grid View offers a number of view-customization features that can help you better manage large numbers of test definitions. You can display or hide columns, adjust the width of columns, and move columns around using drag-and-drop. To enhance performance when you have numerous test definitions, page views are broken into views of 50 test definitions each. You can advance through pages using the First, Last, Next, and Previous buttons at the bottom of the Grid View, or you can enter a page number into the Page field and press the ENTER key. Sorting, grouping, and filtering functions are available through context-menu commands to help you better organize your test definitions, group test definitions, and identify matching points between test definitions. All of your view-customization preferences are saved along with your project and will be available to you the next time you visit Grid View.

Related Procedures
Working With Test Definitions in Grid View


Test Plan Properties tab


Test Manager > Test Plan > Properties

The Properties tab offers detail on all properties and relevant information for the selected test definition, test folder, test package, or test container. For test definition nodes, the properties listed here are configured when test definitions are created.

Test Definition Name: Name that has been configured for the test definition.
Test Definition ID: Database identifier of this test definition.
Description: Any description that has been configured for the test definition. Test Manager supports HTML formatting and cutting/pasting of HTML content for Description fields.
Status: Status that has been configured for the test definition. For test definitions that are part of a running execution definition, the status is updated in response to the current status of the test run. If the current run is aborted, the status is reset to the status before the run.
Last Execution: Last time this test definition was executed. For test definitions that are part of a running execution definition, the last execution is updated based on the current test run.
Created On: Date this test definition was created.
Created By: Name of the user who created this test definition.
Changed On: Date this test definition was last changed.
Changed By: Name of the user who last changed this test definition.
Planned [hh:mm]: Planned execution time of the test definition. This property is only displayed if a manual test definition is selected.
Test Properties: Test properties are specific to the test type.
Success Conditions: Shows the names of all success conditions that have been configured for the test definition; whether or not each condition is active; the maximal value of each condition; and whether or not each condition is inherited. For test package nodes, all success conditions except the execution time-out are disabled and hidden.
Integration Default Folder: Shows the name of the default container or folder where tests from external RMSs are created.

Related Concepts
Test Plan Management
Test Definitions
Success Conditions

Related Procedures
Editing Test Plan Elements


Test Plan Steps Page


Test Manager > Test Plan > Steps

The Steps page lists all manual test steps that are associated with the selected test definition. You can view, cut, copy, and paste manual test steps on the page. The page supports standard Windows Explorer style multi-select functionality. The page includes the following toolbar items:

New Step: Add a new test step to the end of this test definition.
Insert Step: Insert a new test step into the sequence of this test definition.
Edit: Edit the selected test step.
Delete: Delete the selected test step from the Test Steps list.
Cut: Cut the selected test step from the list and move it to the clipboard.
Copy: Copy the selected test step to the clipboard.
Paste: Paste a copy of the test step held on the clipboard to the row above the selected step in the list.
Move Up: Move the selected test step one position up in the Test Steps list.
Move Down: Move the selected test step one position down in the Test Steps list.
Manage Attachments: Opens the Attachments dialog box, where you can perform the following actions:
  Upload File: Upload a file to the selected test step.
  Attach Link: Attach a link attachment to the selected test step.
  Edit: Edit the file or link attachment.
  Delete: Delete the file or link attachment.

The Steps page shows all steps of the selected test in a table. The table has the following columns:

Order: Number of the step in the execution sequence.
Name: Name of the test step.
Action Description: Action you must perform to execute the test step.
Expected Results: Expected result of the test step.
Attachments: Number of files that are attached to the test step.

Note: When there are more than 200 steps in a test, the steps are displayed in multiple pages. Click the page numbers to access the pages. To display all steps as a single list, click [All].

Related Concepts
Test Plan Management

Related Procedures
Editing Manual Test Steps From Within Test Manager
Managing Test Plans

Related Reference
Multi-Select Functionality for Test Plan Elements


Test Plan Contents Tab


Using the Contents tab (Test Plan Contents), you can view, cut, copy, and paste the child elements of any selected test plan element (test container, folder, or test definition). Standard Windows Explorer style multi-select functionality is supported on the Contents tab.

Tip: To drill down into the selected folder or container, press ENTER or double-click the selected item. Press BACKSPACE or click Up on the toolbar to navigate one level up.

Note: Containers cannot be copied or pasted.

Name: Name of the child test plan element.
Changed On: Date the child test plan element was last edited.
Changed By: User who last edited the child test plan element.

Tip: As with test plan elements listed in the Test Plan tree, elements listed on the Contents tab can be right-clicked to access context-relevant commands through a context menu. Commands that are not available are grayed out. Before you can paste a test plan element into the Contents tab you must explicitly select an element within the tab to gain the application's focus.

Note: When the Contents tab includes more elements than can be displayed at once without impacting response time, elements are displayed in increments. Page number links at the bottom of the tab allow you to browse through the elements included on the tab one page at a time. To display all elements as a single list, select the [All] link.

Related Concepts
Test Plan Tree
Test Plan Management

Related Procedures
Copying, Pasting, and Deleting Test Plan Elements
Managing Test Plans

Related Reference
Multi-Select Functionality for Test Plan Elements


Test Plan Attributes tab


Test Manager > Test Plan > Attributes

The Attributes tab in Test Plan View allows you to see all project attributes that have been assigned to the selected test definition. Attributes are administrator-created characteristics that can be applied to tests. Examples include a platform attribute that can be applied to product components and a priority attribute that can be applied to test definitions.

Attribute: Name of the attribute.
Value: The attribute value that has been assigned.
Type: Attribute type.
Inherited: Indicates whether the attribute is inherited from a parent.

Note: Inheritance of attributes is similar to inheritance of properties and success conditions. Attributes that are assigned to a parent node are inherited throughout all sub-folders and child test definitions.

Related Concepts
Test Plan Management

Related Procedures
Configuring Test Definition Attributes
Creating Test Definitions


Test Plan Parameters tab


Test Manager > Test Plan > Parameters

The test parameters that have been assigned to the selected test definition can be viewed on the Parameters tab (Test Plan View only).

Parameter: Name of the assigned parameter.
Value: The selected parameter value for this test definition.
Type: Parameter type (String, Number, Float, Boolean, Password, or Character).
Inherited: Indicates if the parameter has been inherited from a parent.

Note: Test definition parameters that are contained within a property of a test definition (for example, testdata for SilkTest test definitions) are listed at the top of the Parameters tab. Unused parameters are appended to the bottom of the list and grayed out (analogous to a disabled state).

Note: When the Parameters tab includes more elements than can be displayed at once without impacting response time, elements are displayed in increments. Page number links at the bottom of the tab allow you to browse through the elements included on the tab one page at a time. To display all elements as a single list, select the [All] link.

Related Concepts
Test Definition Parameters
Test Plan Management

Related Procedures
Configuring Test Definition Parameters
Managing Test Plans


Test Plan Assigned Requirements tab


Test Manager > Test Plan > Assigned Requirements

The Assigned Requirements tab lists the requirements that have been assigned to the selected test definition or project.

Requirement: Name of the assigned requirement. Click to open the Requirements Properties page.
Priority: Priority of the requirement.
Risk: Potential risk associated with the requirement.
Reviewed: Review status of the requirement.
Actions: Actions that can be performed on the selected requirement (Remove Requirement, Locate Requirement, and View Description).
Available Requirements: The Available Requirements tree lists all requirements that can be assigned to the selected test definition.

Related Concepts
Test Plan Management

Related Procedures
Assigning Requirements to Test Definitions
Managing Test Plans


Test Plan Attachments tab


Test Manager > Test Plan > Attachments

The Attachments tab lists all files and links that have been uploaded as attachments to test containers, folders, and test definitions.

Name: Name of the attachment.
Size: Size of the attached file.
Description: Description that has been defined for the attachment.
Created On: When the attachment or link was uploaded.
Created By: User who uploaded the attachment.
Actions: Actions that can be taken on the attachment (Edit or Delete).
Include Child Attachments: Check this check box to additionally display all attachments of child test definitions, folders, and test containers of the selected node.

Related Concepts
Attachments
Test Plan Management

Related Procedures
Attaching Files to Test Plan Elements
Working with Attachments


Test Plan Assigned Executions tab


Test Manager > Test Plan > Assigned Executions

The Assigned Executions tab lists all execution definitions that are assigned to the selected test definition or project.

Execution Definition: Name of the assigned execution definition. Click to view or edit the execution definition.
Type: Execution definition type (manual or automated).
Last Execution: Last time the test definition was executed as part of the assigned execution definition.
Next Execution: Next scheduled execution of the test definition as part of the assigned execution definition.

Related Concepts
Test Plan Management
Test Definition Execution

Related Procedures
Manually Assigning Test Definitions to Execution Definitions
Managing Test Plans
Executing Test Definitions


Test Plan Runs tab


Test Manager > Test Plan > Runs

The Runs tab is available on test-definition nodes in Test Plan View and offers a listing of test execution results for the selected test definition. The data grid representation of the Runs tab facilitates the filtering, sorting, and grouping of large numbers of test definition runs. To compare two test definition runs, use your keyboard's CTRL and SHIFT keys to select the two runs, then right-click your selection and choose Reports > Test Definition Run Comparison.

Actions: Actions that you can perform on the test definition run.
  Create a new issue for this test definition: Click to open the New Issue dialog and create a new issue for the test definition.
Run Type: The Run Type column shows the test definition type during each run. The test type might change between two runs, for example when you convert the test from manual to automated.
Run ID: The ID of the test definition run. Click to open the Test Definition Run Results dialog box. If the test definition is running, click to view details of the execution.
Start Time: Time the run started. If the test is a manual test and currently running, Test Manager adds (Running) to the date and time.
Execution Definition Name: The name of the assigned execution definition, or unassigned tests if the execution was a try-run or results were uploaded. Click to open the execution definition.
Status: Status of the execution. For test definitions that are part of a running execution definition, the status is updated in response to the current status of the test run. If the current run is aborted, the status is reset to the status before the run.
Issues Found: Displays the number of issues that are assigned to the test definition run. When no issues are assigned to the test definition run, the column is empty. Click the link to access the issue in the Issues page of the Test Plan unit.
Executed By: The execution server from which the test was run.
Errors: Number of errors that were generated during the run.
Warnings: Number of warnings that were generated during the run.
Version: Version that the test was run against.
Build: Build number that the test was run against.

Related Concepts
Test Definition Run Comparison Report

Related Procedures
Test Definition Execution

Related Reference
Test Definition Run Results Dialog


Test Plan Issues Page


Test Manager > Test Plan > Issues

The Issues page enables you to enter and track issues related to the selected test definition, container, or folder.

Actions: Actions that can be performed on the issue.
Issue ID: ID that has been automatically assigned to the issue.
Assigned Test Definition: Test definition that has been assigned to the issue. This column is only displayed if the currently selected object is a container or a folder.
Synopsis: Synopsis of the issue.
Status: Status of the issue.
External ID: Indicates if the issue is tracked by an external issue tracking system. Click an external issue number to link directly to the external issue tracking system.
Test Definition Run: The ID of the test definition run that the issue is assigned to. Click the ID to access the Details page of the Test Definition Run Results dialog box in the Execution unit.
Created On: When the issue was created.
Created By: User who created the issue.
New Issue: Assign a new issue to the selected test definition. This button is only displayed if the currently selected object is a test definition.
Assign External Issue: Assign an issue from an external issue tracking system to the selected test definition. This button is only displayed if the currently selected object is a test definition.

Related Concepts
SilkCentral Issue Manager

Related Procedures
Creating New Issues
Working with Issues


Test Plan History tab


Test Manager > Test Plan > History

The History tab details the revision history of the selected test definition, test container, or folder.

Rev: Revision number.
Changed On: Date the revision occurred.
Changed By: User who performed the revision.
Notes: Auto-generated description of the nature of the revision (for example, deleted or created).

Note: When the History tab includes more elements than can be displayed at once without impacting response time, elements are displayed in increments. Page number links at the bottom of the tab allow you to browse through the elements included on the tab one page at a time. To display all elements as a single list, select the [All] link.

Related Procedures
Tracking Test Plan History
Viewing Recent Changes


Test Plan Data Set tab


Test Manager > Test Plan > Data Set

The Data Set tab lists all data that have been defined for data-driven testing with this test definition. The Filter query row at the top of the tab (see the UI item descriptions below) shows you the filter value that has been defined for this data set. The values of the configured data set are displayed below this row.

Note: Data sets and filters are defined through the SilkCentral Administration Module.
Property: Filter element. This field typically has a value of Filter query.
Value: Filter value used to filter the contents of the configured data set.
Inherited: Indicates whether or not the filter was inherited from a parent test container or test definition.
Actions: Actions that can be performed on the filter (Edit or Delete).
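To picture how a filter value narrows a data set down to the rows a data-driven test actually runs against, consider the following sketch. It is purely illustrative: the data-set rows, column names, and the simple equality filter are hypothetical and do not reflect Test Manager's actual filter syntax, which is configured in the SilkCentral Administration Module.

```python
# Illustrative sketch only: hypothetical data set and filter, not Test Manager's filter syntax.
data_set = [
    {"browser": "IE7", "locale": "en", "user": "alice"},
    {"browser": "Firefox", "locale": "de", "user": "bob"},
    {"browser": "IE7", "locale": "de", "user": "carol"},
]

def apply_filter(rows, column, value):
    """Keep only the rows whose column matches the configured filter value."""
    return [row for row in rows if row.get(column) == value]

# A data-driven test configured with the (hypothetical) filter "browser = IE7"
# would be instantiated once per remaining row.
for row in apply_filter(data_set, "browser", "IE7"):
    print("run test instance with data:", row)
```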

Note: When the Data Set tab includes more elements than can be displayed at once without impacting response time, elements are displayed in increments. Page number links at the bottom of the tab allow you to browse through the elements included on the tab one page at a time. To display all elements as a single list, select the [All] link.

Related Concepts
Data-Driven Tests

Related Procedures
Creating Test Definitions
Working with Data-Driven Tests
Managing Test Plans


Test Plan Toolbar Functions


Test Manager > Test Plan

The Test Plan toolbar provides important commands for managing test plans.

Test Plan View, Document View: Toggles between Test Plan View and Document View.
Up: Navigates one level up in the hierarchy of the navigation tree, regardless of the current cursor focus.
New Test Container, New Test Folder, New Test Definition: Enables creation of new test containers, test folders, and test definitions.
Edit, Delete: Edit and deletion of test plan elements.
Cut, Copy, Paste: Cut, copy, and paste of test plan elements.
Move Up, Move Down: Move test plan elements up or down within the Test Plan tree view.
Find/Replace: Find enables you to search through all test plan elements in the active project based on configurable parameters. Replace enables you to optionally replace instances of found values with a new value.
Filtering commands: Test Plan View filtering options.
Show changes/Acknowledge changes: Show recent changes and acknowledge changes to the test plan.

Related Concepts
Test Plan Management
Test Plan Document View

Related Procedures
Managing Test Plans
Test Definition Execution


Test Definition Run Results Dialog


Test Manager > Execution > Runs

The Test Definition Run Results dialog lists run details of a test definition. The Test Definition Run Results dialog can be accessed from the following locations within Test Manager:

Test Manager > Test Plan, Runs tab: click a Run ID.
Test Manager > Execution, Runs tab: click an assigned test definition.
Test Manager > Projects, Activities page: click an execution definition in Last Executions, then an assigned test definition.

Details: Shows the details of the test definition run, including its Duration, Execution Path, the Execution Definition Run ID of the execution definition run that included the test definition run, and any Warnings/Errors. This tab also allows you to change the status of the test definition run, which is useful if you need to manually overrule the status of a test run. Clear the Hide Passed check box below the Assigned Test Definitions in the Execution Definition Run Results dialog to show all test definitions; by default only the not passed test definitions are shown, which enhances performance because only a part of the total test definitions have to be displayed, and the information presented is of more use to the viewer. All parent nodes are displayed with the full status information. When a manual status change is performed, the details of the change are reflected in this tab's Status, Status Changed On, Status Changed By, Previous Status, and Status Change Comment fields.
Specific: Only displayed for SilkTest, SilkPerformer, and manual test definitions. This tab includes details that are specific to the selected test definition type. For example, when a SilkTest test definition is selected, this view includes the selected test case, test data, and any warnings that were displayed during the test run.
Files: Lists all files that were generated by this test run, along with file sizes. The names of SilkTest .rex files act as download links. Once downloaded, these files can be viewed directly in a text editor. The upper table lists files that are associated with the test definition, such as result files or manually uploaded files for manual test definitions. The lower table lists files that are associated with the execution definition, for example execution log files or code analysis results. This tab also contains a Download All Files button, which downloads all result files generated by the test definition run as a zipped package.
Messages: Lists all messages that were generated by this test run, along with the severity of the messages. Messages that are associated with an execution definition as a whole, and not with one of the individual test definitions, can be viewed in the Projects unit (Activities tab/Messages tab).
Success Conditions: Only displayed for automated test definitions. This tab shows all the success conditions that were defined for the test during the test planning process (Test Plan unit, Properties tab) and the result values from the execution run. Success conditions are used to determine if a test is successful or if it has failed.
Data Driven: Only displayed for data-driven test definitions using the option of having a single test definition for all data rows of the data set. This tab lists the status of each instance (data row) run of the test definition. Clicking an instance brings up another instance of the Test Definition Run Results dialog with run details of the selected instance.
Attributes: Any attributes that have been configured for the test definition.
Parameters: Any parameters that have been configured for the test definition.

The following table lists the UI elements that are used to step through the test definition results of an execution run. These elements are only visible when accessing the Test Definition Run Results dialog from an execution definition.
Skip Passed: Used to determine which test definition run results should be displayed when browsing using the Previous Result and Next Result buttons. Checking this option only displays test definitions with a status other than Passed.
< Previous Result: Jumps to the result details of the previous test definition in the selected execution definition run.
Next Result >: Jumps to the result details of the next test definition in the selected execution definition run.

Related Concepts
Execution Dependency Configuration

Related Procedures
Viewing Test Execution Details
Configuring Execution Dependencies

Related Reference
Activities Page
Execution Runs Tab
Test Plan Runs tab
Execution Definition Run Results Dialog


Execution Unit Interface


This section contains information about the user interface elements in Test Manager's Execution unit.

In This Section

Execution Document View
Document View provides a high-level, read-only view of properties for all executions in the selected project.

Execution Properties tab
The Properties tab lists basic properties of selected execution definitions.

Execution Assigned Test Definitions Tab
The Assigned Test Definitions page lists all test definitions that have been assigned to the selected execution definition.

Execution Setup/Cleanup tab
The Setup/Cleanup tab lists the setup and cleanup execution definitions that have been defined for this execution definition.

Execution Schedule tab
The Schedule tab is used to define schedules for execution definitions.

Execution Deployment tab
The Deployment tab displays all of the hardware-provisioning keywords that have been defined for the execution definition. It also displays the users who are assigned to execute manual tests and the SilkTest AUT hosts that are assigned to execute SilkTest tests.

Execution Dependencies tab
The Dependencies tab lists dependent execution definitions and master execution definitions of the selected execution definition.

Execution Notifications Page
The Notifications page allows you to check whether you want to be notified when the execution status changes.

Execution Runs Tab
The Runs page shows statistics for the runs of the selected execution definition.

Current Run Page
The Current Run page shows all information related to the active run of the selected manual test.

Run Dialog
The Run dialog box enables you to specify which test definitions you want to execute based on filter criteria and to specify which product build the test should be run against.

Execute Test Dialog Box
Displays when executing a manual execution definition in the Manual Testing Client.

Test Definition Run Results Dialog
The Test Definition Run Results dialog lists run details of a test definition.


Execution Document View


Test Manager > Execution

Click Document View to access Document View. The Execution unit offers two views of execution properties: Document View and Execution View. Document View provides a high-level, read-only view of properties for all executions in the selected project. The properties on this tab are defined when execution definitions are created.

Status: Status of the execution.
Build: Build that the execution was based on.
Version: Version that the execution was based on.
Product: Product that the execution was based on.
Priority: Priority of the execution.
Last Execution: Last time the execution occurred.
Duration: Duration of the execution.
Next Execution: Next scheduled execution.
Test Container: Test container containing the test definition that this execution is based on.

Related Concepts
Test Definition Execution

Related Procedures
Analyzing Test Runs


Execution Properties tab


Test Manager > Execution > Properties

The Properties tab lists basic properties that are relevant to the selected project, folder, or execution definition.

Execution Definition Name: Name of the execution definition.
Execution Definition ID: Database identifier of the execution definition.
Description: Meaningful description of the execution definition.
Test Container: Test container the execution definition is associated with. Click to access the test container in the Test Plan unit.
Version: Product version the execution definition is associated with.
Build: Product build the execution definition is associated with.
Priority: Priority of the execution definition.
SilkTest AUT Host Name: Hostname of the application under test (for SilkTest tests only).
Test Definitions: Test definitions associated with this execution definition.
Source Control Label: In the Source Control Label field you can optionally specify that the execution definition be of an earlier version than the latest version. The label must refer to a version in the source control system that the test container is associated with. If this field is left blank, the latest version of the execution definition will be fetched. The Source Control Label property is only enabled if the associated test container uses a source control profile that supports versioning. Make sure to have enough free disk space on the execution server or servers when working with multiple versions of source files. Each version will be saved in its own folder on every execution server.
Status: Reflects whether the test definitions of the latest run of an execution definition Passed, Failed, or were Not Executed.
Last Execution: Last time this execution definition was executed.
Duration: Length of time required for execution of an execution definition. In the simplest case (automated tests on a single execution server, or only manual tests), the duration is the time displayed for the latest run on the Runs tab. If the last execution involved automated tests that were executed on more than one execution server, the duration on the server on which execution lasted the longest is considered. If the last execution involved both automated and manual tests, only the automated or the manual tests are considered, depending on which last the longest, because automated and manual tests are executed in parallel. If the execution definition contains multiple test definitions, the duration is measured from the time when the first test definition begins executing and ends when the last test definition completes execution. This includes the overhead time that is needed for stopping and starting test definitions between executions. If an execution definition contains only a single test definition, this overhead is not included in the duration.
Next Execution: Next time this execution definition will be executed.
Created On: Time the execution definition was created.
Created By: User who created the execution definition.
Changed On: When the execution definition was last changed.
Changed By: User who last changed the execution definition.
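The duration rules above can be summarized in a short sketch. The following Python snippet is illustrative only; the run data and function names are hypothetical and are not Test Manager code. It shows how the reported duration could be derived as the longest per-server duration for automated tests, compared against the manual-test duration, since the two run in parallel.

```python
# Illustrative sketch only: hypothetical run data, not Test Manager code.
automated_durations_by_server = {   # seconds spent per execution server
    "exec-server-01": 540,
    "exec-server-02": 720,
}
manual_duration = 600               # total seconds spent on manual tests

def execution_definition_duration(automated_by_server, manual):
    """Automated and manual tests run in parallel, so the longer side determines the duration;
    across execution servers, the slowest server is the one that counts."""
    automated = max(automated_by_server.values(), default=0)
    return max(automated, manual)

print(execution_definition_duration(automated_durations_by_server, manual_duration))  # 720
```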


Related Concepts
Test Definition Execution

Related Procedures
Adding Execution Definitions
Working with Execution Definitions
Executing Test Definitions


Execution Assigned Test Definitions Tab


Test Manager > Execution > Assigned Test Definitions

The Assigned Test Definitions page lists all test definitions that have been assigned to the selected execution definition. Use this page to assign additional test definitions to the execution definition, to remove test definitions from the execution definition, or to change the execution order of the assigned test definitions. The following items are displayed on the Assigned Test Definitions page:

Note: All changes on this page are applied only when you click Apply.

Manual assignment: Click the Manual assignment option button to manually assign test definitions to the execution definition.
Use test plan order: Check to set the execution order of the assigned test definitions to follow the execution order in Test Manager's Test Plan unit.
Assign Saved Selection: Click to assign a selection of test definitions from the Grid View.
Assignment by filter: Click the Assignment by filter option button to automatically assign test definitions to the execution definition based on a pre-defined filter. The available filters are listed in the list box.
Assigned Test Definitions List: The following properties are shown for each assigned test definition:
  Order: The execution order of the test definition. The Use test plan order check box must be unchecked to change the execution order. Click in the text box, type the new order of the test definition, and then press Enter to confirm the change. Each change in each of the text boxes must be confirmed by pressing Enter. If you change the order of multiple test definitions without pressing Enter each time, only the last change made before pressing Enter is applied.
  Test Definition: Name of the test definition. Click the name to access the test definition in the Test Plan unit.
  Status: Status of the last run of the test definition in the context of the execution definition. When the test definition is executed outside of the context of the execution definition, the displayed status remains unchanged. If the test definition has not yet been executed in the context of the execution definition, the status is Not Scheduled.
  Last Execution: Date and time of the last run of the test definition in the context of the execution definition. When the test definition is executed outside of the context of the execution definition, the displayed time and date remain unchanged.
  Actions: The following actions can be performed on the assigned test definitions when the Manual assignment option button is selected:
    Remove: Click to remove the selected test definition from the list.
    Locate: Click to locate the selected test definition in the test plan tree.
Available Test Definitions: This window shows all test definitions in the test plan tree that are available for assignment to the selected execution definition. Use the arrows to assign the test definitions to the execution definition. For information about inserting multiple test definitions from Test Manager's Test Plan unit to the execution definition, see Assign Test Definitions from Grid View to Execution Definitions.

Related Concepts
Test Definition Execution

Related Procedures
Assigning Test Definitions to Execution Definitions
Working with Execution Definitions
Executing Test Definitions
Assign Test Definitions from Grid View to Execution Definitions


Execution Setup/Cleanup tab


Test Manager > Execution > Setup/Cleanup

The Setup/Cleanup tab lists the setup and cleanup execution definitions that have been defined for this execution definition.

Note: When failed tests are rerun, the corresponding setup/cleanup routines are also rerun.

Note: Setup/cleanup test definitions are not run with Try Run test runs because such executions do not rely on execution definitions.

Test Definition (Setup Test Definition): Name of the configured setup test definition.
Edit (Setup Test Definition): Opens the Edit Setup Test Definition dialog box where you can select a setup test definition. A test definition cannot be simultaneously assigned to the same execution definition as both a setup test definition and a regular or cleanup test definition. Assigned test definitions can come from any test container within your project. It is therefore possible to assign test definitions that have associated products and source control profiles that vary from their host execution definitions.
Test Definition (Cleanup Test Definition): Name of the configured cleanup test definition.
Edit (Cleanup Test Definition): Opens the Edit Cleanup Test Definition dialog box where you can select a cleanup test definition. A test definition cannot be simultaneously assigned to the same execution definition as both a setup test definition and a regular or cleanup test definition. Assigned test definitions can come from any test container within your project. It is therefore possible to assign test definitions that have associated products and source control profiles that vary from their host execution definitions.

Related Concepts
Setup and Cleanup Test Definitions
Test Definition Execution

Related Procedures
Configuring Setup and Cleanup Executions


Execution Schedule tab


Test Manager > Execution > Schedule

The Schedule tab is used to define schedules for execution definitions.

None: Click this option button if no schedule is to be defined for the execution definition.
Global: Click this option button to select a pre-defined schedule from the list box for the execution definition. Selecting a global schedule includes the schedule exclusions and definite runs which are defined in the global schedule. See the SilkCentral Administration Module Help for information on defining global schedules. Selecting a global schedule displays the schedule details below the Custom option button.
Custom: Click this option button to define a custom schedule for the execution definition. Click Edit to edit the custom schedule in the fields below.
Schedule details area: The bottom part of this page displays the schedule details of the selected global schedule or the custom schedule. If a custom schedule is selected, the fields are editable.
From: Specify when the execution schedule is to begin (Month, Day, Year, Hour, Minute). Click next to the specified date to access the calendar tool.
Interval: Specify the interval at which the execution's tests are to be executed (Day, Hour, Minute).
Adjust schedule to daylight savings: Check this check box to automatically have your schedule adjust to daylight savings time. Note that daylight adjustment only works for intervals of two-hour multiples, to avoid duplicate runs when setting time back one hour.
Run: In the Run portion of the GUI, specify when the execution is to end:
  Forever: Click this option button to specify that the execution is to have no end.
  Time(s): Click this option button and select a number from the list box to define a specific number of executions.
  Until: Click this option button to pick a specific time at which test executions are to end. Click next to the specified date to access the calendar tool.
Exclusion: A schedule exclusion is a regularly occurring time period during which executions should be suspended (for example, weekly planned system downtime or weekends). You can add as many schedule exclusions as are required. To define an exclusion, click the Add Exclusion button. Place check marks next to the days for which the exclusion should be in effect. Using the From list boxes, select the hour and minute when the exclusion should begin. Using the To list boxes, select the hour and minute when the exclusion should end. Click OK to save your changes, or click Cancel to abort.
Definite Runs: A definite run is an execution that you schedule to run independent of the configured schedule. You can add as many definite runs as are required. To add a definite run, click Add Definite Run. Click next to the specified Run at date to access the calendar tool and specify when the definite run is to take place. Click OK to save the definite run, or click Cancel to abort.
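The interplay of interval and exclusions can be illustrated with a small sketch. The following Python snippet is illustrative only; the weekday set, times, and function names are hypothetical and are not Test Manager code. It shows how a scheduler could skip runs whose start time falls inside a weekly exclusion window.

```python
# Illustrative sketch only: hypothetical schedule data, not Test Manager code.
from datetime import datetime, timedelta

# Weekly exclusion: suspend executions on Saturday and Sunday between 00:00 and 06:00.
EXCLUDED_WEEKDAYS = {5, 6}           # Monday = 0 ... Sunday = 6
EXCLUSION_START = 0                  # hour of day
EXCLUSION_END = 6

def is_excluded(run_time):
    """True if the run start time falls inside the weekly exclusion window."""
    return (run_time.weekday() in EXCLUDED_WEEKDAYS
            and EXCLUSION_START <= run_time.hour < EXCLUSION_END)

def scheduled_runs(start, interval, count):
    """Yield the first `count` scheduled run times that are not excluded."""
    run_time, produced = start, 0
    while produced < count:
        if not is_excluded(run_time):
            yield run_time
            produced += 1
        run_time += interval

# A two-hour interval starting Friday evening skips the excluded weekend slots.
for run in scheduled_runs(datetime(2009, 6, 5, 22, 0), timedelta(hours=2), 5):
    print(run)
```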

Warning: If test definitions assigned to an execution definition schedule are not executed, this might be caused by too many running tests. In such a case, test definitions that are already included in a schedule are not executed when triggered manually or by a schedule. Click the Application Server Log tab in Administration > Reports to view the application server log file. If there is a warning in the log file that states that the schedule interval might be too short, increase the schedule interval.

Related Concepts
Execution Definition Schedules

Related Procedures
Creating a Custom Schedule for an Execution Definition
Specifying Global Schedules for Execution Definitions
Defining Execution Definition Schedules


Execution Deployment tab


Test Manager > Execution > Deployment

The Deployment tab displays all of the hardware-provisioning keywords that have been defined for this execution definition. These keywords are used to describe the execution environment requirements for the execution definition. An execution server only matches the selected automated execution definition if it has all of the keywords that the execution definition requires. The Deployment tab also displays the users who are assigned to execute manual tests, as well as the SilkTest AUT hosts that are assigned to execute SilkTest tests.

Note: New execution servers are set up in the SilkCentral Administration Module (Locations link). See SilkCentral Administration Module documentation for details.

Note: See SilkCentral Administration Module documentation for details on configuring Test Manager's integration with VMware Lab Manager.

Keywords: Lists the keywords that have been assigned to this execution definition. For automated execution definitions, keywords are used to automatically identify an appropriate execution server for each test execution. For manual execution definitions, keywords are used by the manual tester to reflect the test environment. Click Edit to edit this execution definition's keywords.
Matching execution servers: Lists the active execution servers that have keyword lists that match the keyword list of this execution definition. All keywords in the keywords list of the execution definition must be included in the keyword list of the execution server. Click the name of an execution server in the list to access the execution server list in Administration > Locations.
Capturing Options: The following VMware LiveLink capturing options are available for VMware Lab Manager configurations. Note that only VMware Lab Manager configurations are captured; LiveLink URLs are attached to execution definition results (as links on the Messages tab and as separate HTML files that contain the LiveLinks).
  Never: Don't capture configurations.
  Immediately on error: Once a failed test definition is completed, no further test definitions are executed and the configuration is captured.
  After completing all test definitions: Upon failure conditions, continue test execution and capture the configuration after executing all tests of the execution definition.
  Always: Capture the configuration with each run of the test execution.
Manual Testers: Lists all manual testers who have been assigned to this execution definition or folder. Click Edit to edit the list of manual testers.
SilkTest AUT Hostname: Lists all SilkTest AUT hosts that have been defined for this execution definition. Click Edit to edit the list of SilkTest AUT hosts.
Code Analysis Settings: Details code-analysis settings that have been defined for this execution definition. Click the Inactive link to enable code analysis for this execution definition. For virtual execution on VMware Lab Manager configurations, the internal IPs of the affected machines within the configuration must be configured here.
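The keyword-matching rule described above (every keyword required by the execution definition must also be present on the execution server) is essentially a subset test. The following Python sketch is illustrative only; the keyword and server names are hypothetical and are not drawn from Test Manager.

```python
# Illustrative sketch only: hypothetical keywords and servers, not Test Manager data.
execution_definition_keywords = {"Win2003", "IE7", ".NET2.0"}

execution_servers = {
    "exec-server-01": {"Win2003", "IE7", ".NET2.0", "Office2007"},
    "exec-server-02": {"WinXP", "IE7"},
}

def matching_servers(required_keywords, servers):
    """A server matches only if it carries every keyword the execution definition requires."""
    return [name for name, keywords in servers.items()
            if required_keywords <= keywords]        # subset test

print(matching_servers(execution_definition_keywords, execution_servers))
# ['exec-server-01']
```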


Related Concepts
Execution Definitions
VMware Lab Manager Virtual Configurations
Test Definition Execution

Related Procedures
Assigning Keywords to Execution Definitions
Analyzing Code Coverage
Configuring Deployment Environments


Execution Dependencies tab


Test Manager > Execution > Dependencies

The Dependencies tab lists the dependent execution definitions and master execution definitions of the selected execution definition. The tab is divided into two parts. For the selected execution definition, it shows both the Master Execution Definitions (the execution definitions for which a specific Passed/Failed/Not Executed condition triggers the selected execution definition) and the Dependent Execution Definitions (the execution definitions that will be triggered if the selected execution definition results in a specific Passed/Failed/Not Executed condition). A sketch of this trigger rule follows the related topics below.

Name (Master Execution Definitions): Name of the master execution definition that the selected execution definition is dependent upon. The read-only Master Execution Definitions portion of the tab includes the Name of all master executions of the selected execution definition. The specific Condition of each master execution definition that triggers execution of the selected execution definition is also listed.
Condition (Master Execution Definitions): Condition of the master execution definition that must be met for the selected execution definition to be triggered.
Name (Dependent Execution Definitions): Name of the dependent execution definition that the selected execution definition serves as the master of. The Dependent Execution Definitions portion of the tab includes the Name of all execution definitions that are dependent on the selected execution definition. The specific Condition of the selected execution definition that triggers execution of each dependent execution definition is listed. The execution server where each dependent execution definition is to be executed is also listed.
Condition (Dependent Execution Definitions): Condition of the selected execution definition that must be met for the dependent execution definition to be triggered.
Execution Server / Manual Tester: Execution server where the dependent execution definition is to be run (or, in the case of a manual test execution, the manual tester who is to perform the manual test).
Actions: Actions that can be performed on the selected dependency (Edit or Delete).

Related Concepts
Execution Dependency Configuration
Execution Definitions

Related Procedures
Configuring Execution Dependencies
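The following Python sketch is illustrative only; the execution names and the helper function are hypothetical and are not Test Manager APIs. It shows the dependency rule in miniature: a dependent execution definition is triggered only when its master finishes with the configured condition.

```python
# Illustrative sketch only: hypothetical dependency data, not Test Manager code.
# Each dependency: (master execution definition, triggering condition, dependent execution definition)
dependencies = [
    ("Smoke Tests", "Passed", "Full Regression"),
    ("Smoke Tests", "Failed", "Diagnostics Run"),
]

def triggered_executions(master_name, master_result, deps):
    """Return the dependent execution definitions to start after a master run finishes."""
    return [dependent for master, condition, dependent in deps
            if master == master_name and condition == master_result]

print(triggered_executions("Smoke Tests", "Passed", dependencies))   # ['Full Regression']
print(triggered_executions("Smoke Tests", "Failed", dependencies))   # ['Diagnostics Run']
```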


Execution Notifications Page


Test Manager > Execution > Notifications

The Notifications page includes check boxes that allow you to specify whether you want to be notified based on the outcome of the execution. Notification only works if an email server has been configured by your administrator. You also have to specify an email address for your account in Administration > Users > Accounts. If notification has not been enabled, refer to the SilkCentral Administration Module Help or contact your administrator.

Execution definition runs finishing successfully: Check to receive a notification email each time an execution run finishes successfully.
Execution definition runs finishing with not passed test definitions: Check to receive a notification email each time an execution finishes with status not executed or failed.
Execution definition runs finishing with changed number of not passed test definitions: Check to receive a notification email each time the number of failed or not executed tests changes in comparison to the previous run, when an execution finishes.
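The third option reacts to a change between consecutive runs rather than to the outcome itself. The following Python snippet is illustrative only; the run statistics and function names are hypothetical and are not Test Manager code. It sketches that comparison.

```python
# Illustrative sketch only: hypothetical run statistics, not Test Manager code.
previous_run = {"passed": 40, "failed": 3, "not_executed": 2}
current_run = {"passed": 41, "failed": 4, "not_executed": 0}

def not_passed(run):
    return run["failed"] + run["not_executed"]

def should_notify_on_change(previous, current):
    """Notify when the number of failed or not executed tests differs from the previous run."""
    return not_passed(previous) != not_passed(current)

print(should_notify_on_change(previous_run, current_run))   # True (5 -> 4)
```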


Execution Runs Tab


The Runs page shows statistics for all runs of the selected execution definition. The page is split into two sections: the first lists the execution definition runs, and the second lists the test definition runs for the execution definition run selected in the first section.

The grid views on the Runs page offer view settings that are configurable on a per-user basis, including resizing and reordering of columns, filtering, sorting, and grouping. You can display or hide columns, adjust the width of columns, and move columns by clicking a column and dragging it to the desired location. You can also use the keyboard to navigate through the runs.

To delete the run results of an execution definition, right-click the execution definition in the execution tree and choose Delete Results.... To open the Execution Definition Run Results dialog, right-click a run in the grid and choose View Details. To compare two execution definition runs, use the CTRL and SHIFT keys to select the two runs, then right-click the selection and choose Reports > Execution Definition Run Comparison....

The execution definition runs section lists the runs of the selected execution definition. The section is paged, with fifty runs shown per page. Use the arrow buttons to navigate through the pages. The following items are shown for each run:
Actions: Actions that you can perform on the execution definition run.
  Delete Run Results: Click to delete the results of this run. When you delete the results for selected runs, Test Manager removes the runs from the Runs page. The runs are grayed out until the background process completes the deletion. Alternatively, use the DELETE key on your keyboard to delete the test run results.
  View Manual Test Results: Click to view the Current Run page in read-only mode.
Run ID: Identifier of the execution definition run. Click to access the results of the run.
Status: Status summary of the run. A bar shows the number of passed, failed, and not executed test definitions. The run status of each assigned test definition is shown in the second section.
Keywords: Keywords assigned to the execution definition.
Executed By: Name of the execution server on which the run was executed. For manual test definitions, the name of the person who executed the run is listed.
Errors: Number of errors that occurred in the run.
Warnings: Number of warnings that occurred in the run.
Product: The application under test.
Version: Version of the product. This information can be set in Administration > Configuration > Products.
Build: Build number of the product version. This information can be set in Administration > Configuration > Products.
Start Time: Time the run started.
Duration: Duration of the test run in hh:mm:ss.
Start Type: Shows how the test run was started: manually, through a Web Service, or from a schedule.
Starter Name: Name of the schedule, tester, or Web Service user that started the run.
Start Scope: The scope specified in the Run dialog box.

The test definition runs section lists the test definition runs for the selected execution definition run. The section is paged, with fifty runs shown per page. Use the arrow buttons to navigate through the pages. The following items are shown for each run:

Actions: Actions that you can perform on the test definition run.
  View or download results: If the test definition to which the run belongs is of a test-definition type that generates result files, click the icons to view or download the result files.
  Create a new issue for this test definition: Click to open the New Issue dialog box and create a new issue for the test definition.
Run ID: Identifier of the test definition run. Click to open the Test Definition Run Results dialog box.
ID: Identifier of the test definition. This column is hidden by default.
Test Definition: Name of the test definition. Click to access the test definition in the Test Plan unit. The icon corresponds to the test type.
Start Time: Date and time the run started.
Status: Status summary of the run. For a single test definition, a single status is shown. For a test package or suite node, a bar shows the number of passed, failed, and not executed test definitions.
Executed By: Name of the execution server on which the run was executed. For manual test definitions, the name of the person who executed the run is listed.
Issues Found: Number of issues that are assigned to the test definition run. When no issues are assigned to the test definition run, the column is empty. Click the link to access the issue on the Issues page of the Test Plan unit.
Errors: Number of errors that occurred in the run.
Warnings: Number of warnings that occurred in the run.

Related Concepts
Test Definition Execution
Execution Definition Run Results Dialog
Execution Definition Run Comparison Reports

Related Procedures
Analyzing Test Runs
Changing the Status of a Test Execution Run
Deleting Individual Test Run Results
Deleting the Results of an Execution Definition

Related Reference
Activities Page
Test Definition Run Results Dialog


Current Run Page


To access the Current Run page, click the desired execution definition in the execution tree and choose the Current Run tab. The Current Run page shows information about the active manual test run and lets you see who is working on which test step. You can execute the manual test run with a few clicks.

The page features two grid views, Assigned Test Definitions and Test Steps. Assigned Test Definitions shows information about the active manual test run, and Test Steps shows information for each step in the manual test. You can filter the test definition runs in the Assigned Test Definitions view by selected columns by right-clicking the desired column. When multiple runs of a manual test definition are in progress, the run that was started first is shown. Three additional views display further information on the execution definition, the active test definition, and the selected test step.

The two grid views offer view settings that are configurable on a per-user basis. Display or hide columns, adjust the width of columns, and move columns by clicking a column and dragging it to the desired location. Use the CTRL and SHIFT keys to select multiple test definitions or test steps using standard browser multi-select functions, then right-click the selection to set the status of the selected test steps.

You can perform the following general actions on the Current Run page:
Reload: Click to reload the Current Run page.
Synchronize Run: Click to update the tasks shown on the Current Run page. The following items are updated:
  Assigned test definitions: Only when you start the run with scope Run all Tests and click Synchronize Run, test definitions that are newly assigned to the execution definition are shown, and test definitions that are no longer assigned to the execution definition are removed, provided that those test definitions have not already been started.
  Test properties: The name, the description, the attachments, and the other test properties of the assigned test definitions.
  Test step properties: The name, the description, the attachments, and the other test-step properties of the steps in the assigned test definitions.
Finish Run: Click this button to finish the test execution and to open the Finish Run dialog box. From the Build list box, choose the appropriate build; the build of the execution definition on which the manual test was started is preselected. If the execution definition run includes test definitions with status Not Executed, you can choose an action to perform from the Status list box. The following elements are available in the dialog box:
  Remove test definitions from this run: Removes all test definitions with status Not Executed from the execution definition run. This action is only available if the run includes tests with status Not Executed.
  Set to Passed, Failed, Unsupported, or Unresolved: Sets all test definitions with status Not Executed to the selected status. This action is only available if the run includes tests with status Not Executed.
  Select Build: Select the build number for the execution definition run from the list box.

The Execution Definition Run Details view displays the following information on the execution definition run to which the active manual test run is assigned:

The ID of the execution definition run.
The name of the execution definition.
The keywords assigned to the execution definition run.
The version of the execution definition's product.
The build of the execution definition's product.
The run type of the execution definition run.
The start type of the execution definition run.
The name of the tester, schedule, or Web Service user that started the execution definition run.
The start scope of the execution definition run, specified in the Run dialog box.
The start time of the execution definition run.
Note: You can show or hide the Execution Definition Run Details view by clicking the arrows in the top-right corner. The view is collapsed by default.

The Assigned Test Definitions view provides the following information for the manual test definition run:
Actions: Actions that can be performed during the run. Some of these actions are not available for data-driven tests.
  Add Result File: Add a result file to the test definition.
  New Issue: Create a new issue for the test definition.
  Edit Test Definition: Edit the test definition in the Edit Test Definition dialog box. When you close the dialog box, Test Manager automatically synchronizes the test run.
#: Order of the test definition in the execution definition run.
ID: Identifier of the test definition. This column is hidden by default.
Test Definition: Name of the test definition. Click the name to view the test definition, or to perform an action on the test definition.
Status: Current status of the test. Click the status to change it.
Executed By: Name of the user that started the test run.

The Test Steps view provides the following information for each test step:
Actions: Actions that can be performed during the run.
  Edit: Edit the test step in the Edit Step dialog box. When you close the dialog box, Test Manager automatically synchronizes the test run.
#: Order of the step in the test.
Step Name: Name of the step. Click to access the step in Test Plan > Steps.
Status: Execution status of the step. Click the status to change it.
Result: Result of the step. Click the text box to edit the result.

Tip: You can hide the Test Steps view by clicking the arrows in the top-right corner.

The Test Definition Details view displays the following information for the active manual test definition:


The name of the test definition.
The number of attachments attached to the test definition. Click the link to access the attachments.
The number of issues assigned to the test definition. Click the link to access the issues.
Result files generated for the test definition. Click the links to access the result files.
The description of the test definition.
The Step Details view displays the following information for the selected step:

The name of the step.
A description of the action the step performs.
The expected result of the step.
The result of the step, when the test run is finished.
The attachments to the step.
Tip: You can hide the Step Details view by clicking the arrows in the top-right corner.

If other execution definition runs are started while the Current Run page is open, a note displays, stating that newer runs are available. You can see information on those runs on the Activities page. For automated tests, the Current Run page shows the progress of the execution.

Related Concepts
Manual Test Definitions
Test Definition Execution
Calculating the Test Definition Status

Related Procedures
Executing Manual Tests
Executing Manual Tests in the Current Run Page

Related Reference
Execution Unit Interface
Execution Notifications Page


Run Dialog
The Run dialog box enables you to specify which test definitions you want to execute, based on filter criteria, and which product build the tests should be run against. To open the Run dialog box, select an execution definition or an execution folder and click Run on the toolbar.
All test definitions: Select this option to execute all test definitions.
Test definitions...: Select this option to execute only test definitions that meet certain filter criteria, for example test definitions with Failed status, or test definitions that have not been executed since before a specified build number.
Test definitions that have had issues fixed since their last execution: Select this option to execute only those test definitions that have had issues advanced to the Fixed state since the test definition's last execution.
Set build for execution definition: Select a past build from the Set build for execution definition list box to have the test run against a specific past build. This field defaults to the current build. This option is not available if the execution definition is configured to read the build number from a build information file. If an execution folder contains execution definitions with different product versions assigned to each, the build cannot be selected for the execution of the execution folder.
Run Type: Choose Run as Specified to run all selected tests with their own test type, or choose Run automated tests manually to re-run all selected tests manually.
Go to Activities page: Check this check box to advance to the Activities page after you define test definitions for execution.

Related Concepts
Test Definition Execution

Related Procedures
Using the Manual Testing Client
Executing Test Definitions
Executing Individual Tests
Executing Manual Tests
Executing Manual Tests with the Manual Testing Client
Executing Manual Tests in the Current Run Page


Execute Test Dialog Box


When you execute a manual execution definition in the Manual Testing Client, the Execute Test dialog box displays. This dialog box enables you to execute the test steps that are included in the selected execution definition and to track the results of the test steps. The Execute Test dialog box includes the following tabs: Description, Details, Attachments, Result Files, and Issues.

Description

In normal mode, the Description tab offers a read-only Test Definition Description text box for the selected test definition and the Test Definition Name text box. In Edit mode, these text boxes can be edited.

Test Definition Name: Name of the selected test definition.
Test Definition Description: Description of the selected test definition.
Open Description in Separate Window: Click the button to the right of the Test Definition Description text box to open the most recently saved version of the Test Definition Description in a separate window. The detached window always remains on top to assist in manual testing.

Details

The Details tab offers a read-only description of the selected test definition. The following text boxes and buttons are available on the Details tab of the Execute Test dialog box:

Test Definition Status: Current status of the test definition. In Edit mode this text box can be edited.
Last Status: Status of the test definition in the previous test run.
Planned Time: Estimated time for completion of the test in [hh:mm:ss]. In Edit mode this text box can be edited.
Used Time: Tracks the elapsed time in [hh:mm:ss] since the beginning of test execution. This text box can be edited manually; the timer stops during editing and then continues tracking from the manually adjusted time.
Start: Initiates code analysis. Enter the host names for which you want to run code analysis.
Test Steps: Lists the manual test steps that the selected test definition includes. You can select multiple test steps in the Test Steps window by using the standard Windows keyboard shortcuts. To change the status of the selected test steps, right-click the steps and choose a new status value.
Step Description: Describes the action you must perform in this step.
Expected Result: The expected result or success condition of each test step.
Result: The result of each test step as observed by the tester. Edit this text box after you have completed the step.
Status: Status of each step. Edit this text box after you have completed the step.

Execute Test Toolbars

The Execute Test dialog box includes three toolbars:

Use the Test Steps toolbar to manage your manual test steps during a test execution.
Use the text formatting toolbar to format the Test Definition Description, Step Description, and Expected Result descriptions. Click Parameters to insert parameters into your descriptions.
Use the navigation toolbar to manage your manual test execution and to navigate between test definitions.

The following buttons are included in the Execute Test toolbars:

Edit: Edit the properties of the selected test definition.
Go To Issues: View the issues of the selected manual test definition in Test Manager, or assign new issues to the test definition.
Internal Issue: Add an internal issue to the selected test definition.
Next Test: Advance to the next test step in the manual test execution.
Previous Test: Return to the previous test step in the manual test execution.
Finish Run: Close the Execute Test dialog box when you have completed all test steps in the active manual test definition.
Add Test Step: Add a new test step to the end of the Test Steps list.
Insert Test Step: Insert a new test step above the selected test step in the Test Steps list.
Duplicate Test Step: Create a copy of the selected test step in the Test Steps list.
Delete Test Step: Delete the selected test step from the Test Steps list.
Move Test Step Up: Move the selected test step up one position in the Test Steps list.
Move Test Step Down: Move the selected test step down one position in the Test Steps list.
Bold: Apply bold formatting to the selected text.
Italics: Apply italic formatting to the selected text.
Underline: Apply underline formatting to the selected text.
Align Left: Align the selected text to the left.
Align Center: Center the selected text.
Align Right: Align the selected text to the right.
Justify: Apply justified alignment to the selected text.
Bulleted List: Convert the selected text to a bulleted list.
Indent Left: Apply a left-side indent to the selected text.
Indent Right: Apply a right-side indent to the selected text.
Undo Change: Undo the last action you performed in a description text box.
Redo Change: Redo the last action you performed in a description text box.
Font: Apply a different font type to the selected text.
Font Size: Apply a different font size to the selected text.
Format: Apply a different pre-defined formatting style to the selected text, for example Heading 1 or Heading 2.
Parameters: Insert preconfigured Test Manager custom step properties, also called project parameters, into text descriptions. In normal mode, the dialog box displays the parsed values of the resolved parameters. In Edit mode, the dialog box displays the actual parameters.


Related Concepts
Manual Testing Client
Test Definition Parameters
Test Definitions in the Manual Testing Client

Related Procedures
Using the Manual Testing Client
Editing Test Definitions Within the Manual Testing Client
Adding an Internal Issue with the Manual Testing Client


Test Definition Run Results Dialog


The Test Definition Run Results dialog lists the run details of a test definition. The dialog can be accessed from the following locations within Test Manager:

Test Manager > Test Plan > Runs > Run ID
Test Manager > Execution > Runs > <assigned test definition>
Test Manager > Projects > Activities > <execution definition in Last Executions> > <assigned test definition>

The dialog includes the following tabs:

Details: Shows the details of the test definition run, including its Duration, Execution Path, the Execution Definition Run ID of the execution definition run that included the test definition run, and any Warnings/Errors. This tab also allows you to change the status of the test definition run, which is useful if you need to manually overrule the status of a test run. By default, only the not passed test definitions are shown below Assigned Test Definitions in the Execution Definition Run Results dialog; this enhances performance, because only a part of the total test definitions has to be displayed, and the information presented is of more use to the viewer. Uncheck the Hide Passed check box to show all test definitions. All parent nodes are displayed with the full status information. When a manual status change is performed, the details of the change are reflected in this tab's Status, Status Changed On, Status Changed By, Previous Status, and Status Change Comment fields.
Specific: Only displayed for SilkTest, SilkPerformer, and manual test definitions. This tab includes details that are specific to the selected test definition type. For example, when a SilkTest test definition is selected, this view includes the selected test case, test data, and any warnings that were displayed during the test run.
Files: Lists all files that were generated by this test run, along with their file sizes. The names of SilkTest .rex files act as download links; once downloaded, these files can be viewed directly in a text editor. The upper table lists files that are associated with the test definition, such as result files or manually uploaded files for manual test definitions. The lower table lists files that are associated with the execution definition, for example execution log files or code analysis results. This tab also contains a Download All Files button, which downloads all result files generated by the test definition run as a zipped package.
Messages: Lists all messages that were generated by this test run, along with the severity of the messages. Messages that are associated with an execution definition as a whole, and not with one of the individual test definitions, can be viewed in the Projects unit (Activities tab > Messages tab).
Success Conditions: Only displayed for automated test definitions. This tab shows all the success conditions that were defined for the test during the test planning process (Test Plan unit, Properties tab) and the result values from the execution run. Success conditions are used to determine whether a test is successful or has failed.
Data Driven: Only displayed for data-driven test definitions that use the option of having a single test definition for all data rows of the data set. This tab lists the status of each instance (data row) run of the test definition. Clicking an instance brings up another instance of the Test Definition Run Results dialog with the run details of the selected instance.
Attributes: Any attributes that have been configured for the test definition.
Parameters: Any parameters that have been configured for the test definition.

The following table lists the UI elements that are used to step through the test definition results of an execution run. These elements are only visible when accessing the Test Definition Run Results dialog from an execution definition.
Skip Passed: Determines which test definition run results are displayed when browsing using the < Previous Result and Next Result > buttons. Checking this option displays only test definitions with a status other than Passed.
< Previous Result: Jumps to the result details of the previous test definition in the selected execution definition run.
Next Result >: Jumps to the result details of the next test definition in the selected execution definition run.

Related Concepts
Execution Dependency Configuration

Related Procedures
Viewing Test Execution Details
Configuring Execution Dependencies

Related Reference
Activities Page
Execution Runs Tab
Test Plan Runs tab
Execution Definition Run Results Dialog


Code Analysis Unit Interface


This section contains information about the user interface elements in Test Manager's Code Analysis unit.

In This Section

Code Analysis Details tab
The Details tab displays code-coverage information for selected products, versions, and builds at the product, package, and class levels.

Select Classes for Report Dialog
The Select Classes for Report dialog enables you to select class files to be included as sources in code change impact reports.


Code Analysis Details tab


The Details tab displays code-coverage information for selected products, versions, and builds at the product, package, and class levels.

Product level view displays a list of covered and not-covered packages for specific products and product builds. By clicking a package name in Product view you can drill down to view code-coverage information for the classes that are included in that package. The following attributes are displayed for selected products in Product view, in a single row:
Name: Product name.
Statements: Total statements.
Packages (histogram bar view): Total percentage of packages that are covered, number of covered packages (in green), and number of uncovered packages (in red).
Classes (histogram bar view): Total percentage of classes that are covered, number of covered classes (in green), and number of uncovered classes (in red).
Methods (histogram bar view): Total percentage of methods that are covered, number of covered methods (in green), and number of uncovered methods (in red).

Package level view displays a list of covered and not-covered classes for specific products and product builds. By clicking a class name in Package view you can drill down to view code-coverage information for the methods that are included in that class. The following attributes are displayed for selected packages, across multiple rows:

Name: Package name.
Statements: Total statements.
Classes (histogram bar view): Total percentage of classes that are covered, number of covered classes (in green), and number of uncovered classes (in red).
Methods (histogram bar view): Total percentage of methods that are covered, number of covered methods (in green), and number of uncovered methods (in red).

Class level view displays a list of covered and not-covered methods for specific products and product builds. The following attributes are displayed for selected methods in Class view, across multiple rows:

Name: Method name.
Signature: Method signature.
Statements: Total statements.
Covered: Covered status of the method (True indicates that the method is covered; False indicates that the method is not covered).

Note: When the Details tab includes more elements than can be displayed at once without impacting response time, elements are displayed in increments. Page number links at the bottom of the tab allow you to browse through the elements included on the tab one page at a time. To display all elements as a single list, select the [All] link.

Related Concepts
Code Coverage Analysis

Related Procedures
Viewing Code-Coverage Information for Packages
Analyzing Code Coverage

Related Reference
Execution Deployment tab


Select Classes for Report Dialog


The Select Classes for Report dialog box enables you to select class files to be included as sources in code change impact reports. To open the dialog box, click Create Code Change Impact Report on the toolbar.
Product: Product for which code analysis information is required.
Version: Version of the product for which code analysis information is required.
Filter: Enter criteria for filtering the packages. For example, entering the string published will display only packages that contain the string published in their names.
Packages: Product packages that are to be analyzed.
Classes: Classes from the package that are to be analyzed.
Add: Click this button to add the classes for code-coverage analysis.

Related Concepts
Code Coverage Analysis

Related Procedures
Viewing Code-Coverage Information for Packages
Analyzing Code Coverage

Related Reference
Execution Deployment tab


Issues Unit Interface


This section contains information about the user interface elements in Test Manager's Issues unit.

In This Section

Issues Document View
Document View presents issue statistics for the selected project in tabular format.

Issues tab
The Issues tab lists issues from both internal and external databases that have been configured for the selected project.

Calendar Tool
Describes the features of the calendar tool.


Issues Document View


Click Document View to access Document View. Document View presents issue statistics for the selected project in tabular format.
Date: Date and time when issue details were updated.
Open: Number of issues in the selected project, database, or product that have a status of Open.
Fixed: Number of issues in the selected project, database, or product that have a status of Fixed.
Verified: Number of issues in the selected project, database, or product that have a status of Verified.
Closed: Number of issues in the selected project, database, or product that have a status of Closed.
Deferred: Number of issues in the selected project, database, or product that have a status of Deferred.

Related Concepts
Issue Management
SilkCentral Issue Manager

Related Procedures
Viewing Issue Statistics in Document View
Tracking Issues


Issues tab
The Issues tab lists issues from both internal and external databases that have been configured for the selected project.
Calendar: Allows you to specify a time frame for which issues should be reported. Click the time-frame date link to expand the calendar tool.
Update: Updates the Issues view based on calendar changes.

Related Concepts
Issue Management
SilkCentral Issue Manager

Related Procedures
Viewing Issue Statistics in Details View
Specifying a Calendar Range
Tracking Issues

Related Reference
Calendar Tool


Calendar Tool
The calendar tool provides the following features:

Forward and backward buttons: Move the time frame forward or backward in time at an interval roughly equivalent to the current timeframe. For example, if the current timeframe encompasses about one week, clicking the forward button advances the timeframe one week into the future.
Increase and decrease buttons: One button increases the timeframe by 50 percent so that more test executions are included in the list; the other decreases the timeframe by 50 percent so that fewer test executions are included in the list.
day: Moves the selected timeframe backward or forward one day.
week: Moves the selected timeframe backward or forward one week.
month: Moves the selected timeframe backward or forward one month.
quarter: Moves the selected timeframe backward or forward one quarter.
Last 7 days: Sets the past 7 days as the selected timeframe.
Last 31 days: Sets the past 31 days as the selected timeframe.

Related Procedures
Specifying a Calendar Range

Related Reference
Issues tab


Reports Unit Interface


This section contains information about the user interface elements in Test Manager's Reports unit.

In This Section

Report Properties tab
The Properties tab lists basic properties of each report.

Report Parameters tab
The Parameters tab lists customizable statement elements.

Report Data tab
The Data tab serves as a read-only result preview that shows the result parameters and values of the selected report in tabular format.

Report Chart tab
The Chart tab enables you to define charts and graphs for data analysis.

Report tab
The Report tab is used to display data as a formatted report.

Reports Toolbar Functions
The Reports toolbar provides important commands for report management.


Report Properties tab


The Properties tab lists basic properties of each report, enabling you to edit these properties or the report templates. You can also add subreports to your reports.
Report name: Name of the report (customizable).
Report ID: System-defined identifier of the report.
Description: A description of the report (customizable).
Created On: Date the report was created (default reports are created upon creation of and connection to a database).
Created By: User who created the report (default reports are created by the user Admin).
Changed On: Date the report was last modified.
Changed By: User who last modified the report.
Renderer: Report template that is currently assigned to the report.
Default Tab: Tab you are directed to when you select this report from one of the context-sensitive report lists.
Edit: Click to open the Edit Report dialog box.
Add Subreport: Click to add a subreport to the report.
Report Templates: The available pre-installed report templates are:
  Download Excel Report Template: You receive an MS Excel file with a sheet named DATA that contains the data (for example, in CSV format). This is the only affected sheet in the template, so you can specify information in adjoining sheets (for example, diagrams).
  Download BIRT Report Template: You receive the report data as a generic BIRT report template (empty). The data source is already configured.
  Download as CSV: You receive the report data as a CSV file. Depending on your local settings, you will receive , or ; as the delimiter character. The date is also formatted based on user settings.
  Download as XML: You receive the report data as XML. The advantage of this approach over CSV is that you retain all subreport data.

Accessing data outside of Test Manager: You can call a specific URL that offers the report data, using the following format: http://server/servicesExchange?hid=reportData&userName=<username>&passWord=<password>&reportFilterID=<ID of the report>&type=<csv|xml>


Related Concepts
Report Generation

Related Procedures
Creating New Reports
Editing Report Properties
Creating Reports
Generating Reports
Managing Reports


Report Parameters tab


The Parameters tab lists customizable statement elements. Parameters can be defined any time before a report execution by simply changing them on the Parameters tab. The syntax of a parameter is ${parametername|defaultvalue|guiname}, where defaultvalue and guiname are optional. Parameter names cannot contain whitespace characters.

When a report has parameters associated with it, you can edit the values of the parameters before each report execution. Parameter values are stored in the current user context; that is, edited values are available only to the user who performs the edits. When parameter values are not specified for a given report execution, the default values from the report definition are used.

Using the Usage field, you can specify the usage type of a parameter. Possible values for this field are constant value, start time, and end time. start time and end time are used for reports that query for a specific date range.

When a report has subreports assigned to it, the parameters of those subreports are also shown on the Parameters tab, and the values are stored only within the context of the selected report (that is, the values are only used in conjunction with the current subreport configuration). When creating new reports, parameters are the values that are defined in the Selection criteria area of the Create New Report dialog box. A short example query that uses this parameter syntax follows the related topics below.

Related Concepts
Report Generation

Related Procedures
Editing Report Parameters
Creating New Reports
Creating Reports
Generating Reports
Managing Reports
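The following is a minimal sketch of how the parameter syntax might be used in a report query. It assumes the RTM_V_Requirements view and its ReqID, ReqName, ReqCreated, and ProjectID columns, which appear in the pre-installed sample report shown in the SQL Functions for Custom Reports topic; the parameter name startDate, its default value, and the date comparison itself are illustrative assumptions, not part of any pre-installed report.

-- Hypothetical parameterized report query (sketch).
-- ${startDate|01/01/2009|Start date} defines a parameter named startDate with a
-- default value and the label that is shown on the Parameters tab. Depending on
-- your database, the $DATE function may be needed to convert the string to a date.
SELECT ReqID, ReqName, ReqCreated
FROM RTM_V_Requirements
WHERE ProjectID = ${$PROJECTID}
  AND ReqCreated >= ${startDate|01/01/2009|Start date}
ORDER BY ReqCreated DESC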


Report Data tab


The Data tab serves as a read-only result preview that shows the result parameters and values of the selected report in tabular format. The first row contains meta information about the current report that comes from the report template. Check the Show header check box to display the additional meta properties for the report (Name, Project, Description, and so on) in tabular format in Data view.

Note: Only the first 100 result rows are displayed by default. Alternate view options can be selected from the list box (Show all rows, Top 500 rows, Top 1000 rows).

Related Concepts
Report Generation

Related Procedures
Creating New Reports
Creating Reports
Generating Reports
Managing Reports


Report Chart tab


The Chart tab enables you to define charts and graphs for data analysis. The Chart tab relies on Test Manager's internal reporting engine to create standard charts and graphs from the data retrieved by the selected report query. A number of standard chart types are available: area chart, bar chart, horizontal stacked bar chart, line chart, pie chart, and stacked bar chart. Four display properties are also configurable for each chart type.

Related Concepts
Report Generation

Related Procedures
Displaying Charts
Creating Reports
Generating Reports
Managing Reports


Report tab
The Report tab is used to display data as a formatted report. If you have not yet assigned a template to your report, you can select one on the Report tab. A list box provides a selection of all available report templates. In addition to the many system-installed templates, any custom report templates that have been uploaded from Administration > Reports > Report Templates are also available here (see the SilkCentral Administration Module documentation for details on setting up and uploading custom report templates). Alternatively, you can download an existing template by selecting the Properties tab and then clicking the download link that corresponds to the report format you are working with (Excel, BIRT, CSV, or XML). From there you can customize the template to your needs.

Note: Reports are cached to improve the performance of reporting. Click the Update button to update the report data immediately.

Related Concepts
Report Generation

Related Procedures
Creating New Reports
Creating Reports
Generating Reports
Managing Reports


Reports Toolbar Functions


The Reports toolbar provides important commands for report management.
New Child Folder: Enables creation of new report folders. Click New Child Folder to define a name and optional description for a new folder. The new folder displays as a child of the currently selected node in the Reports tree.
New Child Report: Enables creation of new reports. Click New Child Report to define a new report using the Create New Report dialog box. The new report displays as a child of the currently selected node in the Reports tree.
Edit, Delete: Edit or delete reports.
Cut, Copy, Paste: Cut, copy, and paste reports within the Reports tree.
Move Up, Move Down: Move reports up or down within the Reports tree.
Recently-Viewed Reports: Lists MRU (most recently used) reports by date and time in descending order. Select a report name from the list to advance to that report. Each time a report is accessed (by clicking the Data, Chart, or Report tab), that report is added to the top of the list box. Accessing a report's Properties or Parameters tab does not result in that report being added to the Recently-Viewed Reports list. The list is empty for new users and for users who have not yet generated a report. The number of reports that displays in this list can be configured by your administrator; see the SilkCentral administrator help for details.

Related Concepts
Report Generation

Related Procedures
Accessing MRU (Most Recently Used) Reports
Managing Reports


General Reference
This section contains general reference topics provided with SilkCentral Test Manager.

In This Section

HTML Support for Description Text Boxes
Test Manager description text boxes support HTML-formatted text.

Multi-Select Functionality for Test Plan Elements
The Contents page (Test Plan > Contents) and the Steps page (Test Plan > Steps) in the Test Plan unit support standard Windows Explorer style multi-select functionality for child test plan elements.

SQL Functions for Custom Reports
This table lists all available function placeholders.


HTML Support for Description Text Boxes


Test Manager description text boxes support HTML-formatted text. HTML-formatted descriptions from other applications (for example, Borland CaliberRM) and HTML code that is pasted directly into description text boxes are also rendered by Test Manager. HTML formatting is offered through TinyMCE, a JavaScript-based HTML editor. The TinyMCE editor is available for the description text boxes of the following Test Manager elements:

Requirements
Test containers
Test folders
Test definitions
Test steps (Description and Expected Results)
Execution folders
Execution definitions

Linking to External Web Pages and Images


TinyMCE's Insert/Edit HTML link function enables you to specify external Web pages and images for inclusion in description text boxes. External Web pages are displayed in separate browser windows.

Note: Web pages and images must be hosted on publicly available Web servers. You must specify a complete URL with a protocol when linking to external pages and images.

Linked images can be resized and repositioned using TinyMCE's WYSIWYG functions. However, any changes that are made to images embedded in CaliberRM requirements will not be saved to the CaliberRM server when synchronizing; these changes will be lost. Images that are embedded in CaliberRM requirements can, however, be deleted.

Note: For security reasons, TinyMCE does a cleanup when HTML code is pasted into a description text box, and only allows HTML tags that it supports. This ensures that potentially unsafe scripts cannot be pasted into description text boxes.

Note: When HTML descriptions are included in requirements that are used as the basis for reports, the HTML descriptions are rendered when displayed in the reports.

Available HTML Formatting Functions


TinyMCE offers the following well-known HTML-formatting commands:

Cut, Copy
Paste, Paste as Plain Text
Undo
Redo
Text style, font, and font size settings
Bold
Italics
Underline
Text alignment (left, center, right, and justified)
Numbered list
Bullet list
Indentation (decrease and increase)
Font color
Highlight color
Insert/Edit HTML link, Remove HTML link
Insert/Edit Image
Edit HTML Source
Clear Formatting
Related Procedures
Creating New Reports
Creating Requirements
Attaching a File to a Requirement
Adding Execution Definitions


Multi-Select Functionality for Test Plan Elements


The Contents page (Test Plan > Contents) and the Steps page (Test Plan > Steps) in the Test Plan unit support standard Windows Explorer style multi-select functionality for child test plan elements. The following keyboard functions (shortcuts) are available:

Up: Move selection up. Shift+Up: Extend selection up. Ctrl+Up: Move up.
Down: Move selection down. Shift+Down: Extend selection down. Ctrl+Down: Move down.
Left: Deselect.
Right: Deselect.
Ctrl+A: Select All.
Ctrl+X: Cut.
Ctrl+C: Copy.
Ctrl+V: Paste.
Ctrl+N: New (Steps tab only).
Pos1: Select first item. Shift+Pos1: Select up to first item.
End: Select last item. Shift+End: Select down to last item.
Ins: Insert.
Del: Delete.
F2: Edit.

The following mouse and keyboard combination functions are also available. After using these functions, actions like cut, copy, or paste can be performed on the selected nodes:

CLICK: Select a row and remember it as the current row.
CTRL+CLICK: Toggle the selection status of the clicked row and remember it as the current row.
SHIFT+CLICK: Select the span from the currently-selected row to a newly selected row.
CTRL+SHIFT+CLICK: When a row is already selected, adds the span from the current row to the clicked row to the selection. If the current row is not selected, this function removes the span from the current row to the clicked row from the selection and selects the clicked row.
ALT+CLICK: When a manual test step is clicked, opens the Edit dialog for the step (note that the displayed selection in the background does not change until the dialog is closed).

When pasting test steps, the steps are inserted into the list at the first selected row. When no steps are selected (CTRL+Click the last selected row to do this), steps are pasted to the end of the list.

Note: Containers cannot be copied or pasted.


Related Concepts
Test Plan Tree
Test Plan Management

Related Procedures
Copying, Pasting, and Deleting Test Plan Elements
Managing Test Plans

Related Reference
Test Plan Contents Tab
Test Plan Steps Page


SQL Functions for Custom Reports


To assist in writing advanced queries, placeholders are available for each function. Function placeholders are replaced with SQL code upon execution. Functions are used like parameters, but their names have a $ (dollar symbol) as a prefix. Unlike parameters, placeholders are defined report elements that cannot be customized per execution. The following table lists all available function placeholders:
$TODAY: Gives the current system date (on the database server). You can also write $TODAY-1 (for yesterday) or $TODAY-7 (for a week ago). Example: CreatedAt > ${$TODAY}
$DATE(column), $DATE('string'): Returns the date (does not include the time); converts the given string to a database date. Example: CreatedAt > ${$DATE('01/10/2005')}
$DAYS[p1;p2]: Calculates the difference in days between the two given parameters. The two parameters can be a column within the table/view or $TODAY. Example: ${$DAYS[CreatedAt;$TODAY]} > 7 (returns the rows created within the last week)
$WEEK(param): Returns the week number of the given parameter, which can be $TODAY or a column.
$MONTH(param): Returns the month of the year as a number of the given parameter, which can be $TODAY or a column.
$YEAR(param): Returns the year as a number of the given parameter, which can be $TODAY or a column.
$USERID: The ID of the currently logged in user.
$USERNAME: The name of the currently logged in user.
$PROJECTID: The ID of the currently selected project.
$PROJECTNAME: The name of the currently selected project.
$REPORTNAME: The name of the currently selected report.
$REPORTID: The ID of the currently selected report.

Sample Custom Report

Below is the code of the pre-installed Requirement with Child Requirements report. With this report, a selected requirement is shown with its requirement ID, and full details regarding the requirement's child requirements are displayed. Although not a custom report, this report is a helpful example because it makes use of the $PROJECTID function. It also includes two parameters, reqID (requirement ID) and reqProp_Obsolete_0 (show obsolete requirements).

SELECT r.ReqID, r.ReqCreated, r.ReqName, r.TreeOrder
FROM RTM_V_Requirements r
INNER JOIN TM_ReqTreePaths rtp ON (rtp.ReqNodeID_pk_fk = r.ReqID)
WHERE rtp.ParentNodeID_pk_fk=${reqID|22322|Requirement ID}
  AND r.ProjectID = ${$PROJECTID}
  AND r.MarkedAsObsolete=${reqProp_Obsolete_0|0|Show obsolete Requirements}
ORDER BY r.TreeOrder ASC
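As a further illustration, the following sketch combines the $TODAY and $PROJECTID placeholders to list the requirements of the current project that were created within the last week. It reuses only the RTM_V_Requirements view and the columns shown in the sample above; the filter itself is an assumption about how such a query might be written, not a pre-installed report.

-- Sketch: requirements of the current project created within the last seven days.
-- ${$TODAY-7} resolves to the date one week ago, as described in the table above.
SELECT ReqID, ReqName, ReqCreated
FROM RTM_V_Requirements
WHERE ProjectID = ${$PROJECTID}
  AND ReqCreated > ${$TODAY-7}
ORDER BY ReqCreated DESC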


Related Concepts
Report Generation

Related Procedures
Creating New Reports
Creating Reports
Generating Reports
Managing Reports


APIs
Refer to the Test Manager API Help for full details regarding Test Manager's APIs. Refer to the Test Manager API Specification for full details regarding Test Manager's interfaces.


Database Schemas
Refer to the Test Manager Database Model for full details regarding Test Manager's database schemas.


Index
activities page columns, displaying and hiding, 575 columns, reordering, 581 columns, resizing, 582 default settings, 583 filtering test runs, 577 filters, removing, 580 sorting test runs, 584 test runs, grouping, 579 agent computers, 37 all related issues reports, requirements, 160 APIs, 738 application server, 37 architecture, 36 overview, 36 attachments test plans, 246 432 433 deleting, 431 editing descriptions, 434 viewing, 435 attributes creating custom, 211 253 deleting custom, 254 editing custom, 255 automated tests converting manual tests, 100 executing, 234 531 average page time reports, performance trend, 178 average transaction busy time reports, performance trend, 179 Borland software quality, 34 browser settings, 41 builds information files, 570 calendar range, 558 CaliberRM baseline handling, 82 test definitions, 83 change notification, 57 disabling, 250 enabling, 214 251 changes viewing recent, 463 chart server, 37 charts displaying, 207 621 740

printing, 625 removing, 626 code analysis overview, 189 enabling, 190 execution definitions, 634 latest builds and versions, 192 packages, viewing, 636 reports, 208 635 results compilation, 193 code coverage trend reports, 175 code-change impact reports, 186 Concurrent Version System CVS, 63 configuring SilkTest plan test properties, 393 .Net Explorer test properties, 394 JUnit test properties, 395 manual test properties, 397 NUnit test properties, 398 SilkPerformer test properties, 399 SilkTest test properties, 400 Windows scripting test properties, 401 coverage modes, 69 378 custom measure reports, performance trend, 181 custom reports SQL functions, 151 736 data sources data-driven tests, 60 configuring Excel or CSV, 263 configuring JDBC, 261 deleting, 265 downloading Excel file from, 266 synchronizing, 267 uploading Excel files to, 268 data-driven test types, 91 data-driven tests downloading CSV data, 439 properties, 440 database server, 37 emails change notification, 75 execution dependencies, 119 execution definition run comparison reports, 168 execution definition run errors

reports, 172 execution definitions, 120 adding, 224 533 assigning test definitions, filter, 227 476 assigning test definitions, grid view, 226 473 assigning test definitions, manually, 225 475 copying, 534 data-driven, 547 deleting, 535 dependencies, deleting, 488 dependencies, editing, 489 dependent, adding, 230 486 details, viewing, 235 469 dynamic hardware provisioning, 120 editing, 536 locating test definitions, 471 removing test definition assignments, 472 results, deleting, 468 schedules, 124 228 493 497 schedules, global, 496 test runs, deleting, 467 testers, removing, 479 tree view, 545 updating, 459 upgrading from previous versions, 121 viewing assigned, 462 execution runs status, changing, 466 execution server, 37 external ID test packages, 97 external requirements management tools integration, 78 external tools manual tests, 101 filters global, 53 applying, 628 containers, folders, 456 creating, 220 244 630 creating advanced, 221 629 creating global, 212 270 deleting, 631 deleting global, 272 editing, 632 editing global, 273 folders copying, 562 adding, 569 child folders, pasting as, 567 cutting, 563 deleting, 564 editing, 565 pasting, 566 741

sorting, 568 front-end server, 36 glossary Test Manager, 46 grid view creating execution definitions, 445 474 columns, displaying and hiding, 446 columns, reordering, 452 columns, resizing, 453 455 default settings, 454 filters, removing, 451 test definitions, filtering, 447 test definitions, grouping, 449 test definitions, linking to, 450 help system typographical conventions, 27 welcome, 29 history test plans, 458 IBM Rational ClearQuest, 61 installing & licensing Test Manager, 30 Issue Manager SilkCentral Issue Manager, 31 integration, 142 issue tracking profiles, 61 adding Bugzilla, 289 adding IBM Rational ClearQuest, 294 adding SilkCentral Issue Manager, 278 adding StarTeam, 284 Bugzilla, 288 deleting profiles, 276 282 287 292 297 editing Bugzilla, 291 editing IBM Rational ClearQuest, 296 editing SilkCentral Issue Manager, 281 editing StarTeam, 286 IBM Rational ClearQuest, 293 mapping issue states, 280 285 290 295 SilkCentral Issue Manager, 277 StarTeam, 283 issues statistics, details view, 551 activities tab, 576 creating, 555 deleting, 557 external, 554 statistics, document view, 552 synchronizing internal and external, 559 issues per component reports, 185 JUnit tests editing, 420 keywords

reserved, 120 assigning, 232 481 creating, 483 folder execution, 121 removing, 484 virtual execution servers, 120 last executions deleting, 574 licensing access, 39 manual execution definitions, 128 manual testing step properties, custom, 56 Manual Testing Client, 135 attachments, 508 509 510 code analysis, 194 code analysis, enabling, 517 637 connection parameters, 503 executing tests, 518 execution definitions, 513 exporting execution packages, 520 installing, 521 internal issues, adding, 511 launching, 521 package build numbers, 514 package status, changing, 512 screengrabs, 507 settings, 504 test definitions, 524 test definitions, editing, 515 test results, uploading, 523 uninstalling, 521 upload preferences, 505 working offline, 525 manual tests reports, 165 aborting, 526 adding data source values, 437 adding testers, 480 automated tests, 442 custom step properties, creating, 215 257 custom step properties, deleting, 258 custom step properties, editing, 259 executing, 527 steps, editing, 443 manual tests, current run executing, 529 method coverage comparison reports, 176 Microsoft Office Import Tool, 72 Microsoft Visual SourceSafe

MSVSS, 63 NUnit tests editing, 421 obsolete requirements, 349 overall page time reports, performance trend, 183 overall transaction busy time reports, performance trend, 184 project management build information, 145 settings, 210 331 project overview reports, 154 projects CaliberRM, 367 selecting, 572 recent changes filters, 86 reports creating new, 149 bookmarking, 150 context-sensitive, 153 context-sensitive, execution, 601 context-sensitive, execution definition, 597 context-sensitive, requirements, 598 606 context-sensitive, test definitions, 599 context-sensitive, test plans, 610 creating, 199 587 603 607 611 customizing BIRT templates, 204 591 linking to queried data, 150 most recently viewed, 622 parameters, 202 623 PDF, viewing, 617 properties, 201 624 saving, 615 SQL queries, 203 589 605 609 613 subreports, adding, 205 619 subreports, deleting, 620 templates, downloading, 592 templates, removing, 614 templates, uploading, 616 viewing, 206 618 reports, document requirements, 159 reports, progress requirements, 158 test plans, 164 reports, status requirements, 157 test plans, 163 requirement history, 74 742

requirements properties, custom, 55 assign test definitions, 342 attaching files to, 219 335 attaching links to, 336 attachments, 68 attachments, deleting from, 337 attachments, editing descriptions, 338 attachments, viewing, 339 CaliberRM, 361 collapsing or expanding tree view, 377 creating, 217 341 creating child, 344 editing, 345 finding properties, 346 history, 354 IBM Rational RequisitePro, 362 integration, 58 integration, configuring custom properties, 356 integration, deleting custom properties, 357 integration, deleting property mapping, 371 integration, disabling, 372 integration, editing custom properties, 358 integration, editing external properties, 369 integration, editing property mapping, 373 integration, removing, 374 integration, synchronizing across tools, 375 integration, viewing external properties, 370 marking as obsolete, 349 removing test definition assignments, 350 replacing properties, 351 synchronizing, 79 synchronizing based on schedules, 375 375 Telelogic DOORS, 364 test coverage status, 70 test plans, 222 347 tree view, 67 types, 218 340 run comparison reports, 167 schedules exclusions, 124 definite runs, 124 definite runs, adding, 491 definite runs, deleting, 494 definite runs, editing, 495 exclusions, adding, 492 exclusions, deleting, 498 exclusions, editing, 499 schemas database, 739 Serena Version Manager PVCS, 63 setup and cleanup

test definitions, 125 setup and cleanup definitions execution definitions, 229 546 shortcuts multi-select, 734 SilkCentral Issue Manager, 61 SilkPerformer projects, 32 Performance Explorer, 33 SilkPerformer tests editing, 419 attended tests, 542 projects, downloading, 540 projects, opening, 543 properties, editing, 541 results, uploading, 544 test results, 538 539 SilkTest test definitions, 105 agent under test, 141 automated test execution, 140 data-driven tests, 139 logs, 137 test plans, 410 time-out settings, 138 SilkTest tests editing, 418 AUT host, adding, 478 source control profiles overview, 63 adding CVS, 310 adding MKS, 327 adding MSVSS, 314 adding PVCS, 305 adding StarTeam, 301 adding SVN, 319 adding UNC, 323 CVS, 309 deleting, 299 303 308 312 317 321 325 330 editing CVS, 311 editing MSVSS, 316 editing PVCS, 307 editing StarTeam, 302 editing SVN profiles, 320 329 editing UNC profiles, 324 MSVSS, 313 PVCS, 304 StarTeam, 300 SVN, 318 326 UNC, 322 StarTeam, 61 status calculation

743

test definitions, 121 Subversion SVN, 64 success conditions test plans, 93 editing, 422 test definition run comparison reports, 170 test definitions attributes, 54 assigning attributes, 242 386 assigning to requirements, 343 calculating status, 126 creating, 237 405 data-driven tests, 241 438 deleting attributes, 387 editing, 239 407 editing attributes, 388 executing trial runs, 408 find/replace properties, 424 Manual Testing Client, 102 parameters, 94 parameters, adding, 243 391 parameters, clearing, 392 parameters, creating custom, 402 parameters, editing, 390 requirements, 348 sorting on assigned tab, 353 Test definitions, 398 Test Manager logging in, 43 logging out, 44 Test Manager 8.0 reports, 155 test packages test plans, 96 creating, 240 404 test plans generating, 73 assigning requirements to test definitions, 245 381 containers, adding, 413 containers, adding links, 412 editing elements, 416 folders, adding, 415 locating assigned requirements, 382 management, 88 removing requirement assignments, 383 set default nodes, 429 sorting requirements, 384 test containers, editing, 427 test folders, modifying, 428 tree view, 89 tree view, expanding, 457 744

tour user interface, Test Manager, 24 user interface, Manual Testing Client, 129 Universal Naming Convention UNC, 64 updates build information, 146 Upload Manager issues, 108 using, 460 virtual configurations VMware Lab Manager, 118 What's New Test Manager, 20 Windows Script Host test types, 110 Java script sample, 113 log information, 112 parameter usage, 111 returning success information, 111 sample result file, 113 storing information in result file, 112 structure of output.xml, 112 supported script languages, 110 switches, 111 test properties, 110 VB script sample, 114 Windows Scripting Host tests editing, 423
