Borland Software Corporation
8310 N. Capital of Texas Hwy, Bldg 2, Suite 100
Austin, TX 78731 USA
www.borland.com

Borland Software Corporation may have patents and/or pending patent applications covering subject matter in this document. Please refer to the product CD or the About dialog box for the list of applicable patents. The furnishing of this document does not give you any license to these patents.

Copyright 2009 Borland Software Corporation and/or its subsidiaries. All Borland brand and product names are trademarks or registered trademarks of Borland Software Corporation in the United States and other countries. All other marks are the property of their respective owners.

June 2009
Getting Started
    Concepts
        What's New in Borland SilkCentral Test Manager 2009
        Tour of the UI
        Help on Help
        Introduction to SilkCentral Test Manager
            Welcome to SilkCentral Test Manager
            Installing and Licensing Test Manager
            SilkCentral Issue Manager
            Working with SilkPerformer Projects
            Working with Silk Performance Explorer
            Software Quality Optimization
            SilkCentral Administration Module
                SilkCentral Architecture
                Access and Licensing
    Procedures
        Configuring Browser Settings
        Logging In and Out of Test Manager
            Logging into Test Manager
            Logging out of Test Manager
    Quick Start Tasks
    Glossary
Concepts
    Successful Test Management
        Settings Configuration
            Global Filters
            Attributes
            Custom Requirement Properties
            Custom Step Properties
            Change Notification
            Requirements Integration Configuration
            Data Sources for Data-Driven Tests
            Issue Tracking Profiles
            Source Control Profiles
        Requirements Management
            Requirements Tree
            Attachments
            Full Coverage and Direct Coverage Modes
            Test Coverage Status
            Requirements Reports
            Microsoft Office Requirement-Import Tool
            Test Plan Generation
            Requirement History
            Change-Notification Emails
            External Requirements Management Tools
                External Requirements Management Tools
                Synchronizing Requirements
                CaliberRM Integration with Test Manager
                    Baseline Support for CaliberRM Integration
                    Test Definition Assignment Handling
        Filtering
            Filters
            Recent Changes
        Test Plan Management
            Test Plan Management
            Test Plan Tree
            Test Plan Reports
            Data-Driven Tests
            Success Conditions
            Test Definition Parameters
            Test Packages
            Usage of External IDs
            Manual Tests
                Converting Manual Tests to Automated Tests
                Using External Tools to Create Manual Tests
                Test Definitions in the Manual Testing Client
            SilkTest Test Plans
                SilkTest Test Definitions
            Test Definitions
                Upload Manager
                Windows Script Host Tests
        Test Definition Execution
            VMware Lab Manager Integration
                VMware Lab Manager Virtual Configurations
            Execution Dependency Configuration
            Execution Definitions
            Execution Definition Run Results Dialog
            Execution Definition Schedules
            Setup and Cleanup Test Definitions
            Calculating the Test Definition Status
            Manual Test Definitions
                Manual Test Execution
                Tour of the Manual Testing Client UI
                Manual Testing Client
            SilkTest Tests
                SilkTest Logs
                SilkTest Time-out Settings
                Automated Execution of Data-Driven SilkTest Testcases
                Automated Execution of SilkTest Test Definitions
                Specifying Agent Under Test (AUT)
        Issue Management
        Project Management
            Build Information
            Build Information Updates
        Report Generation
            New Report Creation
                New Reports
                SQL Functions for Custom Reports
            Context-Sensitive Reports
            Project Overview Report
            Test Manager 8.0 Reports
            Requirements Reports
                Status Reports
                Progress Reports
                Document Reports
                All Related Issues Report
            Test Plan Reports
                Status Reports
                Progress Reports
                Manual Test Reports
            Execution Reports
                Run Comparison Reports
                Execution Definition Run Comparison Reports
                Test Definition Run Comparison Report
                Execution Definition Run Errors Report
            Code Coverage Reports
                Code Coverage Trend Report
                Method Coverage Comparison Report
            Performance Trend Reports
                Average Page-Time Trend Report
                Average Transaction Busy-Time Trend Report
                Custom Measure Trend Report
                Overall Page-Time Trend Report
                Overall Transaction Busy-Time Trend Report
            Issues Per Component Report
            Code-Change Impact Reports
        Code Coverage Analysis
            Test Manager Code Analysis
            Enabling Code Analysis for SilkCentral Test Manager
            Latest Builds and Build Versions
            Results Compilation
Procedures
    Quick Start Tasks
        Analyzing Test Results - Quick Start Task
            Creating New Reports
            Editing Report Properties
            Editing Report Parameters
            Writing Advanced Queries with SQL
            Customizing BIRT Report Templates
            Adding Subreports
            Viewing Reports
            Displaying Charts
            Generating Code-Change Impact Reports
        Configuring Projects - Quick Start Task
            Configuring Project Settings
            Creating Custom Attributes
            Creating Global Filters
            Enabling Change Notification
            Creating Custom Step Properties
        Managing Requirements - Quick Start Task
            Creating Requirements
            Configuring Requirement Types
            Attaching a File to a Requirement
            Creating Filters
            Creating Advanced Filters
            Generating Test Plans from Requirements View
        Managing Test Executions - Quick Start Task
            Adding Execution Definitions
            Manually Assigning Test Definitions to Execution Definitions
            Assigning Test Definitions from Grid View to Execution Definitions
            Using a Filter to Assign Test Definitions to Execution Definitions
            Creating a Custom Schedule for an Execution Definition
            Configuring Setup and Cleanup Executions
            Adding Dependent Execution Definitions
            Assigning Keywords to Execution Definitions
            Executing Individual Tests
            Viewing Test Execution Details
        Managing Test Plans - Quick Start Task
            Creating Test Definitions
            Editing Test Definitions
            Creating a Test Package
            Creating Data-Driven Test Definitions
            Assigning Attributes to Test Definitions
            Adding Predefined Parameters to Test Definitions
            Creating Filters
            Assigning Requirements to Test Definitions
            Attaching Files to Test Plan Elements
    Managing a Successful Test
        Configuring Test Manager Settings
            Configuring Change Notification
                Disabling Change Notification
                Enabling Change Notification
            Configuring Custom Attributes
                Creating Custom Attributes
                Deleting Custom Attributes
                Editing Custom Attributes
            Configuring Custom Step Properties
                Creating Custom Step Properties
                Deleting Custom Step Properties
                Editing Custom Step Properties
            Configuring Data Sources for Data-Driven Tests
                Configuring JDBC Data Sources
                Configuring Microsoft Excel or CSV Data Sources
                Deleting Data Sources
                Downloading Excel Files from a Data Source
                Synchronizing Data Sources
                Uploading Updated Excel Files to a Data Source
            Configuring Global Filters
                Creating Global Filters
                Deleting Global Filters
                Editing Global Filters
            Configuring Issue Tracking Profiles
                Deleting Issue Tracking Profiles
                Managing SilkCentral Issue Manager Issue Tracking Profiles
                    Adding SilkCentral Issue Manager Issue Tracking Profiles
                    Mapping Issue States
                    Editing SilkCentral Issue Manager Issue Tracking Profiles
                    Deleting Issue Tracking Profiles
                Managing Borland StarTeam Issue Tracking Profiles
                    Adding Borland StarTeam Issue Tracking Profiles
                    Mapping Issue States
                    Editing Borland StarTeam Issue Tracking Profiles
                    Deleting Issue Tracking Profiles
                Managing Bugzilla Issue Tracking Profiles
                    Adding Bugzilla Issue Tracking Profiles
                    Mapping Issue States
                    Editing Bugzilla Issue Tracking Profiles
                    Deleting Issue Tracking Profiles
                Managing IBM Rational ClearQuest Issue Tracking Profiles
                    Adding IBM Rational ClearQuest Issue Tracking Profiles
                    Mapping Issue States
                    Editing IBM Rational ClearQuest Issue Tracking Profiles
                    Deleting Issue Tracking Profiles
            Configuring Source Control Profiles
                Deleting Source Control Profiles
                Managing Borland StarTeam Source Control Profiles
                    Adding StarTeam Source Control Profiles
                    Editing StarTeam Source Control Profiles
                    Deleting Source Control Profiles
                Managing Serena Version Manager (PVCS) Profiles
                    Adding PVCS Source Control Profiles
                    Editing PVCS Source Control Profiles
                    Deleting Source Control Profiles
                Managing CVS Profiles
                    Adding CVS Source Control Profiles
                    Editing CVS Source Control Profiles
                    Deleting Source Control Profiles
                Managing Microsoft Visual SourceSafe (MSVSS) Profiles
                    Adding MSVSS Source Control Profiles
                    Editing MSVSS Source Control Profiles
                    Deleting Source Control Profiles
                Managing Subversion Profiles
                    Adding Subversion Source Control Profiles
                    Editing Subversion Source Control Profiles
                    Deleting Source Control Profiles
                Managing UNC Profiles
                    Adding UNC Source Control Profiles
                    Editing UNC Source Control Profiles
                    Deleting Source Control Profiles
                Managing VFS Profiles
                    Adding VFS Source Control Profiles
                    Editing VFS Source Control Profiles
                    Deleting Source Control Profiles
            Configuring Project Settings
        Managing Requirements
            Creating Requirements
        Managing Requirement Attachments
            Attaching a File to a Requirement
            Attaching a Link to a Requirement
            Deleting a Requirement Attachment
            Editing a Requirement Attachment Description
            Viewing a Requirement Attachment
        Configuring Requirement Types
        Creating Requirements
        Assigning Test Definitions from Grid View to Requirements
        Assigning Test Definitions to Requirements Manually
        Creating Child Requirements
        Editing Requirements
        Finding Requirement Properties
        Generating Test Plans from Requirements View
        Locating Assigned Test Definitions in the Test Plan Tree
        Marking Requirements as Obsolete
        Removing Test Definition Assignments
        Replacing Requirement Properties
        Sorting the Assigned Test Definitions Tab
        Tracking the History of a Requirement
    Customizing Requirement Properties
        Configuring Custom Requirement Properties
        Deleting Custom Requirement Properties
        Editing Custom Requirement Properties
    Integrating External RM Tools
        Enabling External Requirements Management Integration
            Enabling Integration with Borland CaliberRM
            Enabling Integration with IBM Rational RequisitePro
            Enabling Integration with Telelogic DOORS
        Working with CaliberRM
            Copying CaliberRM-Integrated Projects
        Working with External Properties
            Editing External Properties
            Viewing External Properties
        Deleting Property-Mapping Value Pairs
        Disabling Requirements-Management Integration
        Editing Property Mapping
        Removing Requirements-Management Integration
        Synchronizing Requirements Across Tools
    Collapsing or Expanding the Requirements Tree
    Switching Between Full and Direct Coverage Modes
Managing Test Plans
    Associating Requirements with Test Definitions
        Assigning Requirements to Test Definitions
        Locating Assigned Requirements
        Removing Requirement Assignments
        Sorting Requirements
    Configuring Test Definition Attributes
        Assigning Attributes to Test Definitions
        Deleting Attributes from Test Definitions
        Editing Test Definition Attributes
    Configuring Test Definition Parameters
        Editing Predefined Parameters
        Adding Predefined Parameters to Test Definitions
        Clearing Predefined Parameter Assignments
        Configuring SilkTest Plan Properties
        Configuring .NET Explorer Test Properties
        Configuring JUnit Test Properties
        Configuring Manual Test Properties
        Configuring NUnit Test Properties
        Configuring SilkPerformer Test Properties
        Configuring SilkTest Test Properties
        Configuring Windows Scripting Test Properties
        Creating Custom Parameters
    Creating Test Definitions
        Creating a Test Package
        Creating Test Definitions
        Editing Test Definitions
        Executing a Trial Run of a Test Definition
    Creating Test Plans
        Importing SilkTest Test Plans
    Editing Test Plan Elements
        Adding Links to Containers
        Adding Test Containers
        Adding Test Folders
        Copying, Pasting, and Deleting Test Plan Elements
        Editing SilkTest Tests
        Editing SilkPerformer Tests
        Editing JUnit Tests
        Editing NUnit Tests
        Editing Success Conditions
        Editing Windows Scripting Host Tests
        Finding and Replacing Test Definition Properties
        Modifying Test Containers
        Modifying Test Folders
        Setting a Test Plan Node as Integration Default for External Agile Planning Tools
    Working with Attachments
        Deleting Attachments from Test Plan Elements
        Attaching Files to Test Plan Elements
        Attaching Links to Test Plan Elements
        Editing Attachment Descriptions
        Viewing Test Plan Attachments
    Working with Data-Driven Tests
        Adding a Data Source Value to a Manual Test Step
        Creating Data-Driven Test Definitions
        Downloading CSV Data From a Data Source
        Editing Data-Driven Properties
    Working with Manual Tests
        Converting Manual Test Definitions to Automated Tests
        Editing Manual Test Steps From Within Test Manager
    Working with Test Definitions in Grid View
        Creating an Execution Definition in Grid View
        Displaying/Hiding Columns in Grid View
        Filtering Test Definitions in Grid View
        Grouping Test Definitions in Grid View
        Linking to Test Definitions from Grid View
        Removing Grid View Filters
        Reordering Columns in Grid View
        Resizing Columns in Grid View
        Restoring Default Grid View Settings
        Sorting Test Definitions in Grid View
    Creating a Filter for a Folder or Container
    Expanding/Collapsing the Test Plan Tree
    Tracking Test Plan History
    Updating Execution Definitions
    Using Upload Manager
    Viewing Assigned Executions
    Viewing Recent Changes
Executing Test Definitions
    Analyzing Test Runs
        Changing the Status of a Test Execution Run
        Deleting Individual Test Run Results
        Deleting the Results of an Execution Definition
        Viewing Test Execution Details
    Assigning Test Definitions to Execution Definitions
        Locating Test Definitions Assigned to Execution Definitions
        Removing Test Definition Assignments
        Assigning Test Definitions from Grid View to Execution Definitions
        Creating an Execution Definition in Grid View
        Manually Assigning Test Definitions to Execution Definitions
        Using a Filter to Assign Test Definitions to Execution Definitions
    Configuring Deployment Environments
        Adding a SilkTest AUT Host
        Removing a Tester Assignment from an Execution Definition
        Adding Manual Testers
        Assigning Keywords to Execution Definitions
        Creating New Keywords
        Removing Keywords from Execution Definitions
    Configuring Execution Dependencies
        Adding Dependent Execution Definitions
        Deleting a Dependency
        Editing a Dependency
    Defining Execution Definition Schedules
        Adding Definite Runs
        Adding Exclusions
        Creating a Custom Schedule for an Execution Definition
        Deleting Definite Runs
        Editing Definite Runs
        Specifying Global Schedules for Execution Definitions
        Specifying No Schedule for Execution Definitions
        Deleting Exclusions
        Editing Exclusions
    Executing Manual Tests
        Using the Manual Testing Client
            Configuring the Manual Testing Client
                Configuring Connection Parameters
                Configuring Other Settings
                Configuring Package Upload Preferences
            Managing Attachments with the Manual Testing Client
                Pasting Screen Captures
                Uploading Attachments to the Manual Testing Client
                Viewing Attached Images Within the Manual Testing Client
                Viewing Attachments Within the Manual Testing Client
            Adding an Internal Issue with the Manual Testing Client
            Changing a Test Definition's Status
            Downloading Execution Definition Packages
            Editing Package Build Numbers
            Editing Test Definitions Within the Manual Testing Client
            Enabling Code Analysis Within the Manual Testing Client
            Executing Manual Tests with the Manual Testing Client
            Exporting and Importing Execution Packages
            Installing SilkCentral Manual Testing Client
            Uploading Test Results to Test Manager
            Viewing and Editing Test Definitions in Test Manager
            Working Offline with the Manual Testing Client
        Aborting Manual Test Executions
        Executing Manual Tests
        Executing Manual Tests in the Current Run Page
    Running Automated Tests
        Executing Individual Tests
    Working with Execution Definitions
        Adding Execution Definitions
        Copying Execution Definitions
        Deleting Execution Definitions
        Editing Execution Definitions
    Working with SilkPerformer Projects
        Analyzing SilkPerformer Test Results
        Downloading SilkPerformer Test Result Packages
        Downloading SilkPerformer Projects
        Editing SilkPerformer Test Properties
        Executing Attended SilkPerformer Tests
        Opening SilkPerformer Projects
        Uploading SilkPerformer Test Results
    Collapsing or Expanding the Execution Tree
    Configuring Setup and Cleanup Executions
    Creating Data-Driven Execution Definitions
Managing Issues
    Tracking Issues
        Viewing Issue Statistics in Details View
        Viewing Issue Statistics in Document View
    Working with Issues
        Assigning External Issues
        Creating New Issues
        Deleting Issues
        Specifying a Calendar Range
        Synchronizing Internal/External Issue States
Managing Projects
    Managing Folders
        Copying Folders
        Cutting Folders
        Deleting Folders
        Editing Folders
        Pasting Folders
        Pasting Folders as Child Folders
        Sorting Folders
        Adding Folders
    Creating Build Information Files
    Selecting Projects
Managing Activities
    Deleting Last Execution Runs
    Displaying/Hiding Columns on the Activities Page
    Entering Issues From the Activities Tab
    Filtering Test Runs on the Activities Page
    Grouping Test Runs on the Activities Page
    Removing Activities Filters
    Reordering Columns on the Activities Page
    Resizing Columns on the Activities Page
    Restoring Default Activities Page View Settings
    Sorting Test Runs on the Activities Page
Managing Reports
    Creating Reports
        Creating New Reports
        Writing Advanced Queries with SQL
    Customizing Reports with BIRT
        Customizing BIRT Report Templates
        Downloading Report Templates
    Generating Reports
        Using Context-Sensitive Reports
            Accessing Context-Sensitive Reports
                Accessing Context-Sensitive Execution Reports
                Accessing Context-Sensitive Requirements Reports
                Accessing Context-Sensitive Test-Definition Reports
            Enabling Context-Sensitive Reports
                Enabling Context-Sensitive Execution Reports
                    Creating New Reports
                    Writing Advanced Queries with SQL
                Enabling Context-Sensitive Requirements Reports
                    Creating New Reports
                    Writing Advanced Queries with SQL
                Enabling Context-Sensitive Test-Plan Reports
                    Creating New Reports
                    Writing Advanced Queries with SQL
        Removing Report Templates
        Saving Reports
        Uploading Report Templates
        Viewing a Report as a PDF
        Viewing Reports
    Adding Subreports
    Deleting Subreports
    Displaying Charts
    Accessing MRU (Most Recently Used) Reports
    Editing Report Parameters
    Editing Report Properties
    Printing Charts
    Removing Charts
Working with Filters
    Applying Filters
    Creating Advanced Filters
    Creating Filters
    Deleting Filters
    Editing Filters
Analyzing Code Coverage
    Enabling Code Analysis for Execution Definitions
Generating Code-Change Impact Reports ................................................................................ Viewing Code-Coverage Information for Packages ................................................................... Enabling Code Analysis Within the Manual Testing Client ........................................................
Reference User Interface Reference ......................................................................................................................... Projects Unit Interface ...................................................................................................................... Projects tab ................................................................................................................................ Overview tab .............................................................................................................................. Activities Page ........................................................................................................................... Cross-Project Activities Page .................................................................................................... Test Definition Run Results Dialog ............................................................................................ Settings Unit Interface ...................................................................................................................... Project Settings tab ................................................................................................................... Filters tab ................................................................................................................................... Attributes tab ............................................................................................................................. Requirement Properties Page ................................................................................................... Step Properties Page ................................................................................................................ Notifications Page ...................................................................................................................... 
Integrations Configuration tab ................................................................................................... Data Sources Configuration Page ............................................................................................. Issue Tracking Profiles Page ..................................................................................................... Source Control Profiles Page .................................................................................................... Requirements Unit Interface ............................................................................................................. Requirements Document View .................................................................................................. Requirements Toolbar Functions .............................................................................................. Requirement Properties tab ....................................................................................................... Requirement Attachments tab ................................................................................................... Assigned Test Definitions tab .................................................................................................... Requirement Coverage tab ....................................................................................................... Requirement History tab ............................................................................................................ Test Plan Unit Interface .................................................................................................................... Test Plan Document View ......................................................................................................... Test Plan Grid View ................................................................................................................... 
Test Plan Properties tab ............................................................................................................ Test Plan Steps Page ................................................................................................................ Test Plan Contents Tab ............................................................................................................. Test Plan Attributes tab ............................................................................................................. Test Plan Parameters tab .......................................................................................................... Test Plan Assigned Requirements tab ...................................................................................... Test Plan Attachments tab ........................................................................................................ Test Plan Assigned Executions tab ........................................................................................... Test Plan Runs tab .................................................................................................................... Test Plan Issues Page ............................................................................................................... Test Plan History tab ................................................................................................................. Test Plan Data Set tab .............................................................................................................. Test Plan Toolbar Functions ...................................................................................................... Test Definition Run Results Dialog ............................................................................................ Execution Unit Interface .................................................................................................................... 
Execution Document View ......................................................................................................... Execution Properties tab ........................................................................................................... Execution Assigned Test Definitions Tab .................................................................................. Execution Setup/Cleanup tab .................................................................................................... Execution Schedule tab ............................................................................................................. Execution Deployment tab ......................................................................................................... Execution Dependencies tab ..................................................................................................... Execution Notifications Page ..................................................................................................... Execution Runs Tab ..................................................................................................................
Current Run Page ...................................................................................................................... Run Dialog ................................................................................................................................. Execute Test Dialog Box ........................................................................................................... Test Definition Run Results Dialog ............................................................................................ Code Analysis Unit Interface ............................................................................................................ Code Analysis Details tab .......................................................................................................... Select Classes for Report Dialog ............................................................................................... Issues Unit Interface ......................................................................................................................... Issues Document View .............................................................................................................. Issues tab .................................................................................................................................. Calendar Tool ............................................................................................................................ Reports Unit Interface ....................................................................................................................... Report Properties tab ................................................................................................................ Report Parameters tab .............................................................................................................. 
Report Data tab ......................................................................................................................... Report Chart tab ........................................................................................................................ Report tab .................................................................................................................................. Reports Toolbar Functions ........................................................................................................ General Reference ................................................................................................................................... HTML Support for Description Text Boxes ....................................................................................... Multi-Select Functionality for Test Plan Elements ............................................................................ SQL Functions for Custom Reports .................................................................................................. APIs .......................................................................................................................................................... Database Schemas ..................................................................................................................................
Getting Started
This section explains the concepts and procedures related to getting started using Test Manager.
In This Section
Concepts - This section explains the concepts that are required for getting started with Test Manager.
Procedures - This section explains the tasks that must be performed before you can begin using SilkCentral Test Manager.
Quick Start Tasks - Quick Start Tasks are high-level overviews of the main tasks that you will likely need to perform with SilkCentral Test Manager.
Glossary - Guide to Test Manager terminology.
Concepts
This section explains the concepts that are required for getting started with Test Manager.
In This Section
What's New in Borland SilkCentral Test Manager 2009 - New functionality and updates in SilkCentral Test Manager 2009.
Tour of the UI - An overview of the main elements of Test Manager's user interface.
Help on Help - How information is organized in Test Manager Help.
Introduction to SilkCentral Test Manager - Test Manager is a complete testing solution, from requirements and test-plan management, to test-execution management, code-coverage analysis, issue tracking, and reporting.
Usability Enhancements
Usability enhancements for Agile teams have been made throughout SilkCentral Test Manager.
New Manual Test Web UI (Current Run Page)
The new Current Run page facilitates the administration and execution of manual tests by Agile teams in Test Manager's GUI. All information related to the current manual execution is consolidated onto a single page acting like a list of test tasks. This approach provides each Agile team member with an overview of the tests that are finished, in progress, and not yet executed. The former Manual Test Detail view, Step-by-Step view, and Step-by-Step Only view are consolidated onto a single page. Three separate grids on the page show detailed information related to the active execution definition run, the assigned test definitions, and the test steps. The new page enables you to easily keep test-management data updated with changing testing needs. For example, the page displays additionally assigned test definitions, changes to test definition specifications, and changes to test steps. Such changes can be entered directly in the Current Run page. For enhanced usability, the Current Run page offers standard keyboard support for easy navigation and grid functionality for customizing and interacting with the displayed data. The Manual Testing Client is still available and remains the best choice for traditional manual testing for testers working with assigned packages of manual test definitions.
Note: For automated tests, the Current Run page shows the progress of executions.
Seamless Automation of Manual Test Definitions
Going Agile typically involves increased test automation. Test Manager supports automation efforts with seamless transformation of manual test definitions into automated test definitions. All test results and historical data are retained after transformation.
Integration Enhancements
Test Manager now integrates with Agile management tools.
Agile Project Template
SilkCentral Test Manager provides an Agile project template that facilitates integration with Agile management tools, for example VersionOne. This template also suggests a possible workflow for using SilkCentral Test Manager in your Agile software development environment.
VersionOne
VersionOne is a leading Agile management tool. It allows you to manage your user stories in an Agile way. Its integration with Test Manager extends its user-story management capabilities with a powerful testing component that brings both test result and status information to VersionOne. This integration enables you to stay up-to-date with the status of your user stories. Test Manager supports both editions of VersionOne, Agile Enterprise and Agile
Team. To integrate VersionOne with a Japanese Test Manager, change the start options of the Application Server service in the registry to -Dfile.encoding=utf-8.
Set Integration Default Node for Agile Planning Tools
You can now set a folder or container in the test plan tree as the integration default node where you can create tests through a Web Service call from an external Agile planning tool.
Usability Enhancements
Usability enhancements have been made throughout SilkCentral Test Manager.
Master/Detail Grid Views of Execution Definition Runs and Test Definition Runs
The new grid views on Test Manager's Execution Runs page offer view settings, including resizing and reordering of columns, filtering, sorting, and grouping options that are configurable on a per-user basis. You can display or hide columns, adjust the width of columns, and move columns around by clicking on a column and dragging it to the desired location. You can use the keyboard to navigate through the runs. The page is now split into two separate sections, one listing the execution definition runs and a second listing the test definition runs that are included in the selected execution definition run.
Additional Triggering Information for Execution Definition Runs
SilkCentral Test Manager provides additional information related to how execution definition runs are started, for example through the Web or through a schedule.
Email Notification of Finished Execution Definition Runs
You can now configure email notification for finished execution definition runs of specific execution definitions. Notifications vary based on the result of the execution definition run.
Enhanced Requirement Synchronization
The time involved in synchronizing CaliberRM requirements has been dramatically reduced. SilkCentral Administration Module now uses enhanced CaliberRM functionality to return only those requirements that include changes. It no longer needs to check through all requirements to identify changes.
Project-Context Management of Source Control and Issue Tracking Profiles
Source control profiles and issue tracking profiles are now maintained independently for each project in Test Manager Settings.
Test-Overload Restriction
When a new execution definition run is triggered through a schedule or dependency, Test Manager now skips this new run if another run of the same execution definition, also triggered by a schedule, is currently executing. Test Manager then writes a warning to the application server log file. This restriction prevents overloading Test Manager with large numbers of concurrent runs when schedule intervals are too short, especially for schedules at the folder level.
Integration Enhancements
SilkCentral Test Manager has been enhanced to better integrate with other applications.
Access to Execution-Definition Run Properties
You can now get Test Manager execution definition run properties, for example the name or description of the execution definition, during the execution of SilkPerformer tests. Use the AttributeGet method to access the properties in SilkPerformer scripts.
Enhanced VMware Lab Manager Integration
Test Manager now enables you to specify the organization to which a user belongs when you configure access to a VMware Lab Manager (Lab Manager) server. Lab Manager uses organizations to determine which resources a user can access. A user with administrator privileges for Lab Manager creates organizations and adds users to organizations. If a user is not assigned to the selected organization in Lab Manager, an error message displays in Test Manager. For more information on the use of organizations in Lab Manager, refer to Lab Manager documentation. Refer to the SilkCentral Administration Module 2009 Help for information on Lab Manager integration with Test Manager.
API Enhancements
The Test Manager API has been enhanced to support additional features.
Updating Manual Test Definitions During Upgrade
When upgrading to a newer version of SilkCentral Test Manager, you can now update your manual test definitions, including all test steps, through the Test Manager Web Service API. Refer to the Javadoc for full details regarding available Java classes and methods. To access the Javadoc, click Help > Documentation > Test Manager API Specification.
Web Service Demo Client
The Web Service Demo Client is now shipped with Test Manager. Download the client from Help > Tools. Refer to the Test Manager API Help for more information.
New getProductNameById(long sessionId, int productId) Method
You can now use the ID of a product to get the name of the product. The new method is located in the sccentities Web Service. Refer to the Javadoc for full details regarding available Java classes and methods. To access the Javadoc, click Help > Documentation > Test Manager API Specification.
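The call pattern for the new sccentities method can be sketched as follows. Only the getProductNameById signature comes from the announcement above; the SccEntities interface and FakeSccEntities stand-in used here are illustrative assumptions — a real client would use stub classes generated from the sccentities WSDL, and sessionId would come from a prior logon call.

```java
// Illustrative sketch of calling getProductNameById; class names other
// than the documented method signature are assumptions, not real stubs.
interface SccEntities {
    // New in Test Manager 2009: look up a product's name by its ID.
    String getProductNameById(long sessionId, int productId);
}

public class ProductLookupDemo {

    // Stand-in for a SOAP client stub generated from the sccentities WSDL.
    static class FakeSccEntities implements SccEntities {
        public String getProductNameById(long sessionId, int productId) {
            // A live server would resolve the ID against its product list.
            return productId == 1 ? "Calculator" : "Unknown";
        }
    }

    public static void main(String[] args) {
        SccEntities service = new FakeSccEntities();
        long sessionId = 42L; // normally obtained from a logon Web Service call
        System.out.println(service.getProductNameById(sessionId, 1));
    }
}
```

The point of the sketch is the two-argument shape of the call: every request carries the session ID obtained at logon alongside the entity ID being resolved.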
Documentation Enhancements
The documentation for Test Manager 2009 is also enhanced.
Eclipse Help for Test Manager Help Systems
The Installation, API, Database Model, and Office Import Tool Help systems are now available in Eclipse Help format. You can now view Test Manager Help topics alongside other Borland product Help topics in a single, integrated Eclipse Help browser. Having all Help systems on a common delivery platform greatly improves the consistency of Help topics across the tools and makes it easier to find answers to your questions.
Improved Documentation Page
The Help > Documentation page has been improved: You can now access each Help system as Eclipse Help by clicking the Help system's name. Click the PDF link located to the right of the Help system's name to access the Help as a PDF.
Technology Updates
SilkCentral Test Manager now ships with new versions of third-party software.
Microsoft SQL Server 2008
SilkCentral Test Manager now supports Microsoft SQL Server 2008. Be sure to set up case-insensitive SQL 2008 servers, because case-sensitive SQL 2008 servers are not supported.
Sunset Microsoft SQL Server 2000
SilkCentral Test Manager no longer supports Microsoft SQL Server 2000.
Java
SilkCentral Test Manager now ships and runs with Java 1.6.0_13.
Microsoft Windows Server 2008
SilkCentral Test Manager now supports Microsoft Windows Server 2008 (32 bit).
Microsoft Internet Information Services 7
SilkCentral Test Manager now supports Microsoft Internet Information Services (IIS) 7 as its Web Server. IIS 7 was tested for Windows Server 2008 Enterprise Service Pack 1 (English) and Windows Server 2008 Enterprise Service Pack 2 (English).
Related Concepts
Synchronizing Requirements
Working With SilkPerformer Projects
Converting Manual Tests to Automated Tests
Execution Definition Schedules
Related Procedures
Set a Test Plan Node as Integration Default for External Agile Planning Tools
Related Reference
Execution Runs Tab
Current Run Page
Execution Notifications Page
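The case-insensitivity requirement noted in the technology updates above can be verified before connecting Test Manager to a database: SQL Server encodes case sensitivity in the collation name ("_CI_" for case-insensitive, "_CS_" for case-sensitive). The helper below checks a collation string locally; on a live server you would retrieve the value with the query SELECT SERVERPROPERTY('Collation'). This is an illustrative sketch, not part of Test Manager itself.

```java
// Check whether a SQL Server collation name is case-insensitive.
// Collation names carry "_CI_" (case-insensitive) or "_CS_"
// (case-sensitive), e.g. SQL_Latin1_General_CP1_CI_AS.
public class CollationCheck {

    static boolean isCaseInsensitive(String collation) {
        return collation.contains("_CI_");
    }

    public static void main(String[] args) {
        // On a live server, fetch this value via JDBC with:
        //   SELECT SERVERPROPERTY('Collation')
        String collation = "SQL_Latin1_General_CP1_CI_AS";
        System.out.println(collation + " case-insensitive? "
                + isCaseInsensitive(collation));
    }
}
```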
Tour of the UI
Here is an overview of the main elements of Test Manager's user interface.
Basic UI Structure
Test Manager's GUI has four main components:
A: Workflow bar - Facilitates the primary actions related to test management. Click an icon to display the corresponding test management area in the unit window below. The workflow bar is designed around the natural progression of test management activities, from the establishment of new projects and requirements all the way through to issue management and reporting. See the section below for further details.
B: Navigation Tree - Provides the same functionality offered by the workflow bar. Additionally offers you access to Administration functions and Help. This tree can be hidden/displayed by clicking the separator bar along the Navigation Tree's right side. The hide/display setting of the navigation tree is saved for each user account.
C: Unit window - Shows the functional work area of the currently selected Test Manager unit. This view changes based on the unit you are working in and your activities.
D: Environmental Info - Displays your user name and the active project. Click Log Out to log out of Test Manager.
Workflow Bar
Test Manager's workflow bar gives you quick access to Test Manager's core functional units (Projects, Settings, Requirements, Test Plan, Execution, Activities, Code Analysis, and Reports). Buttons for each of these units are available on the workflow bar:
Projects - Click Projects to go to the Projects unit, which offers a high-level test manager's view of all projects in your Test Manager installation. The Projects unit enables you to move between projects, see high-level project status details, and view current execution statistics.
Settings - Click Settings to configure system settings (available functionality varies based on your user role) such as filters, project settings, change notification, and more.
Requirements - Click Requirements to go to the Requirements unit, which enables you to maintain control over your project's requirements during development: managing the creation, modification, and deletion of requirements; association of test definitions with requirements; change history tracking; and the ability to generate test plans directly from requirement lists.
Test Plan - Click Test Plan to go to the Test Plan unit, which enables you to create and manage test plans, including the creation of test definitions for both automated tests (SilkPerformer, SilkTest, JUnit, NUnit, and WSH) and manual tests.
Execution - Click Execution to go to the Execution unit, which enables you to configure test execution definitions, schedule test executions, assign test definitions to test executions, set up execution-definition dependencies, and configure execution-server deployment.
Activities - Click Activities to go to the Activities tab in the Projects unit. The Activities tab displays recently-executed, current, and upcoming execution definition activity on a per-project basis.
Code Analysis - Click Code Analysis to go to the Code Analysis unit where you can evaluate the degree to which the code in your AUT (Application Under Test) is covered by test cases. You can then make informed estimates regarding effort/cost and risk associated with specific code changes.
Reports - Click Reports to go to the Reports unit where you can generate reports with SilkCentral Test Manager, download report templates, edit report parameters, and create new reports based on pre-installed templates.
Online Help - Click in the lower-right corner of the workflow bar to view context-sensitive help for the current page.
Bookmark page - Click in the lower-right corner of the workflow bar to bookmark the current Test Manager page. This is especially useful for bookmarking reports, where the current parameters are saved in the bookmarked URL.
Print page - Click in the lower-right corner of the workflow bar to print any Test Manager page.
Related Concepts
Getting Started
Successful Test Management
Related Procedures
Logging into Test Manager
Managing a Successful Test
Related Reference
User Interface Reference
Help on Help
This topic explains how information is organized in Test Manager Help.
Concepts
The conceptual overviews provide information about product architecture, components, and other information you need to help you work with Test Manager. At the end of most of the overviews, you will find links to related, more detailed information. A Web icon indicates that a link leads to an external Web site.
Procedures
Procedures provide step-by-step instructions. All procedures are located under Procedures in the Content pane of the Help window. Additionally, most of the conceptual overviews provide links to related procedures.
Monospace type - Source code and text that you must type.
Boldface - References to dialog boxes and other user interface elements.
Italics - Identifiers, such as variables. Italicized text is also used to emphasize new terms.
Linked text indicates a link to Web resources.
Related Concepts
Getting Started
Successful Test Management
Related Procedures
Managing a Successful Test
Related Reference
User Interface Reference
SilkCentral Architecture
SilkCentral products are based on SilkCentral Architecture (SCA), which allows for common administration of Web-based products. The following sections describe the SilkCentral components.
Overview
Front-end server
Application server
Execution server
Chart server
Database server
Overview
Front-end server
The front-end server is responsible for the graphical user interface. This server is based on HTML and is accessible from any Web browser, such as Internet Explorer or Firefox. A user sends an appropriate HTTP request to the front-end server and receives a login page for authentication. After successful login, the user can use the corresponding
application based on the respective user rights. The front-end server can operate as a stand-alone HTTP server, or it can be attached to a Web server, such as IIS, via an ISAPI filter.
Application server
The application server synchronizes tasks such as the distribution of schedules, control of execution servers, and management of database configuration. These tasks require a centralized agency to ensure the consistent, reliable behavior of the application. The application server also evaluates results, saves them to the database, and sends alerts based on success conditions.
Execution server
The execution server executes SilkTest and/or SilkPerformer tests that are scheduled by authorized users. Users are responsible for the proper configuration of execution servers and additional resources that are required for test executions. The system allows for the installation and configuration of multiple execution servers working independently of one another.
Chart server
The chart server is used to generate charts that are viewed in reports. The system allows for the configuration of a pool of chart servers. A built-in load balancing mechanism uses the pool to distribute chart generation. The chart server is also used to generate reports and deliver them directly to the end-user for viewing within a browser.
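The pool-based distribution described above can be illustrated with a minimal round-robin selector. This is a conceptual sketch only — the server names are placeholders, and SilkCentral's actual load-balancing algorithm is not documented here.

```java
import java.util.List;

// Conceptual sketch: distribute chart-generation requests across a pool
// of chart servers with simple round-robin selection. Illustration of
// the idea only, not SilkCentral's actual implementation.
public class ChartServerPool {
    private final List<String> servers;
    private int next = 0;

    public ChartServerPool(List<String> servers) {
        this.servers = servers;
    }

    // Returns the next server in rotation for the incoming request.
    public synchronized String acquire() {
        String server = servers.get(next);
        next = (next + 1) % servers.size();
        return server;
    }

    public static void main(String[] args) {
        // Placeholder host names for a two-server chart pool.
        ChartServerPool pool = new ChartServerPool(
                List.of("chartserver1", "chartserver2"));
        for (int i = 0; i < 4; i++) {
            System.out.println("request " + i + " -> " + pool.acquire());
        }
    }
}
```

Round-robin is the simplest pooling policy; it spreads load evenly when chart jobs are of similar cost, which is why a pool of interchangeable chart servers can be grown just by adding entries.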
Database server
System persistency is implemented using an RDBMS (Relational Database Management System). SilkCentral supports MS SQL Server 2005 and 2008 (including Express), Oracle 9i (version 9.2.0.8 or later), and Oracle 10g (Borland recommends version 10.2).
Agent computers
SilkPerformer and SilkTest agent computers are assigned to particular SilkPerformer / SilkTest projects from the pool of agent computers that are available to the controller computer. In combination with SilkCentral Test Manager, the controller computer acts as an execution server.
SilkPerformer agents
SilkPerformer agent computers host the virtual users that are run during load tests. As many agent computers as necessary can be added to a SilkPerformer project so that the required quantity of virtual users can be run. Configuration of agents is done through SilkPerformer. See SilkPerformer documentation for details on configuring agents.
SilkTest agents
The same rules that apply to SilkPerformer agents apply to SilkTest agents, except SilkTest agents host SilkTest tests.
Related Concepts
Getting Started
Successful Test Management
Related Procedures
Managing a Successful Test
Related Reference
User Interface Reference
Procedures
This section explains the tasks you must perform before you can begin using Test Manager.
In This Section
Configuring Browser Settings - How to optimize your browser settings for use with Test Manager.
Logging In and Out of Test Manager - How to log into and out of SilkCentral Test Manager.
Select Start > Settings > Control Panel > Internet Options.
On the Internet Properties dialog box, select the General tab, if it is not already selected. In the Temporary Internet files area (Browsing history area in Microsoft Windows XP), click Settings. The Settings dialog box displays. In the Check for newer versions of stored pages section, select Automatically. Click OK. Click OK again on the following dialog box.
Note: When running Internet Explorer 6 with Service Pack 1 on a Windows 2003 system, you may find that Web page contents appear black when you open a dialog box. This is caused by a security feature that was introduced with Internet Explorer Service Pack 1. To remedy this issue, add the Test Manager server to your list of trusted sites:
Select Start > Settings > Control Panel > Internet Options.
On the Internet Properties dialog box, select the Security tab. Select the Trusted sites icon. Click Sites. Enter the URL of your SilkCentral Test Manager host in the Add this Web site to the zone field (for example, http://MyTestManagerHost). Click Close. Click OK to complete the configuration.
Related Concepts
Getting Started
Successful Test Management
Related Procedures
Managing a Successful Test
Related Reference
User Interface Reference
1. Navigate to the IP address or URL of your Test Manager installation.
Note: Speak to your system administrator about the URL, username, and password you should use to log into SilkCentral Test Manager.
2 3 4
On the Test Manager login page, enter your Username and Password . Check the remember login check box to have Test Manager auto-complete usernames and remember your password when you begin typing your username. Click Login to begin working with Test Manager.
Note: When logging in for the first time, you will be directed to the Project Overview in the Projects unit. Upon subsequent login, you will automatically be directed to the URL you were visiting when you logged out of Test Manager during your previous visit. For example, if when you previously logged out of Test Manager you had a certain test definition selected, you will automatically be directed to that test definition upon login. Related Concepts Getting Started Successful Test Management Related Procedures Managing a Successful Test Related Reference User Interface Reference
Click Log Out in the upper-right corner. Your user session will then be terminated.
Related Concepts Getting Started Successful Test Management Related Procedures Managing a Successful Test Related Reference User Interface Reference
Glossary
General Terminology
Following are definitions of general Test Manager terms.

Application server - Server that handles all internal processing.
Assigned test definition - Test definition that is assigned to an execution definition for execution, or to a test requirement.
Attribute - Attributes are used to tag test plan elements. Tags can later be used to filter test plan elements.
Component - The part of a product that a test plan is created for. Component parameters can be used to categorize test definitions and consequently raise the expressiveness of test plans.
Deployment - Determination as to where your execution definition should be run. Deployment is realized by assigning an execution server to an execution definition.
Execution server - The server where your execution definition will be run. Execution servers are defined during deployment.
Filter - Used to select specific tree elements. Filter criteria may include properties such as attributes and types.
Front-end server - Server that handles communication between the application and the user through the GUI.
Location - A physical location that has one or more execution servers where execution definitions can be run.
Log - A file to which all of a server's activity is recorded for diagnostic purposes.
Parameter - An input type that is required by a test definition.
Product - An end product on which your company's efforts are focused. Products typically consist of one or more components.
Project - A Test Manager entity within which associated efforts are consolidated. All user actions are associated with projects. Defining a project is the first step in test management with Test Manager.
Reports - Graphic- and table-based documents that present data in a meaningful way. Specific reports are available for requirements management, test planning, and test-execution management.
Report server - Server that provides report processing and report presentation.
Schedule - Specifies the times and frequency at which test executions will be run.
Toolbar - A graphical display of buttons that reflects available functionality. The appearance of the toolbar varies based on the selected workflow unit and tree element. This is not the same as the workflow bar.
User - Person who works with Test Manager. User types endow users with specific rights.
Workflow bar - The bar at the top of the page that represents Test Manager's core functional units.
Execution Terminology
Following are definitions related to test definition executions.

Execution definition - The basic unit of the Execution tree that has one or more test definitions from the Test Plan unit assigned for execution. Execution definitions may be children of folders or they may be standalone.
Execution tree - Tree-shaped interface used to organize execution definitions.
Folder - An Execution tree structuring element. Folders are used to store related execution definitions.
Related Concepts Getting Started Successful Test Management Related Procedures Managing a Successful Test Related Reference User Interface Reference
Concepts
This section contains all the conceptual topics associated with using SilkCentral Test Manager. In This Section:

Successful Test Management - This section includes all the conceptual topics that are related to the operation of SilkCentral Test Manager.
Settings Configuration
This section explains how to configure settings in Test Manager. If you have SuperUser, Administrator, or Project Manager privileges, you can specify project-wide settings for SilkCentral Test Manager projects. Once global settings are defined, they are available to all users who have access to those projects. Global project settings include the definition of filters, attributes, external product integrations, change notifications, and more. If you have SuperUser, Administrator, or Project Manager privileges, you can also specify project-wide settings for build information, source files, file extensions, and more. In This Section:

Global Filters - Filters provide an efficient means of finding exactly the information you need, while excluding extraneous detail.
Attributes - Custom attributes can be used to customize information for test definitions.
Custom Requirement Properties - You can add custom property fields across all requirements in a selected project.
Custom Step Properties - You can add custom property fields across all manual test steps in a selected project.
Change Notification - Test Manager can notify you by email when requirements or test plans are changed by other users.
Requirements Integration Configuration - External requirements-management integration enables you to coordinate Test Manager's requirements-management features with other requirements-management tools.
Data Sources for Data-Driven Tests - Data-driven tests are tests that are derived from values in an existing data source, such as a spreadsheet or a database.
Issue Tracking Profiles - Issue tracking profiles enable SilkCentral Test Manager to integrate with external issue tracking systems.
Source Control Profiles - Source control profiles enable Test Manager to integrate with external source control systems.
Global Filters
Filters provide an efficient means of finding exactly the information you need, while excluding extraneous detail. By defining global filters, you can create complex filter criteria that are available throughout Test Manager without requiring you to define filter criteria each time you need to filter a list. Related Concepts Settings Configuration Filters Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Related Reference Filters tab
Attributes
Custom attributes can be used to further customize information for test definitions (in the Test Plan unit). While some attributes are made available by Test Manager's integrated functionality, such as priority, components, and platforms, you may want to define custom attributes to categorize test definitions according to your needs, or to make test definitions compatible with SilkTest test cases. Related Concepts Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Custom Attributes Configuring Test Manager Settings Related Reference Attributes tab
Change Notification
Test Manager can notify you by email when requirements or test plans are changed by other users. Each user has the option of activating change notification. So that you are not bombarded with numerous notifications, only a single email alert is sent to you when a change is made, regardless of how many changes other users may have made since your last acknowledgment. Email alerts include links that take you directly to a view of recent changes. Before you can activate requirements or test plan change notification, you must configure your email address in Test Manager's user settings. See the SilkCentral Administration Module documentation for details. Note: Change notification only works if an email server has been configured by your administrator. If change notification has not been enabled, contact your SilkCentral administrator. Note: Once notification has been enabled, you can view and acknowledge changes that have occurred since your last acknowledgment. Related Concepts Settings Configuration Related Procedures Configuring Projects - Quick Start Task Enabling Change Notification Configuring Test Manager Settings Related Reference Notifications Page
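The single-alert-per-acknowledgment behavior described above can be sketched as a per-user flag: once an alert has been sent, further changes are suppressed until the user acknowledges them. The following Python sketch is an illustration only (the class and function names are hypothetical, not Test Manager's implementation):

```python
class ChangeNotifier:
    """Sketch of one-alert-until-acknowledged notification logic."""

    def __init__(self, send_mail):
        self.send_mail = send_mail   # callable(user) -> None
        self.pending = set()         # users who were already alerted

    def record_change(self, user):
        """Called whenever another user changes a watched item."""
        if user not in self.pending:     # first change since last ack
            self.send_mail(user)
            self.pending.add(user)       # suppress further alerts

    def acknowledge(self, user):
        """User viewed recent changes; alerts may fire again."""
        self.pending.discard(user)


sent = []
notifier = ChangeNotifier(sent.append)
notifier.record_change("alice")   # first change -> one email
notifier.record_change("alice")   # further changes -> no extra email
notifier.acknowledge("alice")
notifier.record_change("alice")   # next change after ack -> new email
```

In this sketch, "alice" receives exactly two emails for three changes, mirroring the suppression behavior described above.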
The following add-ins integrate external requirements management tools with Test Manager:

- The Add-In for RequisitePro enhances the RequisitePro menu with an entry providing a link to the Test Manager front-end server's project selection.
- The Add-In for CaliberRM enables CaliberRM with external traceability to Test Manager requirements. The add-in must be installed on each CaliberRM server and client.
- The Add-In for DOORS enables Test Manager to communicate with DOORS. This add-in must be installed on the DOORS client on the SilkCentral front-end server.
Note: When using CaliberRM 2006 or higher, the integration with Test Manager is set up out of the box. It is still recommended to install the add-in that is available from the Help > Tools menu in Test Manager to make sure that you have the latest version of the integration installed. Note: Configuring integration with CaliberRM requires the definition of CaliberRM login credentials. Whenever requirements are synchronized between Test Manager and CaliberRM, these credentials are used to log in to CaliberRM, thus checking out a CaliberRM license. The license is released as soon as the synchronization process has completed. We recommend creating a dedicated CaliberRM user for synchronization purposes, to be used by all Test Manager integration configurations. This ensures that only a single CaliberRM license is used for the synchronization process.
Related Concepts External Requirements Management Tools Settings Configuration Related Procedures Configuring Projects - Quick Start Task Integrating External RM Tools Configuring Test Manager Settings Related Reference APIs
Defining issue tracking profiles allows you to link test definitions within the Test Plan unit to issues in third-party issue tracking systems. Linked issue states are updated periodically from the third-party issue tracking system. Profiles are available for the following issue tracking systems:

- SilkCentral Issue Manager - see the related Managing SilkCentral Issue Manager Issue Tracking Profiles topic.
- Borland StarTeam - see the related Managing Borland StarTeam Issue Tracking Profiles topic.
- IBM Rational ClearQuest - see the related Managing IBM Rational ClearQuest Issue Tracking Profiles topic.
- Issue Tracking Web Service - see the Test Manager API Help.
- Bugzilla - see the related Managing Bugzilla Issue Tracking Profiles topic.

Additional issue tracking systems can be configured by installing a custom plug-in; see the Test Manager API Help for detailed information.
Related Procedures Managing SilkCentral Issue Manager Issue Tracking Profiles Managing Borland StarTeam Issue Tracking Profiles Managing IBM Rational ClearQuest Issue Tracking Profiles Managing Bugzilla Issue Tracking Profiles Related Reference Issue Tracking Profiles Page
Source control profiles are available for the following systems:

- Borland StarTeam - see the related Managing Borland StarTeam Source Control Profiles topic.
- Serena Version Manager (PVCS) - see the related Managing Serena Version Manager (PVCS) Profiles topic.
- Concurrent Versions System (CVS) - see the related Managing CVS Profiles topic.
- Microsoft Visual SourceSafe (MSVSS) - see the related Managing Microsoft Visual SourceSafe (MSVSS) Profiles topic.
- Universal Naming Convention (UNC) (file-system access) - see the related Managing UNC Profiles topic.
- Subversion - see the related Managing Subversion Profiles topic.
- Apache Commons Virtual File System (VFS) - see the related Managing VFS Profiles topic.

Additional source control systems can be configured by installing a custom plug-in. Refer to the Test Manager API Help for detailed information.
Borland StarTeam
StarTeam promotes team communication and collaboration through centralized control of all project assets. Protected yet flexible access ensures that team members can work whenever and wherever they like through an extensive choice of Web, desktop, IDE, and command-line clients. StarTeam offers a uniquely comprehensive solution that includes integrated requirements management, change management, defect tracking, file versioning, threaded discussions, and project and task management. Integration is supported for StarTeam version 2005 R2 or higher.
Microsoft Visual SourceSafe (MSVSS)
Integrating with Microsoft development environments, as well as with Microsoft Office applications, MSVSS provides easy-to-use, project-oriented version control. Visual SourceSafe works with any file produced with any development language, authoring tool, or application. Users can work at both the file and project level while promoting file reuse. Integration is supported for MSVSS versions 6 and 2005.
Subversion (SVN)
Subversion (SVN) is the successor to the Concurrent Versions System (CVS). Subversion manages versions using transaction numbers; with each commit, the transaction number is incremented. What other source control systems call labels, Subversion refers to as tags. These tags are encoded in the Subversion URL. For example, http://MyHost/svn/MyApp/trunk is a Subversion URL and http://MyHost/svn/MyApp/tags/build1012 is a Subversion tag. Test Manager supports Subversion tags: if a Subversion URL contains the trunk directory, you can define a label such as tags/build1012. This label replaces trunk in the Subversion URL. Note: If a Subversion URL does not contain trunk and you define a label, Test Manager throws an error.
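The trunk-to-tag substitution described above can be illustrated with a short sketch. The helper function below is hypothetical (Test Manager's own handling is internal); it simply applies the documented rule, including the error case for URLs without a trunk directory:

```python
def apply_svn_label(url, label):
    """Replace the 'trunk' segment of a Subversion URL with a label
    such as 'tags/build1012', mirroring the rule described above."""
    if "/trunk" not in url:
        # Documented behavior: defining a label for a URL without
        # 'trunk' results in an error.
        raise ValueError("URL contains no 'trunk' directory; a label cannot be applied")
    return url.replace("/trunk", "/" + label, 1)


print(apply_svn_label("http://MyHost/svn/MyApp/trunk", "tags/build1012"))
# -> http://MyHost/svn/MyApp/tags/build1012
```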
The following protocol types are supported:

http - Copies the given file. This protocol type also supports copying and unpacking ZIP, JAR, or other zipped files; to do so, specify a .zip file on an HTTP server, for example zip:http://myTestServer/myTests.zip. The .zip file is extracted on the execution server.
ftp - Copies the given file. This protocol type also supports copying and unpacking ZIP, JAR, or other zipped files.
smb - Server Message Block; copies all files and folders.
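As an illustration of how such a source string might be interpreted, the following hypothetical helper splits a VFS source such as zip:http://myTestServer/myTests.zip into its unpack prefix, protocol, and location. This is not the Apache Commons VFS API, just a sketch of the notation:

```python
def parse_vfs_source(source):
    """Split a VFS source string into (unpack, protocol, location).

    Hypothetical helper illustrating the notation above:
    'zip:' marks an archive to be extracted on the execution server,
    followed by the transport protocol (http, ftp, smb) and location.
    """
    unpack = None
    if source.startswith("zip:"):
        unpack, source = "zip", source[4:]
    protocol, _, location = source.partition("://")
    return unpack, protocol, location


print(parse_vfs_source("zip:http://myTestServer/myTests.zip"))
print(parse_vfs_source("smb://server/share"))
```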
Related Procedures Managing Borland StarTeam Source Control Profiles Managing CVS Profiles Managing Microsoft Visual SourceSafe (MSVSS) Profiles Managing Serena Version Manager (PVCS) Profiles Managing UNC Profiles Managing Subversion Profiles Managing VFS Profiles Related Reference Source Control Profiles Page
Requirements Management
This section explains how to manage requirements in Test Manager. SilkCentral Test Manager's Requirements unit enables you to maintain control over system requirements during development: managing the creation, modification, and deletion of requirements; association of test definitions with requirements; change history tracking; and the ability to generate test plans directly from requirement lists. As with all Test Manager functionality, the Requirements unit is 100% Web enabled and accessible through a Web browser. In This Section:

Requirements Tree - Requirements are displayed, organized, and maintained through a hierarchical tree structure.
Attachments - You can upload multiple files or links as attachments to requirements.
Full Coverage and Direct Coverage Modes - Full Coverage mode offers a cumulative view of test-definition-to-requirement coverage. In Direct Coverage mode, requirement status is calculated only by considering the test definitions that are assigned directly to requirements.
Test Coverage Status - Shows the status of all tests that have been assigned to the requirement (number and percentage of Passed, Failed, Not Executed, and Not Covered tests).
Requirements Reports - This section explains the requirements-related reports.
Microsoft Office Requirement-Import Tool - Assists you in importing requirements from Microsoft Word and Microsoft Excel.
Test Plan Generation - Test plans can be generated directly from the Requirements tree, and test definitions can be assigned to specific requirements.
Requirement History - Test Manager provides a complete history of all changes that are made to requirements.
Change-Notification Emails - You can configure email notifications to alert you to changes that are made to requirement settings and/or test-plan settings for specified projects.
External Requirements Management Tools - This section explains how to work with external requirements management tools.
Requirements Tree
Requirements are displayed, organized, and maintained through a hierarchical tree structure, the Requirements tree. Each node in the Requirements tree represents a requirement. Each requirement can have any number of child requirements associated with it. The Requirements tree enables you to organize requirements in any number of hierarchical levels. Note: When the Requirements tree includes more elements than can be displayed at once without impacting response time, elements are displayed in increments. Page number links at the bottom of the tab allow you to browse through the elements included in the tree one page at a time. To display all elements as a single list, select the [All] link. Related Concepts Requirements Management Related Procedures Managing Requirements - Quick Start Task Collapsing or Expanding the Requirements tree Managing Requirements Related Reference Requirements Unit Interface
Attachments
You can upload multiple files or links as attachments to requirements. You can edit the descriptions of attachments or delete attachments. When you cut and paste requirements that have attachments, the attachments are automatically included with the copies. The following attachment types are available:
- Uploaded files (.gif, .png, .jpg, .doc, .rtf, .txt, .zip, .xls, .csv, and more)
- References to UNC paths
- References to URLs, including StarTeam URLs
Related Concepts Requirements Management Related Procedures Managing Requirements - Quick Start Task Attaching a File to a Requirement Managing Requirements Related Reference Requirement Attachments tab
Requirements Reports
This section explains the requirements-related reports that ship with SilkCentral Test Manager. Requirements reports detail the status of functional requirements (for example, compatibility requirements, GUI requirements, feature requirements) that must be met during development. Requirements may also relate to product management objectives such as reliability, scalability, and performance. Test Manager's requirement-management reports help managers determine whether adequate test coverage has been established to verify that system requirements are met during development. When a report references a requirement that includes HTML-formatted content, that content is rendered in the report. The following reports come pre-installed with Test Manager. In This Section:

Status Reports - The status reports that are available for Test Manager's Requirements unit.
Progress Reports - The progress reports that are available for Test Manager's Requirements unit.
Document Reports - The document reports that are available for Test Manager's Requirements unit.
All Related Issues Report - Provides a detailed list of all issues related to the assigned test definitions for a requirement.
Requirement History
Test Manager provides a complete history of all changes that are made to requirements. History information is read-only, and cannot be edited or permanently deleted. The following actions generate requirement history entries:

- Adding requirements
- Editing requirements
- Marking requirements as obsolete
- Adding attachments
- Deleting attachments
- Importing/updating requirements through MS Word or MS Excel

The deletion of a requirement using the Destroy permanently option, or the deletion of a requirement that has already been marked as obsolete, generates a history entry at the project level (on the project node), because the requirement to which the history relates has been deleted from the database. Each requirement-revision entry includes:

- Revision number (1-n)
- Change date/time
- User who changed the requirement
- Notes describing the revision

Project-Level History
Note: When a requirement has been deleted, or if you acknowledge all recent changes, a change history entry is added to that project's history file. Note: The Recent Changes filter (accessible through the toolbar) enables you to efficiently view and acknowledge the latest changes and additions that have been made to requirements. Related Concepts Tracking the History of a Requirement Requirements Management Related Procedures Managing Requirements - Quick Start Task Managing Requirements Related Reference Requirement History tab
Change-Notification Emails
You can configure email notifications to alert you to changes that are made to requirement settings and/or test-plan settings for specified projects since you last confirmed changes with a change acknowledgement function. Following your logout, an email alert is sent to you each time one of the following settings is changed:
Requirements unit
- A requirement is created or deleted.
- The name or description of a requirement is edited.
- A system property is edited.
- A requirement is set as obsolete.
- A requirement is recovered.
- A test definition is assigned to or removed from a requirement.
- A custom property of a requirement is created, edited, or deleted.
Related Concepts Tracking the History of a Requirement Requirements Management Related Procedures Managing Requirements - Quick Start Task Enabling Change Notification Managing Requirements Related Reference Requirement History tab
Synchronizing Requirements
Enabling synchronization of requirements between Test Manager and an external requirements management system (RMS) enables Test Manager to receive changes that occur in the external RMS whenever a synchronization is executed. If a project has external RMS integration enabled, the master system for requirements is automatically the external system. This means that synchronization is always from the external RMS tool to Test Manager, and requirements can no longer be edited in Test Manager. An exception is newly created requirements that don't exist in the external tool, which are uploaded to the external (master) system only if the option Enable upload of requirements is enabled in Settings > Integrations > Configuration. Property mapping functionality allows you to map property fields between Test Manager and external requirements tools (for example, a custom field in Test Manager called User might be equivalent to a property field in CaliberRM called Field_2). The property mapping feature ensures that changes to requirement-property fields are accurately refreshed between projects. Requirements can be synchronized in one of several ways:
- Manual synchronization: Available through a button click on the Properties tab at the root folder level.
- Automatic scheduled synchronization: Based on globally defined Test Manager schedules.
- Automatic online synchronization: Changes to requirements are automatically propagated between tools. This is available for CaliberRM. It requires a CaliberRM client installation on the application server and MPX enabled. Requirement data is automatically updated in Test Manager when changes are made in CaliberRM, and traces in CaliberRM are updated when test definition assignment changes are performed in Test Manager. This type of online synchronization is only available when projects are configured with the current baseline.
Automatic synchronization of requirements between Test Manager and external requirements management tools can be configured to occur based on global schedules. See the SilkCentral Administration Module documentation for details on configuring global schedules. Note: The Open CaliberRM buttons open whatever program is registered as the default program for opening files with the extension .crm. On some machines, this may be the requirement viewer rather than CaliberRM. This behavior can be changed by your administrator; the client program is called caliberrm.exe. When properly configured, the program opens to the requirement that is selected in Test Manager. The binder icon on the project node of the Requirements tree indicates the status of RM integration for the project:

- No configuration - RM integration is not available.
- Manual configuration - Requirement import, upload, and synchronization can be done only by clicking the corresponding buttons on the project node in Requirements View (Properties tab).

At the project level, the Properties tab includes the following properties:
- Status - Whether or not integration has been enabled.
- Associated With - The external tool with which integration has been enabled.
- Project Name - The name of the external project that the Test Manager project is associated with.
- Requirement Types - The requirement types that are shared between projects.
Note: When integration between CaliberRM and Test Manager has been enabled, the project node displays the current status of the online requirements change listener. The three possible statuses for such projects are: Connected (synchronized), Reconnected (synchronization recommended), and Disconnected.
Related Concepts External Requirements Management Tools Requirements Management Related Procedures Managing Requirements - Quick Start Task Synchronizing Requirements Across Tools Managing Requirements Related Reference Requirement Properties tab
Filtering
This section explains how to filter requirements, test definitions, or execution definitions in Test Manager. In This Section:

Filters - Filters enable you to quickly sort through test plan elements and execution definitions, highlighting only those elements that are relevant to your needs.
Recent Changes - The Recent Changes filter enables you to efficiently view and acknowledge changes and additions that other users have made to requirements, test definitions, or execution definitions project-wide since your last change acknowledgement.
Filters
Filters enable you to quickly sort through test plan elements and execution definitions, highlighting only those elements that are relevant to your needs. Based on your needs, you can create new custom filters, edit existing filters, or turn filtering off at the project level. The toolbar includes buttons for creating filters, editing filters, deleting filters, and selecting existing filters. Projects do not contain any default filters. Note: Filters can be accessed and edited from the Test Manager toolbar and the Settings unit (through the Settings link on the menu tree). Note: Filters are not applied to reports. The Recent Changes filter enables you to efficiently view and acknowledge changes and additions that other users have made to test definitions project-wide since your last change acknowledgement. In the Test Plan unit, two buttons at the far right of the toolbar, the Show Changes/Show All toggle button and the Acknowledge button, help you to find out what changes other users have made. Your system administrator can configure email notifications that alert you to changes that are made to test definition settings. Email alerts include links that take you directly to a view of recent changes. Related Concepts Recent Changes Related Procedures Working with Filters Configuring Global Filters Filtering Test Runs on the Activities Page
Recent Changes
The Recent Changes filter enables you to efficiently view and acknowledge changes and additions that other users have made to requirements, test definitions, or execution definitions project-wide since your last change acknowledgement. The two buttons at the far-right of the toolbar, the Show Changes/Show All toggle button and the Acknowledge button, help you to find out what changes other users have made. Note: Your system administrator can configure email notifications that alert you to changes that are made to test definition settings. Email alerts include links that take you directly to a view of recent changes. Related Concepts Change-Notification Emails Filters Related Procedures Managing Requirements - Quick Start Task Managing Test Plans - Quick Start Task Viewing Recent Changes Related Reference Requirements Toolbar Functions Test Plan Toolbar Functions
Data-Driven Tests
Data-driven tests are tests that are derived from values in an existing data source, such as a spreadsheet or a database. Before you can work with data-driven tests, you need to configure a data source. See SilkCentral Administration Module documentation for details on data sources.
- Single data-driven test definition instance: A single test definition result is generated for all data rows of your data source. This means that the test definition is only successful if the execution with every single data row is successful. If the execution with one data row fails, the whole test definition is marked as failed.
- Multiple data-driven test definition instance: Each data row of your data source is represented by a test definition of its own. This means that each data row produces a failed or passed test definition result. For example, if your data source is a spreadsheet with four rows, you will have the original test definition you created (a parent test definition) in addition to four new child definitions, one for each of the data rows.
Note: The parent test definition created in this process does not have parameters associated with it, since it only represents a structuring instance for its child test definitions and no longer functions as an actual test definition. All values found in the data source are listed on the parent test definition's Data Set tab. Note: When assigning a parent test definition to a requirement, note that links to requirements are only inherited when using single data-driven test definition instances. Note: You cannot assign the parent test definition of a multiple data-driven test definition instance to a setup or cleanup test execution, as such a parent node is treated as a folder. You can, however, assign one of its child nodes, and you can also assign a single data-driven test definition instance to a setup or cleanup test execution.
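The difference between the two instance types can be illustrated with a small sketch (hypothetical helper functions, not Test Manager code): a single instance aggregates all row results into one verdict, while multiple instances keep one verdict per data row.

```python
def single_instance_result(row_results):
    """Single data-driven instance: one overall result; any failed
    data row fails the whole test definition."""
    return all(row_results)


def multiple_instance_results(rows, row_results):
    """Multiple data-driven instances: one child result per data row."""
    return dict(zip(rows, row_results))


rows = ["row1", "row2", "row3", "row4"]
results = [True, True, False, True]   # row3 failed

print(single_instance_result(results))          # one failed row -> False
print(multiple_instance_results(rows, results)) # per-row verdicts
```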
Worksheet Handling
If your data source is a Microsoft Excel worksheet, you should follow these guidelines to ensure a successful and maintainable data-driven test definition setup:
- Make sure that your column names are self-explanatory. This allows for easier maintenance of your data source setup within Test Manager.
- If you use multiple worksheets, use consistent column names across the worksheets. This makes it easier to apply filters when selecting columns for your data source setup.
- Keep in mind that you'll want to use certain columns as key columns. Key columns allow you to maintain your data source file while Test Manager remains able to identify specific data rows by the value in the key column, despite changes in row order. Values within a key column should be unique.
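The value of key columns can be illustrated with a short sketch: by indexing rows on the key column rather than on row position, changed rows remain identifiable even after the worksheet is reordered. The helper below is purely illustrative (hypothetical names, CSV text standing in for an Excel worksheet):

```python
import csv
import io

# Hypothetical worksheet exported as CSV; 'TestId' serves as the key column.
data_v1 = "TestId,Input,Expected\nT1,2,4\nT2,3,9\n"
data_v2 = "TestId,Input,Expected\nT2,3,9\nT1,2,5\n"   # rows reordered, T1 edited


def by_key(text, key):
    """Index data rows by the key column so rows stay identifiable
    even when their order in the worksheet changes."""
    return {row[key]: row for row in csv.DictReader(io.StringIO(text))}


old, new = by_key(data_v1, "TestId"), by_key(data_v2, "TestId")
changed = [k for k in old if old[k] != new.get(k)]
print(changed)   # despite the reordering, only T1 is reported as changed
```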
Note: Test Manager ignores cell formatting in your data source. For example, if you formatted date cells in an Excel worksheet to display the date in a certain way, Test Manager will ignore this setting and import any date values in the base format "YYYY.MM.DD HH:MM:SS.M". Related Concepts SilkTest Test Definitions Test Plan Management Managing Test Plans Related Procedures Managing Test Plans - Quick Start Task Working with Data-Driven Tests Related Reference Test Plan Data Set tab
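The base format quoted above can be parsed as follows. This is an illustrative sketch that assumes the trailing ".M" is a fractional-seconds digit; the helper name is hypothetical:

```python
from datetime import datetime


def parse_tm_date(value):
    """Parse a value in the base format 'YYYY.MM.DD HH:MM:SS.M'
    (the fractional part is assumed to be tenths of a second)."""
    return datetime.strptime(value, "%Y.%m.%d %H:%M:%S.%f")


d = parse_tm_date("2009.06.15 14:30:00.5")
print(d.year, d.month, d.day, d.microsecond)
```

Note that Python's %f directive accepts one to six fractional digits, so the single-digit tenths place parses as 500000 microseconds.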
Success Conditions
One or more success conditions can be assigned to each test node or suite node in the test plan of Test Manager. If a success condition is not met during the execution of a test definition it is assigned to, the test definition execution is marked as failed. For a test package, all success conditions except the execution time-out are disabled and hidden. The success conditions table shows the names of all success conditions that have been configured for a selected test definition. This table can be found at Test Manager > Test Plan > Properties > <Test Plan Tree Node>. A success condition is only evaluated when it is active. To activate and deactivate success conditions, see the related Editing Success Conditions procedure. The available types of success conditions differ depending on the test definition type. All currently available success conditions in Test Manager are listed below:

- Errors Allowed (active by default) - Maximal number of errors allowed for the test.
- Warnings Allowed - Maximal number of warnings allowed for the test.
- Execution Time-Out [s] - Maximal time-out allowed for the test, in seconds.
- Page Time: Avg. Page Time [s] - Maximal allowed average time to load a page.
- Page Time: Max. Page Time [s] - Maximal allowed maximum time to load a page.
- Transaction Response Time: Avg. Trans(Busy)ok [s] - Maximal allowed average response time for a transaction in the test.
- Transaction Response Time: Max. Trans(Busy)ok [s] - Maximal allowed maximum response time for a transaction in the test.

Inheritance of success conditions is similar to inheritance of properties: success conditions that are assigned to a parent node are inherited throughout all sub-folders and child test definitions. Related Concepts Test Plan Properties tab Related Procedures Editing Success Conditions
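Conceptually, evaluating success conditions amounts to checking each active condition's limit against the measured value and failing the run if any limit is exceeded. The following sketch is illustrative only (the data layout and function name are hypothetical, not Test Manager's implementation):

```python
def evaluate(conditions, measured):
    """Return the names of active success conditions whose limits
    were exceeded; an empty list means the run passes."""
    failed = []
    for name, (limit, active) in conditions.items():
        if active and measured.get(name, 0) > limit:
            failed.append(name)
    return failed


# Condition names taken from the table above; limits are example values.
conditions = {
    "Errors Allowed": (0, True),          # active by default
    "Warnings Allowed": (5, False),       # inactive: not evaluated
    "Execution Time-Out [s]": (600, True),
}
measured = {
    "Errors Allowed": 2,                  # exceeds the limit -> fails
    "Warnings Allowed": 9,                # exceeded, but inactive
    "Execution Time-Out [s]": 120,        # within the limit
}
print(evaluate(conditions, measured))     # only the active, exceeded condition
```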
Evaluated values:
parameterA = aaa
Parameter Notations
The following parameter notations are supported: For all test definitions:
${<parameter>}
All characters are allowed for parameter names, except $, {, }, and #. Deprecated notation for manual test definitions:
#<parameter>#
The following characters are allowed for parameter names: 0-9, a-z, A-Z, and _. Additional notation for SilkTest test definitions:
$<parameter>
The following characters are allowed for parameter names: 0-9, a-z, A-Z, and _.
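A minimal sketch of how the ${<parameter>} notation can be resolved against a set of parameter values using a regular expression. The class name and the sample parameter are illustrative only; this is not the Test Manager implementation.

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

class ParameterResolver {
    // Matches the ${<parameter>} notation; every character except $, {, } and # may appear in the name.
    private static final Pattern PARAM = Pattern.compile("\\$\\{([^${}#]+)\\}");

    static String resolve(String text, Map<String, String> values) {
        Matcher m = PARAM.matcher(text);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            // Leave unknown parameters untouched so unresolved placeholders stay visible.
            String value = values.getOrDefault(m.group(1), m.group(0));
            m.appendReplacement(sb, Matcher.quoteReplacement(value));
        }
        m.appendTail(sb);
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(resolve("login as ${user}", Map.of("user", "admin")));
    }
}
```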
Test Packages
Test packages provide support for the structure of third-party test types in SilkCentral Test Manager, and consist of a package root as well as an arbitrary hierarchy of suite nodes and test nodes. Test packages also provide users with detailed information about a test execution run. Test packages, suite nodes, and test nodes can be individually assigned, along with their issues and attachments, to requirements. This functionality is similar to the functionality of every other test definition. After a third-party test definition is converted into a test package, all tests contained in the package can be run individually. Test nodes and suite nodes contained in a test package are provided with an additional property, the External ID. An advantage of test packages is that the structure can be maintained automatically with every test execution. The structure of a test package can be updated according to the results of its runs. The file <Test Manager installation folder>\wwwroot\silkroot\xsl\output.xsd contains an XML schema for the structure of the output XML files of test packages. Test packages enable all functionalities of the individual tests, with the following exceptions:
Test containers that contain test packages cannot be linked. Test packages cannot be data-driven because they do not possess data-driven properties. All success conditions except the execution time-out are disabled and hidden for test package nodes.
Note: SilkPerformer test definitions, SilkTest test definitions, and manual test definitions cannot be converted to test packages, as the structure of these tests is supported in Test Manager by default. Related Concepts Test Plan Management Managing Test Plans Usage of External IDs Related Procedures Managing Test Plans - Quick Start Task Finding and Replacing Test Definition Properties Editing Test Definitions Creating a Test Package Related Reference Test Plan Unit Interface Test Plan Contents Tab Multi-Select Functionality for Test Plan Elements
The annotation can be used in a JUnit test to annotate classes and test methods as shown:
import static org.junit.Assert.*;
import org.junit.Test;
import com.borland.runner.ExternalId;

@ExternalId(externalId="JUnit4test")
public class JUnit4test {
    @Test
    @ExternalId(externalId="MyExtId1")
    public void test1() { ... }

    @Test
    @ExternalId(externalId="MyExtId2")
    public void test2() { ... }
}
Be aware that using External IDs with the JUnit runner 'org.junit.runners.Parameterized' is not supported for test methods, because the External ID is not unique for repeated runs of a method with different parameters. As a workaround, an External ID can be specified at the class level, but must be omitted at the method level. An example follows:
@RunWith(Parameterized.class)
@ExternalId(externalId="parameterizedWithExtId")
public class TestCaseParameterizedWithExternalId {
    @Parameters
    public static Collection<Object[]> parameterFeeder() {
        return Arrays.asList(new Object[][] {
            { "param_name1", "param_value1" }, // set of parameters per run, type matching constructor must exist!
            { "param_name3", "param_value3" },
            { "param_name2", "param_value2" },
        });
    }

    private String paramName;
    private String paramValue;

    public TestCaseParameterizedWithExternalId(String paramName, String paramValue) {
        this.paramName = paramName;
        this.paramValue = paramValue;
    }

    @Test
    public void testWithParams() {
        System.out.println(String.format("run with parameter: name='%s', value='%s'", paramName, paramValue));
    }
}
Note: The setting of the External ID for a JUnit test is only possible for tests using JUnit 4.4 or higher. Related Concepts Test Plan Management Managing Test Plans Test Packages Related Procedures Managing Test Plans - Quick Start Task Finding and Replacing Test Definition Properties Editing Test Definitions Creating a Test Package Related Reference Test Plan Unit Interface Test Plan Contents Tab Multi-Select Functionality for Test Plan Elements
Manual Tests
This section explains manual tests in Test Manager. In This Section Converting Manual Tests to Automated Tests Convert a manual test definition to an automated test of any of the supported automated test types. Using External Tools to Create Manual Tests Test Manager's open interface allows you to create manual test definitions outside of Test Manager's user interface. Test Definitions in the Manual Testing Client While in Edit mode, the SilkCentral Manual Testing Client offers a full range of test definition editing functionality, including the addition, reordering, and removal of test steps and the insertion of custom step properties.
- Name
- Description
- Assigned requirements
- Assigned execution definitions
- Assigned issues
- Attachments
- Test steps
Related Concepts Test Plan Management Managing Test Plans Related Procedures Managing Test Plans - Quick Start Task Converting Manual Test Definitions to Automated Tests Related Reference Test Plan Unit Interface
Parameters
Custom step property parameters can be inserted into test definition and test step descriptions. Parameters can be inserted into the Test Definition Description, Step Description, and Expected Result fields. In normal mode, parameter values are resolved (their parsed values are displayed in place of the parameters themselves). In Edit mode parameters are not resolved; the parameters themselves are displayed. When in Edit mode, using the Parameters list box on the Description tab toolbar, you can select preconfigured Test Manager parameters for insertion.
Related Concepts Test Definition Parameters Manual Test Definitions Test Definition Execution Tour of the Manual Testing Client UI Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Editing Test Definitions Within the Manual Testing Client Executing Test Definitions
Test script - The test script (.t, .g.t) is defined relative to the test container's root node in the source control profile. This setting is required for all SilkTest test definitions.

Testcase - The test case can be selected from a list box or entered manually. If the test definition is not defined as data driven, the test case is required. If the custom test case field is already populated, the SilkTest test definition was automatically created (using the export functionality within SilkTest). If the custom field is used for specifying the test case, the test case name can be terminated by parentheses "()". Between the parentheses, test data may be specified (defined test data can also include parameters). Note that this overrides the values of the Test data property (see below).
Test data - Specification of test data is optional. If several arguments are passed to SilkTest, they have to be separated by a comma (,). If a String argument is passed to SilkTest, the argument must be enclosed in quotation marks ("). When test data is more complex, it is recommended that you use parameters in the test data, for example ${ParameterName}. Parameters are replaced automatically within test definition executions.
Data driven - When a SilkTest test requires input data from an external data source, this flag must be enabled. The default execution mode for data-driven tests is plan-based. If script-based execution mode is to be used for a data-driven test, change the DataDrivenScriptMode setting in the SilkTest element of SccExecServerBootConf.xml.

Option set - Specification of an option set file is optional. By default, Test Manager closes all open SilkTest option set files. To specify an option set file, specify the file name relative to the test container's root node in the source control profile.
Related Concepts Test Plan Management Managing Test Plans Related Procedures Managing Test Plans - Quick Start Task Creating Test Definitions Editing SilkPerformer Tests Adding Test Containers Related Reference Test Plan Unit Interface
Test Definitions
This section explains certain test definition types and Test Manager's Upload Manager. In This Section Upload Manager SilkCentral's Upload Manager offers a convenient means of uploading files (typically test scripts) to the SilkCentral file pool. Windows Script Host Tests Windows Script Host (WSH) is part of the Windows platform and creates an environment for hosting scripts.
Upload Manager
SilkCentral's Upload Manager offers a convenient means of uploading files (typically test scripts) to the SilkCentral file pool, where they are accessible to Test Manager. For information about uploading files to Issue Manager, please see the SilkCentral Issue Manager Help. With Issue Manager's integration with TechSmith's SnagIt screen capture utility, Issue Manager users can easily capture screen images of error conditions and upload them to Issue Manager, where they can be attached to existing issues or serve as the basis for new issues. Upload Manager can be accessed in one of two ways:
- By starting its executable from the program directory to which it has been installed (for example, C:\Program Files\Borland\SC Test Manager <version>\Upload Manager\UploadManager.exe).
- (Issue Manager users only) Using a hotkey keyboard combination (for example, Ctrl+Print Screen). This option is enabled when the SnagIt screen capture utility is configured for use with Issue Manager. See Issue Manager documentation for details.
Note: For information about uploading files to Issue Manager, please see Issue Manager documentation.
Command Line Options
Upload Manager's settings can be defined through command line options. Defining command line options optimizes SnagIt's integration with SilkCentral by automating formatting and configuration operations that you would otherwise have to execute manually with each file upload. Note: Settings configured through command line options override any conflicting settings that may have been manually configured using the SnagIt GUI. Command line options use the following format: UploadManager.exe {options} {UploadFileNameList} The order in which option settings are listed does not affect application behavior. Option parsing is not case-sensitive. Option parameters must not be separated by blank spaces. The following command line options are supported by Upload Manager:

-HOSTNAME:<HostName> The target server. Note that the server name is not preceded by a protocol. For example, -HOSTNAME:tm.borland.com

-USERNAME:<UserName> The user name to be used for server login. For example, -USERNAME:admin

-PASSWORD:<Password> The password to be used for server login. For example, -PASSWORD:secret

-PORT:<PortNumber> The targeted server port. The default http:// port is 80. The default https:// port is 443. The default test server port is 19120. For example, -PORT:19120

-SECURE:<0 or 1> For https://myhost:myport/ connections: 0 sets a standard HTTP connection; 1 sets a secure HTTPS connection. For example, -SECURE:1
-CONFIG:<ConfigurationId> The configuration ID of the targeted server, used for uploading files to Issue Manager and SilkCentral: 2 is used for uploading files to Issue Manager; 3 is used for uploading files to the SilkCentral server file pool. For example, -CONFIG:2

-PROJECT:<ProjectId> The target project. 0 is the default demo project in Issue Manager. To identify an Issue Manager project's ID, pass your cursor over an active project name on the Issue Manager Projects page, then look for the following string in your browser's status bar: imPrj=<project ID>. For example, -PROJECT:0

-DEFID:<DefectId> ID of a specific issue in Issue Manager to which a file is to be attached; 0 is used to create a new issue. For example, -DEFID:156 attaches the file to issue #156.

-DESC:<AttachmentFileDescription> Description of the attached file (Issue Manager only). For example, -DESC:Screen Shot Attachment

-AUTOCLOSE Checks the Upload Manager dialog check box that instructs Upload Manager to close after it successfully completes an upload.

-AUTOUPLOAD:<WizardStepIndex> Instructs Upload Manager to attempt automatic file upload up through the specified wizard step (WizardStepIndex); Upload Manager then prompts the user for manual input at the following wizard step. Automatic file upload stops at any step within which a configuration failure is detected. For example, -AUTOUPLOAD:10

-VERBOSE Causes a dialog box to be displayed if a dependent component is missing (for example, if a SilkPerformer compiler is unavailable when a SilkPerformer project is to be uploaded to SilkCentral).

<UploadFileNameList> Any parameter passed to Upload Manager that is not preceded by a dash (-) is recognized as an absolute file path. Multiple paths can be used when uploading multiple files to the SilkCentral file pool. For example, C:\TEMP\MyScreenShot.jpg
Related Concepts Test Plan Management Managing Test Plans Related Procedures Managing Test Plans - Quick Start Task Using Upload Manager Related Reference Test Plan Unit Interface
cscript <somescript> ...where <somescript> is the path to a script of your choice that is available on your execution server.
This is exactly what SilkCentral Test Manager calls when executing a WSH test definition on an execution server. If the script executes, the scripting engine has been registered successfully. The following scripting languages are known to be WSH compatible:
- Perl (extension .pls)
- Python (extensions .py, .pyw)
- REXX
- TCL (extension .tcl)
Switches
For the Switches test definition property, the following settings can be entered and passed to cscript.exe during test definition execution:

//B Batch mode; suppresses all non-command-line console UI requests from the script. It is recommended that you use this option to prevent a script from waiting for user input during unattended executions on the execution server.

//U Unicode is used for redirected I/O from the console (recommended).

//T:nn Time-out, in seconds: the maximum time the script can run (default = no limit). This option prevents excessive execution of scripts by setting a timer. When execution time exceeds the specified value, cscript interrupts the script engine using the IActiveScript::InterruptThread method and terminates the process. There is a callback hook: if the time-out is invoked, the OnTimeOut function is called to permit cleanup. Although it is possible to create infinite loops using this feature, it is more useful than harmful.

//logo Displays an execution banner at execution time that is visible at the beginning of the log.txt log file. This is the default setting.

//nologo Prevents display of the execution banner at execution time.

//D Enables active debugging.

//E:engine Uses the specified engine to execute the script.

//Job:xxxx Executes a WSF job.

//X Executes the script in the debugger.
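For illustration, several switches can be combined on a single cscript command line. The following sketch merely assembles and prints such a command; the script name is hypothetical, and this is not how Test Manager itself builds the invocation.

```java
import java.util.ArrayList;
import java.util.List;

class CscriptCommand {
    // Assemble a cscript command line from the switches discussed above and a script path.
    static List<String> buildCommand(String script, String... switches) {
        List<String> cmd = new ArrayList<>();
        cmd.add("cscript");
        for (String s : switches) cmd.add(s);
        cmd.add(script);
        return cmd;
    }

    public static void main(String[] args) {
        // //B avoids blocking on user input; //T:60 terminates the script after 60 seconds.
        System.out.println(String.join(" ", buildCommand("mytest.js", "//B", "//T:60", "//nologo")));
    }
}
```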
Log Information
Any information that a script writes to the WSH standard output goes into the text file log.txt, which resides in the current working directory. This file is stored in the database and can be viewed, as it is included in the file list of the test definition execution. For example, a script can print log information with WScript.Echo("some log information").
Structure of Output.xml
The XML structure begins with an element ResultElement that defines an attribute named TestItem, which specifies the name of the ResultElement. The ResultElement must contain an element named ErrorCount, optionally an element named WarningCount, and a list of Incident elements. The ErrorCount and WarningCount elements must contain a positive number or zero. In SilkCentral Test Manager, the ErrorCount and WarningCount of the top-level ResultElement are used for evaluating success conditions, which determine whether a test has passed or failed. The Incident element represents an event that happened during the execution of the WSH test. Message and Severity are shown in the messages list of test definition executions in SilkCentral Test Manager's GUI. An Incident element must contain a Message and a Severity element; the severity values used in the examples below are Error and Warning. In addition to ErrorCount and WarningCount, the ResultElement may contain the following optional elements:

- FailureCount (treated the same way as error count)
- RunCount (if a test is run multiple times)
- Timer (e.g., for the duration of the test)
- WasSuccess (compatibility with NUnit result files)
- Asserts (compatibility with NUnit result files)

The Incident element may contain a list of Detail elements. The Detail element represents detailed information about an Incident. It must define a TestName element and an Info element. The TestName is used to give detailed information about where the Incident happened. The Info element holds detailed information about the Incident (e.g., a stack trace). Note: Up through SilkCentral Test Manager 8.1, the value of the Info element had to be URL encoded (ISO-8859-1). Since version 8.1.1, URL encoding is no longer allowed.
function dumpOutput(dumpFile) {
    dumpFile.WriteLine("<ResultElement TestItem=\"WshOutputTest\">");
    dumpFile.WriteLine(" <ErrorCount>1</ErrorCount>");
    dumpFile.WriteLine(" <WarningCount>1</WarningCount>");
    dumpFile.WriteLine(" <Incident>");
    dumpFile.WriteLine(" <Message>some unexpected result</Message>");
    dumpFile.WriteLine(" <Severity>Error</Severity>");
    dumpFile.WriteLine(" <Detail>");
    dumpFile.WriteLine(" <TestName>function main()</TestName>");
    dumpFile.WriteLine(" <Info>some additional info; eg. stacktrace</Info>");
    dumpFile.WriteLine(" </Detail>");
    dumpFile.WriteLine(" </Incident>");
    dumpFile.WriteLine(" <Incident>");
    dumpFile.WriteLine(" <Message>some warning message</Message>");
    dumpFile.WriteLine(" <Severity>Warning</Severity>");
    dumpFile.WriteLine(" <Detail>");
    dumpFile.WriteLine(" <TestName>function main()</TestName>");
    dumpFile.WriteLine(" <Info>some additional info; eg. stacktrace</Info>");
    dumpFile.WriteLine(" </Detail>");
    dumpFile.WriteLine(" </Incident>");
    dumpFile.WriteLine("</ResultElement>");
}

function main() {
    var outFile;
    var fso;
    fso = WScript.CreateObject("Scripting.FileSystemObject");
    outFile = fso.CreateTextFile("output.xml", true, true);
    outFile.WriteLine("<?xml version=\"1.0\" encoding=\"UTF-16\"?>");
    dumpOutput(outFile);
    outFile.Close();
    WScript.Echo("Test is completed");
}

main();
WScript.Quit(0);
The same output.xml can also be written in VBScript:

WScript.Echo "starting"

Dim outFile
Dim errCnt
Dim warningCnt

outFile = "output.xml"
errCnt = 1 ' retrieve that from your test results
warningCnt = 1 ' retrieve that from your test results

Set FSO = CreateObject("Scripting.FileSystemObject")
Set oTX = FSO.OpenTextFile(outFile, 2, True, -1) ' args: file, 8=append/2=overwrite, create, -1=Unicode

oTX.WriteLine("<?xml version=""1.0"" encoding=""UTF-16""?>")
oTX.WriteLine("<ResultElement TestItem=""PerlTest"">")
oTX.WriteLine(" <ErrorCount>" & errCnt & "</ErrorCount>")
oTX.WriteLine(" <WarningCount>" & warningCnt & "</WarningCount>")
oTX.WriteLine(" <Incident>")
oTX.WriteLine(" <Message>some unexpected result</Message>")
oTX.WriteLine(" <Severity>Error</Severity>")
oTX.WriteLine(" <Detail>")
oTX.WriteLine(" <TestName>function main()</TestName>")
oTX.WriteLine(" <Info>some additional info; eg. stacktrace</Info>")
oTX.WriteLine(" </Detail>")
oTX.WriteLine(" </Incident>")
oTX.WriteLine(" <Incident>")
oTX.WriteLine(" <Message>some warning message</Message>")
oTX.WriteLine(" <Severity>Warning</Severity>")
oTX.WriteLine(" <Detail>")
oTX.WriteLine(" <TestName>function main()</TestName>")
oTX.WriteLine(" <Info>some additional info; eg. stacktrace</Info>")
oTX.WriteLine(" </Detail>")
oTX.WriteLine(" </Incident>")
oTX.WriteLine("</ResultElement>")
Related Concepts Test Plan Management Test Definition Parameters Managing Test Plans Related Procedures Managing Test Plans - Quick Start Task Creating Test Definitions Editing Windows Scripting Host Tests Adding Test Containers Related Reference Test Plan Unit Interface
Execution Definitions
Execution definitions are collections of assigned test definitions that are stored in a single test container. Execution definitions can be run at configurable schedules and deployed on specified execution servers. The process of adding and editing execution definitions is the same for both automated execution definitions and manual execution definitions.
When an execution definition's keywords match multiple virtual execution servers, the first matching virtual execution server that is identified is selected.
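A sketch of first-match server selection by keywords, under the assumption that an execution server qualifies when it provides every keyword assigned to the execution definition. The server names and keywords are invented for the example; this is not Test Manager's actual matching code.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Set;

class KeywordMatching {
    // Returns the first server (in registration order) whose keyword set
    // contains all keywords required by the execution definition.
    static String firstMatchingServer(Set<String> definitionKeywords, Map<String, Set<String>> servers) {
        for (Map.Entry<String, Set<String>> e : servers.entrySet()) {
            if (e.getValue().containsAll(definitionKeywords)) return e.getKey();
        }
        return null; // no match: the execution definition would not be executed
    }

    public static void main(String[] args) {
        Map<String, Set<String>> servers = new LinkedHashMap<>();
        servers.put("winserver1", Set.of("Windows", "IE7"));
        servers.put("linuxserver1", Set.of("Linux", "Firefox"));
        System.out.println(firstMatchingServer(Set.of("Linux"), servers));
    }
}
```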
Folder Execution
Execution definitions can be combined into execution folders, where a folder can include execution subfolders and execution definitions. The options for an execution definition execution are also available for an execution folder execution. When executing a folder, the contained subfolders and execution definitions are treated as follows:
The Relation of Execution and Keywords
The execution of a contained execution definition or subfolder depends on the keywords of the executed folder and the keywords of the contained execution definition or subfolder:

- No keywords on the folder, no keywords on the execution definition/subfolder: execution definitions without keywords obtain status Not Executed after execution.
- No keywords on the folder, keywords on the execution definition/subfolder: execution servers are assigned based on the execution definition/subfolder keywords.
- Keywords on the folder, no keywords on the execution definition/subfolder: execution servers are assigned based on the folder keywords.
- Keywords on both the folder and the execution definition/subfolder: execution servers are assigned based on the folder keywords.
Note: When a folder is executed manually and there are no keywords assigned, or no execution server exists for the assigned keywords, the default execution server is used for execution. If the default execution server is not available, these execution definitions are marked as "Not Executed".
A test definition gets its status from the result of the latest test execution run. If you manually change the status of the latest test execution run, the test definition status changes also.
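The status derivation can be sketched as follows, assuming runs are ordered newest first, runs with status Not Executed are ignored, and a definition without any usable run falls back to Not Scheduled, as if it were newly created. The enum and method are illustrative, not Test Manager API.

```java
import java.util.List;

class TestDefinitionStatus {
    enum Status { PASSED, FAILED, NOT_EXECUTED, NOT_SCHEDULED }

    // Derive the test definition status from its remaining runs, newest first.
    static Status derive(List<Status> runsNewestFirst) {
        for (Status s : runsNewestFirst) {
            if (s == Status.PASSED || s == Status.FAILED) return s; // latest usable run wins
        }
        return Status.NOT_SCHEDULED; // no usable run left
    }

    public static void main(String[] args) {
        // Latest run failed, so the definition is FAILED.
        System.out.println(derive(List.of(Status.FAILED, Status.NOT_EXECUTED, Status.PASSED)));
        // After deleting the failed run, the latest usable run is the older PASSED one.
        System.out.println(derive(List.of(Status.NOT_EXECUTED, Status.PASSED)));
        // With no runs at all, the definition falls back to NOT_SCHEDULED.
        System.out.println(derive(List.of()));
    }
}
```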
Note: If the latest test execution run is deleted, the status of the test definition resets to the status of the latest existing test execution run. Only test execution runs with status Passed or Failed are used to reset the test definition status, test execution runs with status Not Executed are ignored. Note: If the deleted test execution run was the only existing test execution run, the status of the test definition is set to Not Scheduled, as if the test definition was newly created. Related Concepts Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Assigning Keywords to Execution Definitions Executing Test Definitions Related Reference Execution Unit Interface
Execution Definition Name: Name of the execution definition.
Execution Definition ID: Unique identifier of the execution definition.
Execution Definition Run ID: Identifier of the execution definition run.
Start Time: Time the run was started.
Duration: Duration of the run.
Execution Server: Execution server assigned to the execution definition.
Warnings/Errors: Number of warnings and errors generated during the run.
Status: Status of the execution definition after the run.
Version/Build: Version and build of the product specified for the run.
SilkTest AUT Host Name: Name of the SilkTest AUT (Application Under Test) host.
Setup Test Definition: Test definition that prepared the testing environment in anticipation of the test. Click the name of the test definition to view or edit it. Click the ID of the test definition run to open the Test Definition Run Results dialog box.
Cleanup Test Definition: Test definition that restored the testing environment to its original state following the test. Click the name of the test definition to view or edit it. Click the ID of the test definition run to open the Test Definition Run Results dialog box.
The Execution Definition Run Results dialog provides additional information about the files included and the messages generated during the execution definition run. It also lists all the assigned test definitions for the execution definition. For manual tests click Manual Test Results to get a read-only version of the current runs page, with detailed information on the manual test. Uncheck the Hide passed test definition runs check box to show all test definitions. The Hide passed test definition runs check box is checked by default to show only the not passed test definitions. The Assigned Test Definitions section lists all test definitions that are assigned to this execution definition. Click on the name of a test definition to view or edit it, or click on the Run ID of a test definition to open the Test Definition Run Results dialog box. Related Concepts Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Creating Test Definitions Working with Data-Driven Tests Executing Test Definitions Related Reference Test Definition Run Results Dialog Execution Runs Tab Current Run Page
- No schedule (None)
- Use a pre-defined schedule (Global)
- Define a custom schedule (Custom)
Note: Schedules can be defined for entire folders as well as individual execution definitions. If a schedule is defined for a folder, all execution definitions that are included in the selected folder will be executed at the specified schedule. Execution definitions with no keywords assigned get the status "Not Executed" when executed in a schedule.
Schedule Exclusions
Exclusions enable you to define weekdays and time-of-day intervals during which test definitions are not to be executed, regardless of configured schedules. For example, you may not want tests to be executed on weekends.
Definite Runs
Definite runs enable you to define times at which test definitions will be executed regardless of configured schedules. Related Concepts Test Definition Execution Execution Definitions Related Procedures Managing Test Executions - Quick Start Task Creating a Custom Schedule for an Execution Definition Adding Definite Runs Adding Exclusions Adding Execution Definitions Executing Test Definitions Related Reference Execution Schedule tab
- Not Executed
- Passed
- Failed
- Unsupported
- Unresolved
- In Progress (test definitions only)
The calculation of the test definition status is based on the following rules, where the rules with lower numbers override the rules with higher numbers:
1. If the status of the test definition is Not Executed, and you change the status of a test step, the test definition status is set to In Progress.
2. As long as there is at least one step with status Not Executed, the test definition status remains In Progress.
3. If there is at least one step with status Failed or Unresolved, the test definition status is set to Failed.
4. If the status of every test step is Passed or Unsupported, the test definition status is set to Passed.
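A sketch of the four rules, with lower-numbered rules taking precedence over higher-numbered ones. The enum and method are illustrative only and not part of Test Manager.

```java
import java.util.List;

class ManualTestStatus {
    enum Status { NOT_EXECUTED, PASSED, FAILED, UNSUPPORTED, UNRESOLVED, IN_PROGRESS }

    static Status definitionStatus(List<Status> steps) {
        // Nothing executed yet: the definition stays NOT_EXECUTED.
        if (steps.stream().allMatch(s -> s == Status.NOT_EXECUTED)) return Status.NOT_EXECUTED;
        // Rules 1 and 2: any remaining NOT_EXECUTED step keeps the definition IN_PROGRESS.
        if (steps.stream().anyMatch(s -> s == Status.NOT_EXECUTED)) return Status.IN_PROGRESS;
        // Rule 3: any FAILED or UNRESOLVED step fails the definition.
        if (steps.stream().anyMatch(s -> s == Status.FAILED || s == Status.UNRESOLVED)) return Status.FAILED;
        // Rule 4: every step is PASSED or UNSUPPORTED.
        return Status.PASSED;
    }

    public static void main(String[] args) {
        System.out.println(definitionStatus(List.of(Status.NOT_EXECUTED, Status.NOT_EXECUTED)));
        System.out.println(definitionStatus(List.of(Status.PASSED, Status.NOT_EXECUTED)));
        System.out.println(definitionStatus(List.of(Status.PASSED, Status.FAILED)));
        System.out.println(definitionStatus(List.of(Status.PASSED, Status.UNSUPPORTED)));
    }
}
```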
Related Procedures Executing Manual Tests in the Current Run Page Related Reference Current Run Page
Basic UI Structure
Manual Testing Client's GUI features eight main areas:
Menu Bar
The menu bar provides editing options for test definitions and views and access to Help. You can use the menu bar to edit user, appearance, or test settings, to navigate between execution packages, or to access Help. The available menus are listed in the following:
File: Provides the same functionality as the workflow bar, plus additional functionality for package import and export, changing users, switching the Manual Testing Client between online and offline mode, and storing executed package results to Test Manager.

Edit: Set the execution status of an execution package or test definition, add a result file to a test package or test definition, set the build number for an execution package, or edit the code analysis settings of an execution package. The Edit menu contains the following items:
- Set as <Status>: Set the execution status of the execution package or test definition to the selected status.
- Add Result File...: Add a result file to an execution package or test definition.
- Edit Build Number...: Select the build number for an execution package.
- Edit Code Analysis Settings...: Specify hostnames to include in code coverage runs for an execution package. The hostnames must be separated by commas. For example: labmachine1, 192.168.0.1:19129.
- Delete: Delete an execution package. You can delete only execution packages with completed runs.

Go: Navigate between the available execution packages and test definitions.

Window: Edit the view and the preferences of the Manual Testing Client. The Window menu contains the following items:
- Show View: Activate and deactivate the available views in the Manual Testing Client's Workspace.
- Reset Perspective...: Reset the current Manual Testing Client perspective to the default state.
- Preferences...: Set the preferences for the Manual Testing Client.

Help: Access the documentation and About page of the Manual Testing Client.
Workflow Bar
The workflow bar gives you quick access to the basic functions you can perform with the Manual Testing Client. The following buttons are available in the workflow bar:
- Download the execution packages from SilkCentral Test Manager.
- Execute a test in the Manual Testing Client.
- Stop the execution of a test before the test is fully executed.
- Upload the execution packages to SilkCentral Test Manager.
Inbox
The Inbox lists all execution packages that have been downloaded to the Manual Testing Client for manual execution. Multiple execution packages can be selected within the Inbox window using standard Windows keyboard shortcuts. The Inbox includes the following properties for the selected test definition:
Package Name: Name of the execution package that has been downloaded from Test Manager.
ID: Execution package number that has been generated for this execution package.
Status: Status of the execution package (available values include Not Executed, Passed, Failed, Unresolved, and Unsupported). Execution package status values cannot be edited. Overall package status is determined by the composition of the test definition statuses (and, by extension, the statuses of the test steps) that are contained within a package. For example, overall package status remains Not Executed as long as a test step of one of the contained test definitions has a status of Not Executed. So, even if some of a test definition's steps have passed, the overall status of the package will remain Not Executed until the package is finished and all unexecuted test definitions are assigned a status on the Finish Run dialog. The overall status is considered Passed when one or more test steps or test definitions in a package are Passed and all unexecuted steps and test definitions are resolved through the Finish Run dialog.
Priority: Priority of the execution package.
Keywords: All keywords that are assigned to the execution package.
Started At: When testing of the execution package began.
Project: Name of the Test Manager project from which the execution package was derived.
Version: Product version from which the execution package was derived.
Build: Product build from which the execution package was derived.
Execution Path: File path where this execution definition resides in Test Manager's Executions Tree.
Completed Runs
The Completed Runs tab lists all execution packages for which testing is complete. The Completed Runs tab includes the following properties for the selected execution package:
(Status icon): This column offers a status icon that indicates the upload status of corresponding execution packages. A red arrow icon indicates that a package's results have not yet been uploaded. A checkmark icon over a faint arrow icon indicates that a package's results have already been uploaded to the server. Double-click this icon to open the Execute Test dialog for the first test definition of this execution package.
Package Name: Name of the execution package that has been downloaded from Test Manager.
ID: Execution package number that has been generated for this execution package.
Status: Status of the execution package. Status values cannot be edited on the Completed Runs tab. Overall package status is determined by the composition of the test definition statuses (and, by extension, the statuses of the test steps) that are contained within a package. For example, overall package status remains Not Executed as long as a test step of one of the contained test definitions has a status of Not Executed. So, even if some of a test definition's steps have passed, the overall status of the package will remain Not Executed until the package is finished and all unexecuted test definitions are assigned a status on the Finish Run dialog. The overall status is considered Passed when one or more test steps or test definitions in a package are Passed and all unexecuted steps and test definitions are resolved via the Finish Run dialog.
Priority: Priority of the execution package.
Keywords: All keywords that are assigned to the execution package.
Started At: When testing of the execution package began.
Finished At: When testing of the execution package ended.
Project: Name of the Test Manager project from which the execution package was derived.
Version: Product version from which the execution package was derived.
Build: Product build from which the execution package was derived.
Execution Path: File path where the execution definition resides.
Test Definitions
The Test Definitions tab includes all information related to the manual test definition that is selected above in the Inbox or Completed Runs tab. Multiple test definitions can be selected within the Test Definitions window using standard Windows keyboard shortcuts. To apply a status change to selected test definitions, right-click the selection and select a new status value from the context menu. The Test Definitions tab includes the following properties for each test definition:
# - Number that has been automatically generated for the test definition.
Name - Test definition name.
Status - Status of the test definition (available values include Not Executed, Passed, Failed, Unresolved, and Unsupported). This value can be changed by right-clicking the current value and selecting an alternative value from the context menu.
Last Status - Status that this test definition held before the current status.
Steps - Number of steps in the selected manual test definition.
Planned Time - Estimated time for completion of the test in [hh:mm:ss].
Used Time - Tracks the elapsed time (in [hh:mm:ss]) since the start of the test execution. This field can be manually edited (the timer stops during editing). After editing, the timer continues tracking time from the manually adjusted value.
Test Definition Path - File path where this test definition resides in Test Manager's Test Plan Tree.
Attachments
The Attachments tab lists any attachments related to the selected manual test definition. This tab is also available on the Execute Test dialog. When you have selected a test definition in the Test Definitions window, you can supplement the list of displayed attachments by selecting a value for Include attachments of: select Test Container/Folders to include all attachments from the selected test definition's test container or folder, or select Test Steps to include attachments from the test steps of the test definition. The Attachments tab includes the following properties for each attachment:
Name - Name of the attachment.
Type - Attachment file type.
Description - Description that has been created for the attachment (if any).
Test Definition Path - File path where this attachment's test definition resides in Test Manager's Test Plan Tree.
If the attachment is an image, you can use the Image Preview controls to view the attachment. Right-click the image, or click the buttons to the right of the window, to access the following commands: Show Actual Size, Scale to Fit, and Scale to Fit Keep Aspect Ratio. Click Open as Detached Window to open Image Preview in a separate window.
Result Files
The Result Files tab lists any result files that are related to the selected manual test definition. This tab is also available on the Execute Test dialog. The Result Files tab includes the following properties and controls for each result file:
Name - Name of the result file.
Source - File path where this result file's test definition resides in Test Manager's Test Plan Tree.
Add File - Click to browse to and select a new result file for upload to this test definition.
Paste Image - Click to paste an image from your computer's clipboard and attach the image to this test definition.
Remove - Click to remove the selected result file attachment from this test definition.
Image Preview - If the result file is an image, you can use the Image Preview controls to view the result file. Right-click the image, or use the buttons to the right of the window, to access the following commands: Show Actual Size, Scale to Fit, and Scale to Fit Keep Aspect Ratio. Click Open as Detached Window to open Image Preview in a separate window.
Issues
The Issues tab lists any issues related to the selected manual test definition. This tab is also available on the Execute Test dialog. The Issues tab includes the following properties for each issue:
Issue ID - ID that has been assigned to this issue.
Synopsis - Synopsis that has been written for this issue.
Status - Status of the issue.
External ID - Indicates if the issue is tracked by an external issue tracking system. If this issue is tracked by an external issue tracking system, and that issue has been assigned an ID, you can click the external ID number in this field to link directly to the issue in the external issue tracking system.
Created On - When the issue was created.
Created By - User who created the issue.
Outline
Shows the content tree of the selected execution package or the location of the selected test definition in the execution package.
Description
Shows the description of the selected execution package or test definition.
Status Bar
The status bar shows the current number and status of the execution packages and test definitions in the currently active view.
Online/Offline - Click to switch the Manual Testing Client between online and offline mode.
Related Concepts
Manual Testing Client
Test Definition Parameters
Test Definitions in the Manual Testing Client
Related Procedures
Using the Manual Testing Client
Editing Test Definitions Within the Manual Testing Client
Adding an Internal Issue with the Manual Testing Client
Related Reference
Execute Test Dialog Box
Related Concepts
Manual Test Definitions
Test Definition Execution
Execution Definitions
Tour of the Manual Testing Client UI
Related Procedures
Managing Test Executions - Quick Start Task
Executing Manual Tests
Enabling Code Analysis for SilkCentral Test Manager
Using the Manual Testing Client
Working with Manual Tests
Executing Test Definitions
Related Reference
Current Run Page
SilkTest Tests
This section explains how to execute test definitions in SilkTest.
In This Section
SilkTest Logs
RMS log files are used to log data for each test case as test runs progress.
SilkTest Time-out Settings
Information about configuring SilkTest time-out settings.
Automated Execution of Data-Driven SilkTest Testcases
Execution mode options for data-driven SilkTest testcases.
Automated Execution of SilkTest Test Definitions
Information about automatic execution of SilkTest tests.
Specifying Agent Under Test (AUT)
When a SilkTest agent cannot run on the same machine as the Test Manager execution server, the hostname and port should be specified.
SilkTest Logs
SilkTest's RMS log file is used to log data for each test case as test runs progress. Three types of data records are written to this file: status, memory, and user records. By monitoring this file, the RMS Remote Agent can determine the progress of each test run. You can write your own comments into the user records of the log file by executing the PrintToRMSLog 4Test function.
Examples:
PrintToRMSLog ("Error", "An intended error")
PrintToRMSLog ("Info", "testcase sleep1 started")
PrintToRMSLog ("Warning", "TestCase 1 started a second time")
Definition of the user function in rms.inc: PrintToRMSLog (STRING sMessageType, STRING sUserMessage) writes to the log file in the following format:
U|{sTestCaseName}|{sScriptName}|{sArgStr}|{sUserMessage}|{sMessageType}
Related Concepts
SilkTest Tests
Test Definition Execution
Related Procedures
Managing Test Executions - Quick Start Task
Executing Test Definitions
Related Reference
Execution Unit Interface
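The pipe-delimited user-record format above is easy to process in a script of your own. The following sketch (not part of Test Manager; the field names are simply taken from the format string above, and the sample record is invented) splits a U record into its components:

```python
def parse_rms_user_record(line):
    """Split an RMS user record (U|...) into a dict of its named fields."""
    fields = line.rstrip("\n").split("|")
    if fields[0] != "U" or len(fields) != 6:
        raise ValueError("not an RMS user record: %r" % line)
    keys = ("sTestCaseName", "sScriptName", "sArgStr", "sUserMessage", "sMessageType")
    return dict(zip(keys, fields[1:]))

# Hypothetical record as PrintToRMSLog("Info", "testcase sleep1 started") might write it:
record = parse_rms_user_record("U|sleep1|mytests.t||testcase sleep1 started|Info")
```

Note that sArgStr may be empty, as in the sample record, so the parser keeps empty fields rather than discarding them.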
Issue Management
The Issues unit helps you track the issues that are associated with the selected project. You can work with a detailed tabular listing of statistics (Document View) or a chart view (Issues View). Issues from both internally and externally configured issue databases are tracked. Statistics can be reported individually or cumulatively across tracking systems. Out-of-the-box support is offered for the following issue-tracking systems: SilkCentral Issue Manager, Borland StarTeam, and IBM Rational ClearQuest. See the SilkCentral Administration Module documentation for details on setting up these systems. Other external issue-tracking systems can be integrated through Test Manager's Java API and Web Service interface. Refer to the Test Manager API Help for details. Note: Issues are also tracked at the test-definition and test-container level in the Test Plan unit. New issues can be entered and associated with test definitions on the Issues tab of the Test Plan unit. New issues can also be entered from the Activities tab (Projects unit).
Document View
Test Manager's Document View offers an overview of the states of all project-related issues in the form of an issue-state statistics table. The Issues tree displays all issue-tracking systems and associated Test Manager products that have been configured for those systems. The internal tracking system that has been configured for Test Manager is called Internal. Note: Products are configured through Test Manager's Administration unit (Administration/Configuration/Products). See the SilkCentral Administration Module Help for information on configuring products. Current status statistics for the currently selected tree node are shown in the table on the right. The Date column shows the date/time of recent updates. Each row in the table shows the number of issues that have each column's respective status. For the project node and issue-tracking system nodes, statistics are accumulated values of the respective child nodes in the tree. Statistics are calculated by fetching information from the issue tracking systems. This function is performed periodically by the SilkCentral application server; by default, once each hour. The interval can be customized in TMAppServerHomeConf.xml by setting the minutes value in IssueStateUpdate/UpdateInterval. The application server must be restarted to activate changes and initiate the countdown for the first run.
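Assuming the element nesting in TMAppServerHomeConf.xml mirrors the setting path named above (IssueStateUpdate/UpdateInterval; the surrounding structure of the file is not shown here, so this is a sketch, not a verbatim excerpt), the relevant fragment might look like this:

```xml
<IssueStateUpdate>
  <!-- Interval, in minutes, between periodic fetches of issue-state statistics -->
  <UpdateInterval>60</UpdateInterval>
</IssueStateUpdate>
```

Remember to restart the application server after editing the file so the new interval takes effect.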
Issues View
Issues View provides historical information for issues in a chart view. The view reflects the status values that were retrieved from the tracking systems (both the external and the internal systems) each day. If the project node of the Issues tree is selected, statistics for all issue-tracking systems and all products are retrieved. When a product is selected in the Issues tree, statistics for only that product are retrieved. When a specific issue-tracking system is selected, statistics for only that system are displayed. Issues View includes:
Calendar - Enables you to define the time frame across which statistics are to be calculated.
Chart - Tracks issue status counts over the specified time frame.
Table - Shows the values reflected by the chart (for the past five days of the selected time frame only).
Note: The Issues tree displays only those external issue-tracking systems and products that have at least one issue assigned to them. The internal tracking system is always displayed.
Related Concepts
SilkCentral Issue Manager
Test Definition Execution
Upload Manager
Related Procedures
Managing Test Executions - Quick Start Task
Tracking Issues
Working with Issues
Executing Test Definitions
Related Reference
Issues Unit Interface
Execution Unit Interface
Calendar Tool
Project Management
This section explains how to manage projects in Test Manager. The Projects unit offers a high-level, test manager's view of all projects in your Test Manager installation. The Projects unit enables you to move between projects, see high-level project status details, and view current execution statistics.
In This Section
Build Information
Build information files contain version and build information that is used for execution runs.
Build Information Updates
Whenever a new build becomes available for testing on an execution server, build information must be updated.
Build Information
Build information files, which contain version and build information used for execution runs, are typically stored and searched for on the execution server that is executing a run. When a build information file is not found there, the file is searched for on the application server. This behavior is beneficial when you have several execution servers and need to use a single build information file across all execution servers: you only need to maintain a single build information file on the application server. Similarly, when no execution server has been assigned to a test definition (and also for manual tests), build information files are searched for on the application server. Test Manager is able to match up test results with build information and display test results for specific build numbers.
Related Concepts
Build Information Updates
Successful Test Management
Related Procedures
Managing a Successful Test
Related Reference
Projects Unit Interface
Build Information Updates
Whenever a new build becomes available for testing on an execution server, build information must be updated. This can be done in two ways:
Manually, by editing the file(s) each time a new build is installed.
Automatically, if you are using an automated build update process to update the build information file (for example, through VB Script).
Related Concepts
Build Information
Successful Test Management
Related Procedures
Managing a Successful Test
Related Reference
Projects Unit Interface
Report Generation
This section explains how to generate and view SilkCentral Test Manager reports.
In This Section
New Report Creation
This section explains how to create new reports with SilkCentral Test Manager.
Context-Sensitive Reports
The Requirements, Test Plan, and Execution units offer dynamically generated lists of reports that are specific to each unit.
Project Overview Report
The Project Overview Report contains a high-level overview of the status of the selected project.
Test Manager 8.0 Reports
Any reports created for a Test Manager 8.0 installation will appear in the Reports unit.
Requirements Reports
This section explains the requirements-related reports.
Test Plan Reports
This section explains the test-plan reports that ship with SilkCentral Test Manager.
Execution Reports
This section explains the execution reports that ship with SilkCentral Test Manager.
Code Coverage Reports
This section explains the code coverage reports that ship with SilkCentral Test Manager.
Performance Trend Reports
This section explains the performance trend reports that ship with SilkCentral Test Manager.
Issues Per Component Report
Test Manager offers one issues-related report.
Code-Change Impact Reports
Test Manager's code-change impact reports enable you to perform testing-impact analysis, effort analysis, and risk analysis.
New Reports
This topic explains how to create new reports with Test Manager, edit report parameters, and create new reports based on pre-installed templates. Test Manager offers reports that quickly and easily transform data into intuitive charts and graphs. Pre-installed reports are available for Test Manager's Requirements, Test Plan, and Issues units. Reports are created using either BIRT RCP Designer, an open-source, Eclipse-based report tool, or Microsoft Excel report templates. SilkCentral Test Manager is tightly integrated with BIRT RCP Designer to make it easy for you to generate reports on test management data. Test Manager's reporting functionality is highly customizable. Numerous pre-installed reports and report templates provide out-of-the-box options for a wide range of reporting needs. Simple GUI-based tools allow you to edit Test Manager's pre-installed reports and create reports of your own. For users with SQL knowledge, there is virtually no limit to how data can be queried and presented in custom reports. Note: For information about editing report templates and creating custom report templates using BIRT RCP Designer and MS Excel, see the SilkCentral Administration Module Help. Tip: If a blank report is generated, there may be no data in the project you selected, or you may not be connected to the appropriate SilkCentral Test Manager database. Tip: Reports are not available offline unless your SilkCentral Test Manager database is accessible locally.
Sample Report
Below is the code of a pre-installed report called 'All Requirements'. This report has not undergone editing with Test Manager's GUI-based tools or SQL. By default, this report displays all properties of all requirements in the selected project, except those requirements that have been identified as obsolete. Obsolete requirements are filtered out by the report's reqProp_Obsolete_0 parameter.
SELECT r.ReqID, r.ReqParentID, r.PositionNumber, r.ProjectID, r.ProjectName,
       r.ReqName, r.Risk, r.Priority, r.ReqDescription, r.ReqCreator, r.ReqCreated,
       r.ReqReviewed, r.ReqCoverageStatus, r.ReqRevision, r.MarkedAsObsolete,
       r.Obsolete, r.TreeOrder
FROM RTM_V_Requirements r
WHERE r.ReqID IN
  (SELECT DISTINCT ReqTreeNodeID_pk AS id
   FROM TM_RequirementTreeNodes rtn WITH (NOLOCK)
   WHERE rtn.ProjectID_fk = 98
     AND rtn.MarkedForDeletion = ${reqProp_Obsolete_0|0}
     AND ParentTreeNodeID_fk IS NOT NULL)
RequID - Query for this column to enable a link to requirements on the Data tab of a report.
TestDefID - Query for this column to enable a link to test definitions on the Data tab of a report.
ExecDefID - Query for this column to enable a link to execution definitions on the Data tab of a report.
If the query result includes ProjectID together with any of RequID, TestDefID, or ExecDefID (using exactly these terms as column names), the Data tab displays the values in the element-ID column as links. If you click such a link, Test Manager switches to that element in the tree.
Bookmarking Reports
The bookmark button in the lower-right corner of the workflow bar bookmarks the currently displayed report, including the parameters that you have set on the Parameters tab. You can send bookmark URLs to other Test Manager users, allowing them to view reports with a single click. The bookmark URL contains the parameters, prefixed with rp_. Date values are represented in the URL as the corresponding Long values in UTC.
Related Concepts
Report Generation
Related Procedures
Analyzing Test Results - Quick Start Task
Creating New Reports
Customizing Reports with BIRT
Generating Reports
Managing Reports
Related Reference
Reports Unit Interface
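As an illustration of the date encoding described above: the documentation says only "Long values in UTC", so the assumption in this sketch that the Long is milliseconds since the Unix epoch, as well as the sample URL, are hypothetical.

```python
from datetime import datetime, timezone

def date_to_url_long(dt):
    # Assumption (not confirmed by the documentation): the rp_ Long value
    # is milliseconds since the Unix epoch, interpreted in UTC.
    return int(dt.replace(tzinfo=timezone.utc).timestamp() * 1000)

# Hypothetical bookmark parameter for 01-JUN-2009:
value = date_to_url_long(datetime(2009, 6, 1))  # 1243814400000
bookmark = "...?rp_dateFrom=%d" % value  # appended to the report's bookmark URL
```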
$TODAY - Gives the current system date (on the database server). You can also write $TODAY-1 (for yesterday) or $TODAY-7 (for a week ago).
$DATE - Returns the date (does not include the time).
$DATE('...') - Converts the given string to a database date.
$DAYS - Calculates the difference in days between the two given parameters. The two parameters can be a column within the table/view or $TODAY. For example, ${$DAYS[CreatedAt;$TODAY]} > 7 returns the rows created more than a week ago.
$WEEK - Returns the week number of the given parameter, which can be $TODAY or a column.
$MONTH - Returns the month of the year as a number of the given parameter, which can be $TODAY or a column.
$YEAR - Returns the year as a number of the given parameter, which can be $TODAY or a column.
$USERID - The ID of the currently logged-in user.
$USERNAME - The name of the currently logged-in user.
$PROJECTID - The ID of the currently selected project.
$PROJECTNAME - The name of the currently selected project.
$REPORTNAME - The name of the currently selected report.
$REPORTID - The ID of the currently selected report.
Sample Custom Report
Below is the code of the pre-installed Requirement with Child Requirements report. With this report, a selected requirement is shown with its requirement ID. Full details regarding the requirement's child requirements are displayed. Although not a custom report, this report is a helpful example because it makes use of the $PROJECTID function. It also includes two parameters, reqID (requirement ID) and reqProp_Obsolete_0 (show obsolete requirements).
SELECT r.ReqID, r.ReqCreated, r.ReqName, r.TreeOrder
FROM RTM_V_Requirements r
INNER JOIN TM_ReqTreePaths rtp
  ON (rtp.ReqNodeID_pk_fk = r.ReqID)
WHERE rtp.ParentNodeID_pk_fk = ${reqID|22322|Requirement ID}
  AND r.ProjectID = ${$PROJECTID}
  AND r.MarkedAsObsolete = ${reqProp_Obsolete_0|0|Show obsolete Requirements}
ORDER BY r.TreeOrder ASC
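The ${parameter|default|label} placeholders in the query above are resolved by Test Manager before the SQL runs. Purely as an illustration of that substitution rule (this is not Test Manager's actual implementation), the behavior can be sketched as:

```python
import re

# Matches ${name}, ${name|default}, and ${name|default|label}
PLACEHOLDER = re.compile(r"\$\{([^|}]+)(?:\|([^|}]*))?(?:\|([^}]*))?\}")

def substitute(query, params):
    """Replace ${name|default|label} placeholders with a supplied value,
    falling back to the declared default when the parameter is not set."""
    def repl(match):
        name, default = match.group(1), match.group(2) or ""
        return str(params.get(name, default))
    return PLACEHOLDER.sub(repl, query)

# No value supplied, so the declared default (0) is used:
sql = substitute("SELECT * FROM T WHERE Obsolete=${reqProp_Obsolete_0|0}", {})
```

The label (the third segment) only names the parameter in the UI; it plays no role in the substitution itself.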
Related Concepts
Report Generation
Related Procedures
Creating New Reports
Creating Reports
Generating Reports
Managing Reports
Context-Sensitive Reports
The Requirements, Test Plan, and Execution units offer dynamically-generated lists of reports that are specific to each unit. Context-sensitive report lists are helpful because they offer report types that relate directly to your current activities.
Requirements unit: Context-sensitive report lists in the Requirements tree offer all reports that take requirement ID as an input parameter.
Test Plan unit: Context-sensitive report lists in the Test Plan tree offer all reports that take test-definition ID as an input parameter.
Execution unit: Context-sensitive report lists in the Execution Definition tree offer all reports that take execution-definition ID as an input parameter.
On the execution-definition Runs tab, context-sensitive report lists offer all reports that have the following configuration:
Result category = Execution Definition
Selection criteria = Execution Definition Run
Property = ID
When you select a report from a context-sensitive report list, you are taken directly to that report's default tab in the Reports unit. This default destination-tab behavior can be configured using each report's Edit Report dialog box. There are two types of reports that appear in the context-sensitive report lists: reports that you have already accessed and reports that you have not yet accessed. Reports that you have accessed previously appear above a line separator in the menu. These reports are listed chronologically, with the most recently viewed report at the top of the list. Other default reports that are available, but have not yet been accessed, appear beneath the line separator. In addition to the default-configured context-sensitive reports, you can configure new and existing reports to be included in each unit's context-sensitive report list. Context sensitivity is added to reports on a per-user, per-report basis only.
Related Concepts
Report Generation
Related Procedures
Managing Reports
Accessing Context-Sensitive Reports
Enabling Context-Sensitive Reports
Requirements Reports
This section explains the requirements-related reports that ship with SilkCentral Test Manager. Requirements reports detail the status of functional requirements (for example, compatibility requirements, GUI requirements, feature requirements) that must be met during development. Requirements may also relate to product management objectives such as reliability, scalability, and performance. Test Manager's requirement-management reports help managers determine whether adequate test coverage has been established to verify that system requirements are met during development. When a report references a requirement that includes HTML-formatted content, that content is rendered in the report. The following reports come pre-installed with Test Manager.
In This Section
Status Reports
Here are the status reports that are available for Test Manager's Requirements unit.
Progress Reports
Here are the progress reports that are available for Test Manager's Requirements unit.
Document Reports
Here are the document reports that are available for Test Manager's Requirements unit.
All Related Issues Report
Provides a detailed list of all issues related to the assigned test definitions for a requirement.
Status Reports
Here are the status reports that are available for Test Manager's Requirements unit.
Progress Reports
Here are the progress reports that are available for Test Manager's Requirements unit.
Document Reports
Here are the document reports that are available for Test Manager's Requirements unit.
All Requirements
All requirements are represented with full requirement information.
Input Parameters
The input parameter for an All Related Issues report is the identifier of the requirement.
Overview
The All Related Issues report is divided into the following sections:
Requirement Information
This section provides the following information about the requirement:
ID - Identifier of the requirement.
Name - Name of the requirement.
Description - Description of the requirement.
Test Coverage - Status of all test definitions that have been assigned to the requirement.
Nr. of Issues - Number of issues related to the requirement or its sub-requirements.
Related Issues
This table shows all issues related to the requirement or its sub-requirements. The detailed information provided for each issue is:
ID - Identifier of the issue. If an identifier is provided by the issue tracking system, this external identifier is used. The identifier is clickable if an external link is defined for the issue.
Synopsis - Meaningful short description of the issue.
Status - Current status of the issue. If the status is provided by the issue tracking system, this external status is used.
Assigned by - Person who assigned the issue to the test definition.
Test ID - Identifier of the test definition in which the issue was discovered.
Test Definition - Name of the test definition in which the issue was discovered.
Related Concepts
Run Comparison Reports
Execution Reports
Test Definition Run Comparison Report
Related Procedures
Generating Reports
Status Reports
Here are the status reports that are available for Test Manager's Test Plan unit.
Progress Reports
Here are the progress reports that are available for Test Manager's Test Plan unit.
Specific Test Plan Node Progress Over the Past 'X' Days
Represents a trend in test definition progress by considering a specific test plan node over the past 'X' number of days.
Percentage Testing Success Over the Past 'X' Days (per component)
Represents a percentage listing of successful test definitions over the last 'X' number of days, per component; assists in identifying the components in the environment that are most critical.
Related Concepts
Test Plan Management
Report Generation
Related Procedures
Analyzing Test Results - Quick Start Task
Creating New Reports
Customizing Reports with BIRT
Generating Reports
Managing Reports
Related Reference
Reports Unit Interface
Execution Reports
This section explains the execution reports that ship with SilkCentral Test Manager. To ease the assessment of results, execution reports give you a detailed overview of the progress of your test executions and the status of defects, over a period of time or over a range of builds. The following reports come pre-installed with Test Manager.
In This Section
Run Comparison Reports
Describes the run-comparison reports that are available in Test Manager.
Execution Definition Run Comparison Reports
Compare two runs of an execution definition.
Test Definition Run Comparison Report
Compares two runs of a test definition.
Execution Definition Run Errors Report
Provides a detailed list of all test definitions that did not pass an execution definition run and the reason they did not pass.
Execution Definition Run Comparison Report - The default execution-definition run-comparison report; compares two runs of the execution definition.
Execution Definition Run Comparison Report Failed in Newer Run - Compares only the failed tests of two execution-definition runs.
Execution Definition Run Comparison Report Changed Status - Compares only those tests of two execution-definition runs that changed their status.
Overview
The execution-definition run-comparison report provides the following details:
Changes to the status of the execution definition
Number of errors
Number of warnings
Context in which the execution definition was executed
Execution duration of the assigned tests
Note: When the status of an assigned test changes to Failed between compared runs, the test is marked red. When the status of an assigned test changes to Passed between compared runs, the test is marked green. The execution-definition run-comparison report includes the following sections:
General Report Information
Execution Definition Information
Execution Definition Run Comparison
Test Definition Run Comparison
Description - Description of the execution definition.
Product - Name of the product specified for the run.
Related Concepts
Run Comparison Reports
Execution Reports
Test Definition Run Comparison Report
Related Procedures
Generating Reports
Changes to the status of the test definition
Number of errors
Number of warnings
Context in which the test definition was executed
Execution duration of the assigned tests
Attributes and properties of the test definition
Parameters of the test definition
Success conditions for the test definition
The test-definition run-comparison report is divided into the following sections.
Execution Information
This section provides the following information about each execution:
Execution Definition ID - ID of each execution definition.
Execution Definition Name - Name of each execution definition.
Run ID - ID of each execution definition run.
Product - Name of the product.
Version - Version of the product.
Build - Build of the product.
Status - Status of each run.
Execution Timestamp - Timestamp of each run.
Duration - Duration of each run.
Errors - Number of errors in each test definition run.
Warnings - Number of warnings in each test definition run.
Previous Status - Status of each run previous to the last manual change.
Changed by - User who performed the last manual change to the status.
Change Comment - Describes the reason for the manual status change.
Parameters
This section lists the parameters of the two runs of the test definition at execution time.
Success Conditions
This section lists the conditions at execution time for each of the two runs to be considered successful. If a condition is not satisfied, the test definition run is considered unsuccessful. Satisfied conditions are marked green, while unsatisfied conditions are marked red.
Related Concepts
Run Comparison Reports
Execution Reports
Execution Definition Run Comparison Reports
Related Procedures
Generating Reports
Input Parameters
The input parameter for an Execution Definition Run Errors report is the identifier of the execution-definition run.
Input Parameters
The input parameters for a code coverage trend report are:
product_ProductVersion - Version of the selected product.
BuildFrom - First build in the range of examined builds.
BuildTo - Last build in the range of examined builds.
Input Parameters
The input parameters for a method coverage comparison report are:
Build 1 - Number of the first build that is to be compared.
Build 2 - Number of the second build that is to be compared.
Product - The examined product.
Threshold - The minimum amount of change that results in a package appearing in the report. Packages with a smaller percentage of change are not shown in the report. The threshold range is from 0 to 100 percent.
Input Parameters
The input parameters for an Average Page-Time Trend report are:
Date From (DD-MON-YYYY) - Starting date for the time range. For example: 06-DEC-2008.
Date To (DD-MON-YYYY) - End date for the time range. For example: 16-JAN-2009.
Exclude Runs with more than <nnn> Errors - Runs that generate more errors than specified here are not included in the report. Use this setting to prevent outliers from skewing the trend curve.
Maximum Value for y-Axis - Limits the y-axis of the graph to the specified value. Page-times that exceed this value are cut off at the top. This setting is useful to prevent the flattening of lines caused by outliers.
Measure Filter - Shown measures are limited to those whose names include the specified string. This field must be filled in. To display all available measures, set the measure filter to "%". For example, to show only measures that include the word "unit" at any position in their names, set the measure filter to "%unit%".
Test Definition ID - Identifier of the test definition for which you want to view the report.
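Judging by the examples above ("%" alone matches everything, "%unit%" matches 'unit' anywhere in the name), the "%" wildcard appears to follow SQL LIKE semantics; that is an inference, not something the documentation states outright. A sketch of that matching rule:

```python
import re

def matches_filter(name, pattern):
    """True if name matches a %-wildcard filter, where % stands for any
    sequence of characters (SQL LIKE semantics, assumed here)."""
    # Escape the literal parts and join them with '.*' for each '%'.
    regex = "^" + ".*".join(re.escape(part) for part in pattern.split("%")) + "$"
    return re.match(regex, name) is not None

matches_filter("page_unit_time", "%unit%")  # matches
matches_filter("login_time", "%unit%")      # does not match
```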
Input Parameters
The input parameters for an Average Transaction Busy-Time Trend report are:
Date From (DD-MON-YYYY) - Starting date for the time range. For example: 06-DEC-2008.
Date To (DD-MON-YYYY) - End date for the time range. For example: 16-JAN-2009.
Exclude Runs with more than <nnn> Errors - Runs that generate more errors than specified here are not included in the report. Use this setting to prevent outliers from skewing the trend curve.
Maximum Value for y-Axis - Limits the y-axis of the graph to the specified value. Transaction busy-times that exceed this value are cut off at the top. This setting is useful to prevent the flattening of lines caused by outliers.
Test Definition ID - Identifier of the test definition for which you want to view the report.
Transaction Filter - Shown transactions are limited to those whose names include the specified string. This field must be filled in. To display all available transactions, set the transaction filter to "%". For example, to show only transactions that include the word "unit" at any position in their names, set the transaction filter to "%unit%".
Related Concepts Performance Trend Reports Average Page-Time Trend Report Overall Page-Time Trend Report Overall Transaction Busy-Time Trend Report Custom Measure Trend Report
Input Parameters
The input parameters for a Custom Measure Trend report are:
Date From (DD-MON-YYYY): Starting date for the time range. For example: 06-DEC-2008.
Date To (DD-MON-YYYY): End date for the time range. For example: 16-JAN-2009.
Exclude Runs with more than <nnn> Errors: Runs that generate more errors than specified here are not included in the report. Use this setting to prevent outliers from skewing the trend curve.
Maximum Value for y-Axis: Limits the y-axis of the graph to the specified value. Measures that exceed this value are cut off at the top. This setting is useful to prevent the flattening of lines caused by outliers.
Measure Name: Name of the custom measure for which you want to view the report. For example: CreateTestDefinition
Measure Type: Type of the custom measure. For example: Transaction (BusyTime)[s]
Test Definition ID: Identifier of the test definition for which you want to view the report.
Related Concepts Performance Trend Reports Average Page-Time Trend Report Overall Transaction Busy-Time Trend Report Overall Page-Time Trend Report Related Reference Average Transaction Busy-Time Trend Report
Input Parameters
The input parameters for an Overall Page-Time Trend report are:
Date From (DD-MON-YYYY): Starting date for the time range. For example: 06-DEC-2008.
Date To (DD-MON-YYYY): End date for the time range. For example: 16-JAN-2009.
Exclude Runs with more than <nnn> Errors: Runs that generate more errors than specified here are not included in the report. Use this setting to prevent outliers from skewing the trend curve.
Maximum Value for y-Axis: Limits the y-axis of the graph to the specified value. Page-times that exceed this value are cut off at the top. This setting is useful to prevent the flattening of lines caused by outliers.
Test Definition ID: Identifier of the test definition for which you want to view the report.
Input Parameters
The input parameters for an Overall Transaction Busy-Time Trend report are:
Date From (DD-MON-YYYY): Starting date for the time range. For example: 06-DEC-2008.
Date To (DD-MON-YYYY): End date for the time range. For example: 16-JAN-2009.
Exclude Runs with more than <nnn> Errors: Runs that generate more errors than specified here are not included in the report. Use this setting to prevent outliers from skewing the trend curve.
Maximum Value for y-Axis: Limits the y-axis of the graph to the specified value. Transaction busy-times that exceed this value are cut off at the top. This setting is useful to prevent the flattening of lines caused by outliers.
Test Definition ID: Identifier of the test definition for which you want to view the report.
Unique key: Test definition + Execution definition
Project name
Test name
Test plan hierarchy
Execution definition
Test type
Duration of test
Status of test (passed, failed, not executed), cumulative across all runs of build range
Last build executed
# Times executed for this version
# Times passed for this version
# Times failed for this version
Coverage index: Methods covered by the test for the specified classes / total methods of specified classes
Time stamp
Test creator
Test executor (manual tester or execution server)
Project name
Execution definition name
# Manual tests
# Automated tests
# Manual tests in coverage path
# Automated tests in coverage path
Duration of manual tests
Duration of automated tests
Duration of manual tests in coverage path
Duration of automated tests in coverage path
1 Select a particular class.
2 Select and execute the Code Change Impact - Test Definitions report.
3 Observe the list of tests that cover the classes that were touched in this version.
Effort analysis: You want to know how many hours of automated and manual testing will be required to properly cover a particular set of changes to the code.
1 Select a particular class.
2 Select and execute the Code Change Impact - Execution Definitions report.
3 Observe the required time (cost) for automated and manual tests.
Related Concepts Code Coverage Analysis Report Generation Related Procedures Analyzing Test Results - Quick Start Task Creating New Reports Customizing Reports with BIRT Generating Reports Managing Reports Related Reference Reports Unit Interface
The application under test (AUT) must be started before code coverage execution is kicked off. The AUT must be started independently of SilkCentral Test Manager so that code coverage can hook into it.
Java prerequisites
To find out which classes and methods of an application under test have been invoked within JAR files, the file sctmcc.dll must be located in the directory from which the AUT will be executed. This file can be downloaded directly from your Test Manager GUI by selecting Help > Tools > Code Analysis Instrumentation Library. Test Manager's code analysis works with Java versions 1.4, 1.5, and 1.6. However, there is a difference in how the DLL is loaded when starting up the virtual machine (VM). When starting the application under test from the command line, the following arguments must be passed to the VM in order to load sctmcc.dll:
Java 1.4: -Xrunsctmcc:port=19129,<options>
Java 1.5 and 1.6: -agentlib:sctmcc=<options>
See the following sections of this topic for more information about finding out which classes and methods of an application have been invoked within JAR files.
The available <options> are:
port=19129: Port of the code coverage service.
coveragetype="line": Possible values are "line" or "method". "line" must be used for getting code coverage information with Test Manager.
coveragepath={"library1.jar";"library2.jar"}: JAR libraries to monitor for code coverage information.
name="ServerName": Name of the monitored application.
Java 1.4:
C:\Java\j2sdk1.4.2_06\bin\java.exe -Xrunsctmcc:port=19129,coveragetype="line",coveragepath={"C:\Program Files\Borland\SilkTest\JavaEx\JFC\swingall.jar";"C:\Program Files\Borland\SilkTest\JavaEx\JFC\Swing11TestApp.jar"},name="Test Application" -Dsilktest.tafont=arialuni.ttf -cp .;%FontDir% ta
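For Java 1.5 and 1.6, the same application would be launched with the -agentlib switch instead of -Xrun. The following is a sketch assuming the same application and options as the Java 1.4 example above; the JDK path (C:\Java\jdk1.6.0) is hypothetical and must be replaced with your own installation directory:

```
C:\Java\jdk1.6.0\bin\java.exe -agentlib:sctmcc=port=19129,coveragetype="line",coveragepath={"C:\Program Files\Borland\SilkTest\JavaEx\JFC\swingall.jar";"C:\Program Files\Borland\SilkTest\JavaEx\JFC\Swing11TestApp.jar"},name="Test Application" -Dsilktest.tafont=arialuni.ttf -cp .;%FontDir% ta
```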
You must configure Test Manager to gather code coverage data from an application under test. This can be done for any number of execution definitions listed on Test Manager's Deployment tab, in the Execution unit. Related Concepts Code Coverage Analysis Related Procedures Analyzing Test Results - Quick Start Task Analyzing Code Coverage Related Reference Code Analysis Unit Interface
Results Compilation
Once an execution definition's test executions are complete, you can view its results. You will notice that there is a new result file for the execution definition called FullCoverageInfo.xml and an additional CodeCoverageInfo.xml file for each test definition result. Test Manager uses these result files to aggregate and calculate all code analysis data. Note: Aggregated data is not immediately available, and calculations may take time to complete. Related Concepts Code Coverage Analysis Related Procedures Analyzing Test Results - Quick Start Task Analyzing Code Coverage Related Reference Code Analysis Unit Interface
Procedures
This section explains all of the procedures associated with using Test Manager. In This Section Quick Start Tasks Quick Start Tasks are high-level overviews of the main tasks that you will likely need to perform with SilkCentral Test Manager. Managing a Successful Test This section explains all of the procedures that you need to know to manage tests with Test Manager.
1 Create a new report. (Creating New Reports)
2 Edit your report's properties. (Editing Report Properties)
3 Edit your report's parameters. (Editing Report Parameters)
4 Optionally, write advanced SQL queries for your report. (Writing Advanced Queries with SQL)
5 Optionally, customize a report template to meet your needs. (Customizing BIRT Report Templates)
6 Add subreports to your report. (Adding Subreports)
7 Generate your report for viewing. (Viewing Reports)
8 Generate a chart for viewing. (Displaying Charts)
9 Optionally, generate a code-change impact report. (Generating Code-Change Impact Reports)
Related Concepts New Reports Requirements Reports Test Plan Reports Report Generation Code Coverage Analysis Execution Reports Related Procedures Creating Reports Generating Reports Managing Reports
1 Click Reports on the workflow bar.
2 In the Reports directory tree, select the folder in which you want the new report to appear. This determines where the report will be stored in the directory tree.
3 Click New Child Report on the toolbar.
4 On the Create New Report dialog box, enter the name of the new report. This is the name that will appear in the Reports tree.
5 Check the Share this report with other users check box if you want to make this report available to other users.
6 In the Timeout [s] field, specify the maximum time period in seconds that Test Manager should wait for SQL queries to complete.
7 From the Default tab list box, select the tab that you want to be directed to when you select this report from one of the context-sensitive report lists.
8 Select the corresponding result type from the Result Category list box. This setting specifies the database table and view that is to be filtered for the report. The following result types are available:
Requirement: Returns requirements available in the Requirements module that meet the query criteria.
Test Definition: Returns test definitions available in the Test Plan module that meet the query criteria.
Test Definition Execution: Returns executed test definition results from the Executions module that meet the query criteria.
Execution Definition: Returns execution definitions from the Execution module.
Issue: Returns issues (including imported issues).
Requirement Progress Builds: Contains information on requirements progress per build so that you can see how requirements develop across builds.
Requirement Progress Days: The same as Requirement Progress Builds, but shows development on a daily basis.
Test Definition Progress Builds: Shows how test definitions develop across builds.
Test Definition Progress Days: Same as above, but shows development on a daily basis.
9 Each result type offers a set of selection criteria. Based on the Result Type you have selected, specify an appropriate Selection Criteria for your report. These criteria typically group properties based on a view or some other intuitive grouping (for example, custom properties).
10 From the Property list box, select the property that is to be filtered on. For some selection criteria, properties are dynamic.
11 Select an Operator for the query. The available operators depend on the property. Example operators are =, not, like, and not like. Strings are always compared lowercase. Allowed wildcards for strings are * and ?, where * matches any characters and ? matches exactly one character.
12 Select or specify the Value that the query is to be filtered on. For date-based properties, the Value field is replaced with a calendar tool that you can use to select a specific date.
13 (optional) To add an additional query string to this report, click More. An existing query string can be deleted by clicking the string's Delete button. When multiple query strings are defined, AND and OR radio buttons appear next to the More button. Use these option buttons to define whether all query criteria must be met cumulatively (AND), or whether it is sufficient that a single query string's criteria are met (OR).
14 Click Next to configure report columns on the New Report dialog box.
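The wildcard behavior described above (* for any characters, ? for exactly one, lowercase comparison) corresponds to the SQL LIKE wildcards % and _. A hypothetical helper, not part of Test Manager, that translates such a query value into a LIKE pattern might look like this:

```python
def to_sql_like(value: str) -> str:
    """Translate UI wildcards into SQL LIKE wildcards:
    '*' -> '%' (any characters), '?' -> '_' (exactly one character).
    Literal '%' and '_' in the value are escaped so they match
    themselves; the string is lowercased, matching the comparison rule."""
    escaped = value.replace("\\", "\\\\").replace("%", r"\%").replace("_", r"\_")
    return escaped.replace("*", "%").replace("?", "_").lower()

assert to_sql_like("Test*") == "test%"
assert to_sql_like("Run?7") == "run_7"
assert to_sql_like("100%_done") == r"100\%\_done"
```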
To create columns:
1 Click Add Columns to display the Add Columns dialog box. All available report columns are listed. Select those that you want to include in the report and click OK (multiple columns can be selected by holding down the CTRL key). Note: For test-planning reports, the list of available column names is enhanced with the column names from the LQM_v_tests table. See the SilkCentral database documentation for full details.
2 The selected columns appear in tabular format on the New Report dialog box. From here you can configure how each report column is to be displayed.
3 For each column, specify a sort direction (ascending, descending, or unsorted) using the up/down arrows in the Sorting column. When a column is selected for sorting, a list box is displayed in the Sort Order column that allows you to more easily edit the column-sort order. Set these numbers as required.
4 Give each column an Alias. This is the name by which each column will be labeled in the generated report.
5 With grouping, you can take advantage of SQL aggregation features (for example, selecting a number of elements or querying a total sum of values). Check the Group by check box on the column selection dialog box to specify that SQL GROUP BY functions are to be applied. Columns that are not selected for SQL GROUP BY functions are set to aggregation by default (meaning a single aggregate value will be calculated). From the Aggregation list box, select the appropriate aggregation type (Count, Sum, Average, Minimum, or Maximum).
6 The Actions column enables you to move column listings up and down in the view. The Move Up and Move Down functions do not affect the outcome of the report. Note: Any report column can be deleted by clicking the column's Delete button.
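The Group by and Aggregation options map directly onto SQL GROUP BY and aggregate functions. A minimal illustration using SQLite and an invented example table (this is not the actual SilkCentral schema, which is exposed through views such as LQM_v_tests):

```python
import sqlite3

# Invented example table standing in for a report result set.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE results (test_type TEXT, duration INTEGER)")
conn.executemany(
    "INSERT INTO results VALUES (?, ?)",
    [("manual", 10), ("manual", 20), ("automated", 5)],
)

# 'test_type' has Group by checked; 'duration' is aggregated twice,
# once as Count and once as Sum.
rows = conn.execute(
    "SELECT test_type, COUNT(duration), SUM(duration) "
    "FROM results GROUP BY test_type ORDER BY test_type"
).fetchall()

assert rows == [("automated", 1, 5), ("manual", 2, 30)]
```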
Related Concepts New Reports Report Generation Related Procedures Analyzing Test Results - Quick Start Task Creating Reports Customizing Reports with BIRT Generating Reports Using Context-Sensitive Reports Managing Reports Related Reference Reports Unit Interface
1 Click Reports on the workflow bar.
2 Select the report in the Reports tree.
3 On the Properties tab, click Edit. The Edit Report dialog box displays.
4 Modify the Name and Description of the report as required.
5 Ensure that the Share this report with other users check box is checked if you intend to have this report shared with other users.
6 From the Default tab list box, select the tab that you want to be directed to when you select this report from one of the context-sensitive report lists.
7 Specify one of the following options to indicate how the report can be edited:
Simple report: You can modify the Selection criteria, thus changing the results of the selected report, or you can click Advanced Query to modify the SQL query code.
Advanced report: If you are familiar with SQL, you can edit the query code in the Report data query field. To assist you in editing SQL queries, a list box of function placeholders (for example, variables) is available. To insert one of the available pre-defined functions, select the corresponding placeholder from the Insert placeholder list box.
Note: If you manually edit the SQL code for the query, click Check SQL when finished to confirm your work.
Related Concepts Report Generation Related Procedures Creating New Reports Customizing BIRT Report Templates Creating Reports Generating Reports Managing Reports Related Reference Report Properties tab
1 Click Reports on the workflow bar.
2 Select a report in the Reports tree.
3 Click the Parameters tab. If the report has parameters defined for it, the parameters will be listed there.
4 Click Edit Parameters. The Edit Parameters dialog box displays.
5 Edit the Label or Value of the listed parameters as required.
6 From the Usage field, select the usage type of the parameter (constant value, start time, end time).
7 Click OK to save your changes.
Related Concepts Report Generation Related Procedures Creating New Reports Creating Reports Generating Reports Managing Reports Related Reference Report Parameters tab
1 Click Reports on the workflow bar.
2 In the Reports directory tree, select the folder in which you want the new report to appear (Requirements, Test Plan, Issues, and so on). This determines where the report will be stored in the directory tree.
3 Click Create New Report on the toolbar.
4 On the Create New Report dialog box, enter the name of the new report. This is the name that will appear in the Reports tree.
5 Check the Share this report with other users check box if you want to make this report available to other users.
6 Enter a description of the report in the Description field.
7 Click Advanced to open the Report data query field.
8 Insert previously written code as necessary, or write new code directly in the field. To assist you in writing SQL queries, a list box of Test Manager function placeholders is available. See the following section for details regarding available placeholders. To insert one of the available pre-defined functions, select the corresponding placeholder from the Insert placeholder list box. Note: If you manually edit SQL code for the query, click Check SQL to confirm your work.
9 Once you have completed editing the report's properties, click Finish to save your settings.
Related Concepts New Reports Report Generation Related Procedures Analyzing Test Results - Quick Start Task Creating Reports Customizing Reports with BIRT Generating Reports Using Context-Sensitive Reports Managing Reports Related Reference Reports Unit Interface
1 In Test Manager, select a report that utilizes the BIRT Report Template.
2 Select the Properties tab.
3 Click Download BIRT Report Template. You receive the report data as a generic, empty BIRT report template; the data source is already configured.
4 Once you have saved the template to your local system, modify it as required.
5 Once complete, upload it using the Upload link on the Report tab.
For detailed information on configuring BIRT report templates, refer to the SilkCentral Administration Module Help.
Related Concepts New Reports Report Generation Related Procedures Analyzing Test Results - Quick Start Task Creating Reports Customizing Reports with BIRT Generating Reports Managing Reports Related Reference Report Properties tab
Adding Subreports
To aggregate the results from multiple reports into the currently selected report, you can add subreports. When adding a report as a subreport, the result columns and rows of the subreport are concatenated to the results of the selected report.
1 Click Reports on the workflow bar.
2 Select a report in the Reports tree.
3 On the Properties tab, click Add Subreport. The Add Subreport dialog box displays.
4 Select the subreport you want to append to the current report by selecting it from the Reports tree-list.
5 Click OK to complete the addition of the subreport. Subreports appear on the associated report's Properties tab in a section called Subreports.
Related Concepts Report Generation Related Procedures Creating New Reports Customizing BIRT Report Templates Creating Reports Generating Reports Managing Reports Related Reference Report tab
Viewing Reports
Because each template expects a certain data format to produce a useful graph, not all templates can be applied to all report queries. You will receive an error message if you attempt to generate a report through an incompatible report template. For example, selecting the Four Values Per Row As Horizontal Bar template to display the Requirements Status Overview report works because this particular Microsoft Excel template requires exactly the four values (failed, passed, not executed, and not covered) that the report query delivers.
To generate a report:
1 Click Reports on the workflow bar.
2 In the Reports tree, select the report you want to generate.
3 Select the Report tab.
4 Click the Select Report Template icon.
5 From the Select Report Template dialog box, select the template you wish to use.
6 Click OK to display the report.
7 (optional) If necessary, select an alternate view magnification for the report from the list box. 100% is the default magnification. Other options are 50%, 75%, 150%, and 200%.
Related Concepts Report Generation Related Procedures Analyzing Test Results - Quick Start Task Context-Sensitive Reports Generating Reports Managing Reports Related Reference Report tab
Displaying Charts
To display a chart:
1 Click Reports on the workflow bar.
2 Select a report in the Reports tree for which you want to view a chart.
3 Select the Chart tab to display the default chart.
4 To select a required chart type, click the Select Chart Type icon.
5 On the Select Chart Type dialog box, select a chart type.
6 Select the view properties that you want to apply to the chart (3D view, Show horizontal grid lines, Show vertical grid lines, and Show legend).
7 Specify how these chart options are to be saved:
Select For current user only to have these chart settings override the report's standard settings whenever the current user views this chart.
Select As report standard to have these chart settings presented to all users who don't have overriding user settings defined. This setting does not affect individual user settings.
8 Click OK to display the new chart type.
Note: The chart configurations you define here become the defaults for this report.
Note: When standard charts and graphs cannot deliver the specific data that you require, or cannot display data in a required format, you can customize the appearance of queried data using the Test Manager reporting functionality.
Note: To open the current chart in a separate browser window, click the Open in new window icon at the top of the Chart tab.
Related Concepts Report Generation Related Procedures Customizing BIRT Report Templates Displaying Charts Creating Reports Generating Reports Managing Reports Related Reference Report Chart tab
1 Click Projects on the workflow bar.
2 Select the project for which you want to analyze code-coverage data.
3 Click Code Analysis on the workflow bar.
4 Click Create Code Change Impact Report on the main toolbar. The Select Classes for Report dialog box displays. Select a Product and Version if you want to change the pre-selected values.
5 In the Filter field, enter criteria to filter the packages. For example, entering the string published will only list packages that contain the string published in their names.
6 Select a package from the Packages pick list. You can select multiple packages by holding down the CTRL key while clicking listed packages. The classes that are available in the selected package appear in the Classes pick list.
7 Select a class file that you want to include as a source in your report. You can select multiple classes by holding down the CTRL key while clicking listed classes.
8 Click Add to add the class file(s) to the Selected classes pick list. You can remove classes from the Selected classes pick list by selecting entries and clicking Remove. Click Remove All to remove all selected classes from the Selected classes pick list.
9 Repeat steps 6 through 8 until you have added all required classes to the Selected classes list.
10 Select a report from the Select report list box.
Related Concepts Code-Change Impact Reports Report Generation Test Manager Code Analysis Code Coverage Analysis Related Procedures Analyzing Test Results - Quick Start Task Enabling Code Analysis for Execution Definitions Analyzing Code Coverage Related Reference Code Analysis Unit Interface
1 Configure settings for your project. (Configuring Project Settings)
2 Create custom attributes. (Creating Custom Attributes)
3 Create global filters. (Creating Global Filters)
4 Configure change notification. (Enabling Change Notification)
5 Create custom step properties. (Creating Custom Step Properties)
Related Concepts Settings Configuration Related Procedures Configuring Test Manager Settings
1 Click Settings on the workflow bar. If you have not already selected a project, a warning message will appear, asking you to select a project. Select the project for which you want to define global settings.
2 Select the Project Settings tab to view the current settings. The Project Settings page displays the current project settings.
3 Click Edit to modify the current project settings. The Edit Project Settings dialog box displays.
4 You can specify the following information:
Build Information File Name: Build information files contain project information, including build number, build log location, error log location, and build location. Enter the name of your project's build information file in this field. All test executions will read the build information from this specified file.
Project Release Date: Enter your project's planned release date in the format MM/DD/YYYY.
File Extensions to ignore in Results: Specify result file types or other file types that should not be saved as results for test executions.
Note: File extensions must be separated by commas (for example, xlg, *_, res). Changes made in the Build Information File Name and File Extensions to ignore in Results fields will not affect scheduled test definitions. To redistribute tasks to execution servers, you must reschedule test definitions, or disconnect from and reconnect to the database.
Related Concepts Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Related Reference Settings Unit Interface
1 Click Settings on the workflow bar. If you have not already selected a project, a warning message displays, asking you to select a project. Select the project for which you are defining custom attributes.
2 Select the Attributes tab to view the list of current attributes.
3 Click New Attribute. The New Attribute dialog box displays.
4 Enter a Name for the new attribute. This name will be displayed in list boxes when the attribute becomes available for use.
5 Enter a Description of the new attribute.
6 Select the Attribute type.
7 Depending on the attribute type you have selected, continue as follows: If you have selected the attribute type Edit, you can now click OK to save the new custom attribute, or click Cancel to abort the operation. If you have selected the attribute type Normal or Set, you can define values. To define a new value, click New Value, enter the value into the Value field on the New Value dialog box, and click OK. The new value is then listed in the Value table, where you can edit it by clicking the name of the value, or delete it by clicking the Delete icon. Click OK to save the new attribute, or click Cancel to abort the operation. You will be returned to the Attributes list, where the new attribute is now listed.
Related Concepts Attributes Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Related Reference Attributes tab
1 Click Settings on the workflow bar. If you have not already selected a project, a warning message will appear, asking you to select a project. Select the project for which you are defining global settings.
2 Select the Filters tab to view the list of available filters.
3 Click New Filter. The New Filter dialog box displays.
4 Enter a Name for the new filter. This name will be displayed in list boxes when the filter becomes available.
5 Select a Category for the new filter from the list box to make the filter available in a specific Test Manager unit:
Requirement Filter: The filter will be available in the Requirements Management unit.
Test Definition Filter: The filter will be available in the Test Plan Management unit.
Execution: The filter will be available in the Test Execution Management unit.
6 Enter a Description of the new filter.
7 Select a category of filter criteria (Selection criteria). The available categories depend on the general filter category you have selected. You can also combine filters by selecting Nested Test Definition Filter or Nested Requirements Filter. Selecting one of these categories allows you to include an existing test definition filter (or, respectively, an existing requirements filter) in your new filter.
8 Select a Property, Operator, and Value for the new filter from the respective list boxes.
Property: Available properties depend on the filter category that you selected in the previous step. The property defines what you are filtering on. If you have selected an attribute category, the property list includes custom attributes to query against.
Operator: Specifies the filter operator. The operator depends on the property type you have selected. For example, if you have selected a property that is based on a string field type, the available operators are = (equals the defined value), not (differs from the defined value), contains (contains the defined value somewhere in the string), and not contains (does not contain the defined value anywhere in the string).
Value: Enter the value that you want to filter on. Depending on the property type that you have selected, values will either be strings that you can enter into the text box, or a selection of predefined values that you can select from the list box.
9 Click More if you want to add more than one filter category to the new filter. Repeat this procedure to define new categories. If you define more than one filter category, you must define whether the categories must all be fulfilled (AND relationship), or whether the filter returns true when any of the filter categories is fulfilled (OR relationship). Select either AND or OR to define the filter category relationship. You cannot define nested AND/OR relationships. To remove filter categories, click Fewer. This removes the last filter category.
10 When you are done, click OK to save the new filter, or click Cancel to abort the operation.
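As an illustrative sketch (not Test Manager code) of the operator semantics described above: string filters compare lowercase, and multiple filter categories combine under a single AND or OR relationship.

```python
def apply_operator(op: str, actual: str, value: str) -> bool:
    """String operators as described above; comparison is lowercase."""
    a, v = actual.lower(), value.lower()
    return {
        "=": a == v,
        "not": a != v,
        "contains": v in a,
        "not contains": v not in a,
    }[op]

def evaluate(criteria, mode="AND"):
    """criteria: list of (operator, actual, value) tuples.
    mode: 'AND' (all must match) or 'OR' (any may match).
    Nested AND/OR combinations are not supported, mirroring the UI."""
    results = [apply_operator(op, a, v) for op, a, v in criteria]
    return all(results) if mode == "AND" else any(results)

assert apply_operator("contains", "Login Test", "login")
assert evaluate([("=", "High", "high"), ("contains", "UI", "x")], "OR")
assert not evaluate([("=", "High", "high"), ("contains", "UI", "x")], "AND")
```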
Related Concepts Global Filters Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Creating Filters Related Reference Filters tab
1 Click Settings on the workflow bar.
2 Click Configure Notification to open the Configure Change Notification dialog box.
3 If you want to be notified by email when changes are made to requirements in the currently selected project, check the Changes on Requirements check box.
4 If you want to be notified by email when changes are made to test plans within the currently selected project, check the Changes on Test Plan check box.
5 Click OK to save the notification settings, or click Cancel to abort the operation without saving changes. You will be notified by email about the changes you have activated.
Related Concepts Change Notification Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Related Reference Notifications Page
1 Click Settings on the workflow bar.
2 Select the Step Properties tab.
3 Click New Property to display the New Custom Step Property dialog box.
4 Enter a name for the new property in the Name field. Note: Custom step property fields are always declared as type string.
5 Click OK to make your custom property available to all manual test steps in the selected Test Manager project.
Related Concepts Settings Configuration Related Procedures Configuring Projects - Quick Start Task Custom Step Properties Configuring Test Manager Settings Related Reference Step Properties Page
To manage requirements:
1 If you are using an external requirements-management tool, configure integration for the tool. Note: See the related procedures below for information on configuring integration with your requirements-management tool.
2 Create your requirements. (Creating Requirements)
3 If you have integrated a requirements-management tool, configure the requirement type of your requirements. (Configuring Requirement Types)
4 Attach files to your requirements. (Attaching a File to a Requirement)
5 Create custom filters for your requirements. (Creating Filters)
6 Create advanced custom filters for your requirements. (Creating Advanced Filters)
7 Generate a test plan from your requirements. (Generating Test Plans from Requirements View)
Related Concepts Requirements Management Related Procedures Integrating External RM Tools Enabling Integration with Borland CaliberRM Enabling Integration with IBM Rational RequisitePro Enabling Integration with Telelogic DOORS Managing Requirements
Creating Requirements
Test Manager allows you to create new requirements, edit and delete existing requirements, and add custom property fields to requirements. Newly created Test Manager projects do not contain requirements.
Click Requirements on the workflow bar.
Click New Requirement on the toolbar. Note: If the project you are working with does not yet have any requirements associated with it, click the <Click here to add Requirements> link in the Requirements tree to open the New Requirement dialog box.
On the New Requirement dialog box, enter a meaningful Name and Description for the requirement. Note: Test Manager supports HTML formatting and cutting and pasting of HTML content for description fields.
Select the appropriate Priority, Risk, and Reviewed status from the list boxes. If custom requirement properties have been defined, enter any custom property data that you want tracked with this requirement in the Custom Property text box. Note: The Priority, Risk, Reviewed, and any Custom Property fields will be configured automatically with the corresponding properties of the parent requirement if you check the Inherit from parent check boxes for these properties.
Click OK to create a new top-level requirement. Note: Alternatively, you can click OK and New Requirement to both save the newly created requirement and open the New Requirement dialog box to create an additional top-level requirement. Or, you can click OK and New Child Requirement to have the New Child Requirement dialog box open after the new top-level requirement is created.
Related Concepts Requirements Management Related Procedures Managing Requirements - Quick Start Task Creating Child Requirements Managing Requirements Related Reference Requirements Unit Interface HTML Support for Description Text Boxes
Click Requirements on the workflow bar. Note: Configuration of requirement type for CaliberRM, RequisitePro, and DOORS is only enabled for top-level requirements in the tree (requirements that are a direct child of the project node). All other requirements share the requirement type of their parents. A requirement without a configured requirement type is not available for upload. Importing requirements automatically configures the appropriate requirement type.
From Requirements View, at the requirement level, select the Properties tab. Click Map Requirement to select a requirement type from the list. Requirement type is a categorization used by CaliberRM, RequisitePro, and DOORS and is required for synchronization. Note: Map Requirement is only enabled when external requirements integration is enabled in the Settings unit (Integrations Configuration tab) and when the requirement has not yet been uploaded to the external requirements-management tool. Additionally, the option Enable upload of requirements to... must be enabled.
Related Concepts Requirements Management Related Procedures Managing Requirements - Quick Start Task Test Coverage Status Managing Requirements Related Reference Requirement Properties tab
Click Requirements on the workflow bar. Select a requirement in the Requirement tree view. Select the Attachments tab. When requirements management integration has been enabled between a Test Manager project and a CaliberRM project, the Attachments tab (Requirements Attachments) includes an Open CaliberRM button, which enables you to manage requirement attachments directly in CaliberRM.
Click Upload File to open the Upload File dialog box. Using Browse, select a file from your local file system. Enter a meaningful Description for the attachment. Click OK to upload the attachment to the server and associate it with the selected requirement.
Note: Attaching files to a test plan element may not work in Mozilla Firefox. Firefox requires usage of three slashes (for example: "file:///") for a file link, while other browsers require only two (for example: "file://"). Additionally, Firefox includes a security feature blocking links from remote files to local files and directories. For more information, see http://kb.mozillazine.org/Firefox_:_Issues_:_Links_to_Local_Pages_Don't_Work Related Concepts Attachments Requirements Management Related Procedures Managing Requirements - Quick Start Task Managing Requirements Related Reference Requirement Attachments tab
Creating Filters
To create a new custom filter:
Click the appropriate button (Execution, Requirements, or Test Plan) on the workflow bar. Click New Filter on the toolbar to display the New Filter dialog box. From the Property list box, select the property on which you wish to base the new filter (for example, Name, Description, Priority, Version, and Build). From the Operator list box, select a logical operator to be applied to the specified property (for example, =, not, >, >=, <, <=, contains, and does not contain). Note: The contents of the Operator and Value list boxes vary based on the attribute selected in the Property field.
In the Value field, enter the value that the specified property is to be compared against. Note: For date-based properties, the Value field is replaced with a calendar tool that you can use to select a specific date.
Click Save and apply to open the Edit Filter dialog box. To apply the filter to the current view without saving the filter settings, click Apply. On the Edit Filter dialog box, enter a name for the filter in the Name field. Enter a meaningful description for the filter in the Description field. Click OK to save the filter with your project.
Related Concepts Filtering Related Procedures Creating Advanced Filters Creating Global Filters Working with Filters
Click the appropriate button (Execution, Requirements, or Test Plan) on the workflow bar. Create a new custom filter. After you have defined your first filtering rule, click Advanced to open the Edit Filter dialog box.
Enter a name for the filter in the Name field. Give the filter a meaningful Description.
Click More to display a second set of filter-parameter fields with which you can define a second set of filter parameters. Select a logical operator for the application of the filtering queries. For example, filtered elements must meet both sets of criteria (and), or filtered elements must meet one, but not both, of the criteria sets (or). To delete a filter-parameter string, click the corresponding Delete button. To display additional filter-parameter fields and create additional filter queries, click More. To remove excess filter-parameter sets, click Fewer.
Related Concepts Filtering Related Procedures Creating Filters Creating Global Filters Working with Filters
1. Click Requirements on the workflow bar.
2. From Requirements View, with at least one requirement available in the Requirements tree, right-click the requirement or project node that is to be converted into a Test Plan tree.
3. Select Generate Test Plan to display the Generate Test Plan from Requirements dialog box. This dialog box enables you to specify whether the leaves (lowest-level nodes) of the selected requirements subtree should be converted into test definitions or test folders, and whether the tree should be generated into a new test container or an existing container.
4. Enter a name for the new test container in the Enter Name field and select a product from the Select Product list box to create the container within the active Test Manager project. The Select Product list box is populated with the products that are configured by a project manager. See the SilkCentral Administration Module documentation or ask your project manager for detailed information.
5. If you have defined a source control profile (see the SilkCentral Administration Module documentation or ask your Test Manager administrator for detailed information), select the source control profile you want to use for managing the test definition sources from the Select Source Control Profile list box.
6. To include all child requirements of the selected requirement in the test plan, check the Include child requirements check box (the default).
7. To have the new test definitions that you generate automatically assigned to the requirements from which they are created, check the Assign newly generated Test Definitions to Requirements check box. If this option is not selected, test definitions must be manually associated with requirements. Note: This option is not available when checking Generate Test Folders from Requirement Tree leaves.
8. Click OK to create the test plan, which has the same structure as the Requirements tree. A message displays, asking if you want to switch directly to the Test Plan unit. Click Yes to view the test plan in Test Manager's Test Plan unit, or click No to remain in the Requirements unit.
Related Concepts Test Plan Generation Requirements Management Related Procedures Managing Requirements - Quick Start Task Managing Requirements Related Reference Requirements Unit Interface
Create your execution definitions. See Adding Execution Definitions.
Manually assign test definitions to your execution definitions. See Manually Assigning Test Definitions to Execution Definitions.
Assign test definitions from Grid View to your execution definitions. See Assign Test Definitions from Grid View to Execution Definitions.
Assign test definitions to your execution definitions using a filter. See Using a Filter to Assign Test Definitions to Execution Definitions.
Create execution schedules for your execution definitions. See Creating a Custom Schedule for an Execution Definition.
Configure setup and cleanup execution definitions. See Configuring Setup and Cleanup Executions.
Add execution dependencies. See Adding Dependent Execution Definitions.
Assign execution servers to your execution definitions using hardware-provisioning keywords. See Assigning Keywords to Execution Definitions.
Execute your tests. See Executing Individual Tests.
View test execution details. See Viewing Test Execution Details.
Related Concepts Execution Definitions Execution Dependency Configuration Setup and Cleanup Test Definitions Test Definition Execution Related Procedures Manual Test Execution Executing Manual Tests Working with SilkPerformer Projects
Click Execution on the workflow bar. Select an existing folder in the Execution tree, or select the project node. Click New Execution Definition on the toolbar (or right-click within the Execution tree and choose New Child Execution Definition ). The New Execution Definition dialog box displays. Enter a name and meaningful description for the execution definition. Note: Test Manager supports HTML formatting and cutting and pasting of HTML content for Description fields.
Select a test container from the Test Container list box. The Version and Build that are associated with the product that the container is associated with are then populated automatically in the Version and Build fields. You may only associate one test container with a test execution. Select a product Version and Build from the list boxes. If a build information file is available on the execution server, you have the option to check the Read from Build Information file check box, in which case build information will be read from the build information file for the test run, overriding any manual settings that have been selected on the New Execution Definition dialog box. Specify a Priority for the execution definition from the list box (Low, Normal, or High). In the Source Control Label field, you can optionally specify that an earlier version of the test sources, rather than the latest version, be used for the execution definition. Click OK to update the Execution tree with the newly created execution definition.
Related Concepts Test Definition Execution Execution Definition Schedules Build Information Related Procedures Managing Test Executions - Quick Start Task Execution Definitions Working with Execution Definitions Executing Test Definitions Creating an Execution Definition in Grid View Related Reference Execution Unit Interface HTML Support for Description Text Boxes
Test Definition Name
Test Definition Status
Last Execution of the test definition
Click Execution on the workflow bar. Select the execution definition to which you are assigning test definitions. Select the Assigned Test Definitions tab. All test definitions of the test container which is associated with the selected execution are displayed in the Available Test Definitions window. Click the assign arrow of any test definition that you want to assign to the currently selected execution definition. Clicking the assign arrow of a folder or the top-level container assigns all child test definitions of that parent to the selected execution definition. Click Apply to save the assigned test-definition list. Note: If you do not click Apply, changes you make to the Assigned Test Definitions will be lost.
Related Concepts Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Executing Test Definitions Assign Test Definitions from Grid View to Execution Definitions Using a Filter to Assign Test Definitions to Execution Definitions Related Reference Execution Assigned Test Definitions Tab
To assign one or more test definitions from the test plan Grid View to one or more execution definitions:
Click Test Plan on the workflow bar. Click Grid View on the toolbar. Select the test definitions you want to assign to execution definitions. You can use your keyboard's Ctrl and Shift keys to select multiple test definitions using standard browser multi-select functions. Right-click the selected test definitions and choose Save Selection. Click Execution on the workflow bar. Select the execution definition to which you want to assign the selected test definitions. Choose Assigned Test Definitions. Click Assign Saved Selection. Note: Only test definitions that reside in the execution definition's test container are inserted. You can insert the selected test definitions into more than one execution definition. You cannot insert them into requirements in a different project. The selection persists until you make a different selection or close Test Manager.
Related Concepts Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Executing Test Definitions Using a Filter to Assign Test Definitions to Execution Definitions Manually Assigning Test Definitions to Execution Definitions Related Reference Execution Assigned Test Definitions Tab
Create a filter in the Test Plan unit. Refer to the Creating Filters procedure for details. If the filter already exists, skip this step. Click Execution on the workflow bar. Select the execution definition to which you are assigning test definitions. Select the Assigned Test Definitions tab. All test definitions of the test container which is associated with the selected execution are displayed in the Available Test Definitions window. Select By Filter from the test definition assignment types. Choose the filter from the list box. Click Apply to save the assigned test-definition list. Note: If you do not click Apply, changes you make to the Assigned Test Definitions will be lost.
If you assign test definitions to an execution definition in Test Plan Grid View , the test definition assignment type is automatically set to Manual, but the previously-filtered test definitions remain in the Assigned Test Definitions tab.
Related Concepts Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Executing Test Definitions Creating Filters Assign Test Definitions from Grid View to Execution Definitions Manually Assigning Test Definitions to Execution Definitions Related Reference Execution Assigned Test Definitions Tab
Click Execution on the workflow bar. Select an execution definition for which you want to configure a custom schedule. Note: To schedule a folder for execution, select a folder node. To save an edited version of a global schedule as a custom schedule, click Edit while the global schedule is selected in the list box. This enables you to edit the global schedule and save the result as a custom schedule.
Select the Schedule tab. Click the Custom option button to enable the scheduling controls. Click next to the From field and specify when the execution schedule is to begin (Month, Day, Year, Hour, Minute) using the calendar tool. Specify the interval at which the execution's tests are to be executed (Day, Hour, Minute). In the Run portion of the GUI, specify when the execution is to end. Select Forever to define a schedule with no end, or click next to the to field and specify when the execution schedule is to end (Month, Day, Year, Hour, Minute) using the calendar tool.
(Optional) Click Add Exclusion to define times when scheduled execution definitions should not be executed. Or click Add Definite Run to define times when unscheduled executions should be executed. Click Save to save your custom schedule.
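The From time, repeat interval, and to time described above define a simple series of run times. The following sketch models that series in Python; exclusions and definite runs, which modify the series, are omitted, and the dates are made up for illustration:

```python
from datetime import datetime, timedelta


def scheduled_runs(start, interval, end):
    """Yield execution start times from 'start' every 'interval' until 'end'.

    A simplified model of the From / interval / to schedule controls.
    """
    current = start
    while current <= end:
        yield current
        current += interval


# Example: a schedule starting June 1, 2009 at 08:00, repeating every
# 12 hours, and ending June 2, 2009 at 08:00.
runs = list(
    scheduled_runs(
        datetime(2009, 6, 1, 8, 0),
        timedelta(hours=12),
        datetime(2009, 6, 2, 8, 0),
    )
)
print(len(runs))  # 3 runs: 08:00, 20:00, and 08:00 the next day
```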
Related Concepts Execution Definition Schedules Test Definition Execution Execution Definitions Related Procedures Managing Test Executions - Quick Start Task Adding Definite Runs Adding Exclusions Adding Execution Definitions Executing Test Definitions Related Reference Execution Schedule tab
Click Execution on the workflow bar. Click the execution definition for which you are assigning a setup or cleanup test definition. Click the Setup/Cleanup tab.
To define a setup test definition, proceed with the following step. To define a cleanup test definition, proceed with step 7.
Click Edit in the Setup Test Definition portion of the tab. The Edit Setup Test Definition dialog box displays. Browse through your project's test planning tree and select the test definition that is to serve as this execution definition's setup test definition. Click OK. The assigned test definition then displays in the Setup Test Definition list. Click Edit in the Cleanup Test Definition portion of the tab. The Edit Cleanup Test Definition dialog box displays. Browse through your project's test planning tree and select the test definition that is to serve as this execution definition's cleanup test definition. Click OK. The assigned test definition now displays in the Cleanup Test Definition list.
Related Concepts Setup and Cleanup Test Definitions Execution Definitions Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Executing Test Definitions Related Reference Execution Setup/Cleanup tab
Click Execution on the workflow bar. Select the execution definition that will act as the master execution definition. Select the Dependencies tab. Click Add dependent Execution Definition to display the Add dependent Execution Definition dialog box. From the Condition selection list, select the condition that is to trigger the dependent execution definition (Passed, Failed, Not Executed, or Any). The Any status means that the dependent test execution will trigger no matter what the status of the previous test execution. From the tree menu in the dialog box, select the execution definition that is to be dependent. Select one of the following options to specify where the dependent execution definition is to be deployed:
As specified in the dependent Execution Definition: Automated test definitions assigned to the dependent execution definition will be executed on the execution server specified for the dependent execution definition on the Deployment tab. Manual test definitions assigned to the dependent execution definition will be assigned to the users specified for the dependent execution definition on the Deployment tab.
Same as <selected execution definition's execution server>: Automated test definitions assigned to the dependent execution definition will be executed on the execution server specified for the <selected execution definition's execution server> on the Deployment tab. Manual test definitions assigned to the dependent execution definition will be assigned to the users specified for the <selected execution definition's execution server> on the Deployment tab.
Specific: Execution Server/Manual Tester: Select a pre-configured execution server and/or a manual tester from the list boxes. Automated test definitions assigned to the dependent execution definition will be executed on the specified execution server. Manual test definitions assigned to the dependent execution definition will be assigned to the specified manual tester. If only a specific manual tester is defined and no server, only manual test definitions will be executed. If only a specific execution server is defined and no manual tester, only automated test definitions will be executed.
Click OK to create the dependency. Note: Test Manager will not allow you to create cyclical execution dependencies. You can select conditions to fulfill for manual test definitions. (Example: If the selected condition is Failed and all manual tests passed, but some automated tests failed, only automated test definitions assigned to the dependent execution definition will be executed.)
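The condition behavior described above can be sketched as a small decision function. The function name and status strings below are illustrative, not part of Test Manager's API:

```python
def should_trigger(condition, master_status):
    """Decide whether a dependent execution definition fires.

    condition: "Passed", "Failed", "Not Executed", or "Any"
    master_status: the outcome of the master execution definition
    """
    if condition == "Any":
        # "Any" triggers regardless of the previous execution's status.
        return True
    # Otherwise the dependent execution fires only when the master's
    # status matches the selected condition exactly.
    return condition == master_status


print(should_trigger("Any", "Not Executed"))  # True
print(should_trigger("Failed", "Passed"))     # False
```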
Related Concepts Execution Dependency Configuration Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Executing Test Definitions Related Reference Execution Dependencies tab
Click Execution on the workflow bar. Select an execution definition in the Execution tree. Select the Deployment tab. Click Edit in the Execution Environment portion of the page. The Assign Keywords dialog box displays. All keywords that have been defined for your execution environment are listed here. Note: For automated execution definitions, the default reserved keywords for each execution server (#<execution name>@<location name>) are included in the list.
Select keywords in the Select keywords list that reflect your execution environment requirements. You can use your keyboard's CTRL and SHIFT keys to select multiple keywords using standard browser multi-select functions. Tip: The Select keywords field is auto-complete enabled. When you enter alphanumeric characters into this field, the field is dynamically updated with an existing keyword that matches the entered characters. Note that this field is disabled when multiple keywords are selected in the Select keywords or Assigned Keywords lists. For automated execution definitions, if you only have a few execution servers and do not require hardware provisioning, you can likely get by using only the default, reserved keywords that are created for each execution server. In such cases, it is not necessary that you select additional keywords.
Click Add (>) to move the keyword into the Assigned Keywords list. Note: For automated execution definitions, the execution servers that match the assigned keywords are listed below in the dynamically-updated Matching execution servers list. This list updates each time you add or remove a keyword. Click on the name of an execution server in the list to access the execution servers in Administration Locations.
Click OK to save the keywords and close the Assign Keywords dialog box.
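The Matching execution servers list described above behaves like a set-containment check: a server matches when its keywords cover every assigned keyword. The following sketch models that; the server names, keywords, and function are hypothetical, not a Test Manager API:

```python
def matching_servers(assigned_keywords, servers):
    """Return the servers whose keyword sets cover all assigned keywords.

    servers maps a server name to the set of keywords defined for it.
    """
    required = set(assigned_keywords)
    return [
        name
        for name, keywords in servers.items()
        if required <= set(keywords)  # subset test: all required present
    ]


# Hypothetical execution environment.
servers = {
    "winserver01": {"Windows", "IE7", "RAM4GB"},
    "linuxserver02": {"Linux", "Firefox"},
}
print(matching_servers({"Windows", "IE7"}, servers))  # ['winserver01']
```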
Related Concepts VMware Lab Manager Virtual Configurations Execution Definitions Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Configuring Deployment Environments Executing Test Definitions Creating New Keywords Removing Keywords from Execution Definitions Related Reference Execution Deployment tab
Click Execution on the workflow bar. Select the execution definition that is to be run. Click Run on the toolbar. The Run dialog box displays. Define which test definitions you want to execute. The execution definition is then queued on the specified execution server. Test definitions are executed in the order in which they are listed on the Assigned Test Definitions tab (Execution View). Details of executions can be viewed in the Projects unit, Activities tab. Note: If the execution definition contains manual tests that are still in progress, you will be presented with a list of these tests.
If the execution definition does not contain pending manual tests, the Go To Activities dialog box displays. Click Yes to view the Activities page, or click No if you want to remain on the current Web page. Note: Check the Don't show this dialog again (during this login session) check box if you do not want to be asked about switching to the Activities page again in the future. This setting will be discarded when you log out of Test Manager.
Related Concepts Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Updating Execution Definitions Assigning Keywords to Execution Definitions SilkTest Tests Working with Manual Tests Executing Test Definitions Related Reference Execution Assigned Test Definitions Tab Activities Page Run Dialog
Click Execution on the workflow bar. Select an execution definition in the Execution tree view. Select the Runs tab. Click the Run ID of the execution for which you want to see details. Detailed information about the results of the execution definition is displayed.
Related Concepts Test Definition Execution Execution Definition Run Results Dialog Related Procedures Managing Test Executions - Quick Start Task Creating Test Definitions Working with Data-Driven Tests Executing Test Definitions Related Reference Test Definition Run Results Dialog Execution Runs Tab
Create test definitions. Test definition configuration varies based on the test type you are creating (for example, automated, manual, data-driven). See Creating Test Definitions.
Edit test definitions. Test definition configuration varies based on the test type you are editing (for example, automated, manual, data-driven). See Editing Test Definitions.
Create test packages. Test packages provide additional details to the user concerning execution runs. See Creating a Test Package.
If you are creating a data-driven test definition, have your system administrator configure a data source, then proceed as explained here. See Creating Data-Driven Test Definitions.
Assign attributes to your test definitions. See Assigning Attributes to Test Definitions.
For SilkPerformer tests, add predefined parameters to your test definitions. See Adding Predefined Parameters to Test Definitions.
Create filters for your test plan. See Creating Filters.
Assign requirements to your test definitions. See Assigning Requirements to Test Definitions.
Attach files to your test definitions. See Attaching Files to Test Plan Elements.
Related Concepts Test Plan Management Related Procedures Managing Test Plans Working with Attachments Associating Requirements with Test Definitions Working with Data-Driven Tests Editing Test Plan Elements
Click Test Plan on the workflow bar. Select a container or folder node in the Test Plan tree where you want to insert a new test definition. Click New Test Definition on the toolbar or right-click within the tree and choose New Test Definition. A new test definition node is appended to the tree view, and the Test Definition dialog box opens. Specify a name and meaningful description for the test definition. Note: Test Manager supports HTML formatting as well as the cutting and pasting of HTML content for text boxes.
Select one of the following test definitions from the Type list box:
SilkTest test
SilkPerformer test
Manual test
SilkTest Multi-testcase import
NUnit test
Windows scripting test
JUnit test
SilkTest plan
If you are configuring a SilkTest test, proceed to Configuring a SilkTest Test.
If you are configuring a SilkPerformer test, proceed to Configuring a SilkPerformer Test.
If you are configuring a manual test, proceed to Configuring a Manual Test.
If you are configuring a SilkTest multi-testcase import, proceed to Configuring SilkTest Multi-Testcase Import.
If you are configuring an NUnit test, proceed to Configuring an NUnit Test.
If you are configuring a Windows scripting test, proceed to Configuring a Windows Scripting Test.
If you are configuring a JUnit test, proceed to Configuring a JUnit Test.
If you are configuring a SilkTest plan test, proceed to Configuring a SilkTest plan Test.
If you are configuring a .NET Explorer test, proceed to Configuring a .NET Explorer Test.
Note: Test Manager's well-defined public API allows you to implement a proprietary solution that meets your automated test needs. Test Manager is open and extensible to any external tool that can be invoked from a Java implementation or through a command-line call.
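As a sketch of what "invoked through a command-line call" means in practice, a hypothetical external tool like the one below reports its verdict through the process exit code, the convention such command-line integrations typically rely on. The script and its check are made up for illustration and are not part of Test Manager:

```python
import sys


def run_check(value):
    """Hypothetical check performed by an external test tool."""
    return value > 0


def main(argv):
    # Command-line test tools conventionally report their verdict through
    # the process exit code: 0 means passed, non-zero means failed.
    value = int(argv[1]) if len(argv) > 1 else 1
    return 0 if run_check(value) else 1


if __name__ == "__main__":
    sys.exit(main(sys.argv))
```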
Note:
Throughout the test-definition configuration process and across all test definition types, Inherit from parent check box options are provided where applicable, enabling you to accept settings of any existing parent entity.
Related Concepts Upload Manager Test Plan Management Test Definition Parameters Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Configuring SilkTest Test Properties Configuring SilkPerformer Test Properties Configuring Manual Test Properties Configuring JUnit Test Properties Configuring SilkTest Plan Properties Configuring NUnit Test Properties Configuring Windows Scripting Test Properties Configuring .Net Explorer Test Properties Editing Test Definitions Related Reference Test Plan Unit Interface APIs HTML Support for Description Text Boxes
Click Test Plan on the workflow bar. Select the test definition or the test package that you want to edit. Note: Test Manager supports HTML formatting as well as the cutting and pasting of HTML content for text boxes.
Click Edit on the toolbar or under the General Properties section in the tab view. The Edit Test Definition dialog box displays. Specify the name and description of the selected test definition. If the selected test definition is a test package, the Update Package Structure on Result check box is available. Check the Update Package Structure on Result check box if you want to update the structure of the test package according to the results of the test execution run.
Configure the properties of the test definition or the test package according to the test definition type.
Related Concepts Upload Manager Test Plan Management Test Definition Parameters Test Packages Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Configuring Test Definition Parameters Related Reference Test Plan Unit Interface APIs HTML Support for Description Text Boxes
Run the test definition once to create the output.xml file, which contains the structure of the test package. In the Test Plan tree, right-click the name of the test definition and choose Convert to Test Package. The selected test definition is converted to a hierarchy representing the structure of the last execution result.
1. Click Test Plan on the workflow bar.
2. Create a new test definition. See Creating a Test Definition for information about creating a test definition.
3. Select the newly created test definition's Properties tab.
4. Scroll down to the Data-driven Properties section of the Properties tab and click the Edit icon to open the Data-driven Properties dialog box.
5. Select a pre-configured data source from the Data Source list box. See the SilkCentral Administration Module documentation for information on configuring data sources.
6. Click Next to continue.
7. Select a data set from the Data Set list box. For Excel data sources, this is a worksheet name; for database data sources, this is a table name.
8. Check the Each data row is a single test definition check box to treat each row in your data set as a separate test definition, or leave the check box unchecked to create a single test definition for all data rows of your data set.
9. (Optional) Enter a SQL query into the Filter query field to filter your data set based on a SQL-syntax query. Note: Only simple WHERE clause queries are supported.
10. Check the Enable data-driven properties check box to enable data-driven functionality.
11. Click Finish to save your settings. Note: Data-driven property settings are visible in the lower portion of each test definition's Properties tab. To use Test Manager's data-driven test functionality with SilkPerformer scripts, data sources with column names matching the corresponding SilkPerformer project attributes must be used in conjunction with "AttributeGet" methods.
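For example, assuming an Excel worksheet with hypothetical Browser and Language columns, a simple WHERE-clause condition entered into the Filter query field might look like this (column names and values are placeholders for illustration only):

```sql
Browser = 'Firefox' AND Language = 'EN'
```

Only rows of the data set matching the condition are then used to drive test definitions; more complex constructs such as subqueries or joins are not supported in this field.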
Related Concepts Manual Tests SilkTest Test Definitions Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Working with Data-Driven Tests Managing Test Plans Related Reference Test Plan Data Set tab
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select the test definition to which you are assigning an attribute.
4. Select the Attributes tab.
5. Click Add Attribute to display the Add Attributes dialog box.
6. Click the plus symbol (+) of the attribute that you are assigning. Based on the attribute type you have selected (set or normal), an Edit Attribute dialog box displays, which allows you to specify which of the available attribute values you want to assign to the test definition.
7. Select the required value and click OK to assign the attribute. Note: A Set type attribute allows you to assign a set of values to an attribute. A Normal type attribute allows you to assign only a single value.
Related Concepts Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Generating Test Plans from Requirements View Related Reference Test Plan Attributes tab
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select the test definition node for which you are adding a predefined parameter.
4. Select the Parameters tab.
5. Click Add Predefined Parameter to display the Add Predefined Parameter dialog box, which lists all of the project attributes that are available in the project file. Note: The Add Predefined Parameter button is only available for SilkPerformer test definitions for which the Project property has already been defined.
6. To add any of the listed parameters, click the corresponding add icon.
7. On the dialog box that displays, specify the actual value for the parameter.
8. Click Save to add the parameter to the active Test Plan tree node.
Related Concepts Test Definition Parameters Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Generating Test Plans from Requirements View Related Reference Test Plan Parameters tab
Creating Filters
To create a new custom filter:
1. Click the appropriate button (Execution, Requirements, or Test Plan) on the workflow bar.
2. Click New Filter on the toolbar to display the New Filter dialog box.
3. From the Property list box, select the property on which you want to base the new filter (for example, Name, Description, Priority, Version, or Build).
4. From the Operator list box, select a logical operator to be applied to the specified property (for example, =, not, >, >=, <, <=, contains, and does not contain). Note: The contents of the Operator and Value list boxes vary based on the attribute selected in the Property field.
5. In the Value field, enter the value that the specified property is to be compared against. Note: For date-based properties, the Value field is replaced with a calendar tool that you can use to select a specific date.
6. Click Save and apply to open the Edit Filter dialog box. To apply the filter to the current view without saving the filter settings, click Apply.
7. On the Edit Filter dialog box, enter a name for the filter in the Name field.
8. Enter a meaningful description for the filter in the Filter field.
9. Click OK to save the filter with your project.
Related Concepts Filtering Related Procedures Creating Advanced Filters Creating Global Filters Working with Filters
Click Test Plan on the workflow bar. Select the test definition to which you are assigning requirements. In Test Plan View, select the Assigned Requirements tab. All requirements that are available for assignment are displayed in the Available Requirements window. Note: The Available Requirements window can be broadened or narrowed by dragging the window splitter (the left-hand edge of the window) to the left or right.
Click the arrow of any requirement to assign it to the currently selected test definition. Note: Newly generated test definitions can automatically be assigned to the requirements from which they are generated by selecting the Assign newly generated test definitions to requirements option on the Generate Test Plans from Requirements dialog box (the default behavior).
Related Concepts Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Working with Data-Driven Tests Managing Test Plans Generating Test Plans from Requirements View Related Reference Test Plan Assigned Requirements tab
Click Test Plan on the workflow bar. Click Test Plan View on the toolbar. Select a container, folder, or test definition. Select the Attachments tab. Click Upload File to open the Upload File dialog box. Using Browse, select a file from your local file system. Enter a meaningful description for the attachment. Click Upload File to upload the attachment to the server and associate it with the selected element.
Note: Attaching files to a test plan element may not work in Mozilla Firefox. Firefox requires the usage of three slashes (for example: "file:///") for a file link, while other browsers require only two (for example: "file://"). Additionally, Firefox includes a security feature blocking links from remote files to local files and directories. For more information, see http://kb.mozillazine.org/Firefox_:_Issues_:_Links_to_Local_Pages_Don't_Work Related Concepts Attachments Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Working with Attachments Managing Test Plans Related Reference Test Plan Attachments tab
In the menu tree, select Test Manager Settings.
Click Configure Notification to open the Configure Change Notification dialog box. If you do not want to be notified by email when changes are made to requirements in the currently selected project, uncheck the Changes on Requirements check box. If you do not want to be notified by email when changes are made to test plans in the currently selected project, uncheck the Changes on Test Plan check box. Click OK to save the notification settings; or click Cancel to abort the operation without saving changes.
Related Concepts Change Notification Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Related Reference Notifications Page
In the menu tree, select Test Manager Settings.
Click Configure Notification to open the Configure Change Notification dialog box. If you want to be notified by email when changes are made to requirements in the currently selected project, check the Changes on Requirements check box. If you want to be notified by email when changes are made to test plans within the currently selected project, check the Changes on Test Plan check box. Click OK to save the notification settings, or click Cancel to abort the operation without saving changes. You will be notified by email about the changes you have activated.
Related Concepts Change Notification Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Related Reference Notifications Page
In the menu tree, select Test Manager Settings.
1. If you have not already selected a project, a warning message displays, asking you to select a project. Select the project for which you are defining custom attributes.
2. Select the Attributes tab to view the list of current attributes.
3. Click New Attribute. The New Attribute dialog box displays.
4. Enter a Name for the new attribute. This name will be displayed in list boxes when the attribute becomes available for use.
5. Enter a Description of the new attribute.
6. Select the Attribute type. Depending on the attribute type you have selected, continue as follows:
   - If you have selected the attribute type Edit, click OK to save the new custom attribute, or click Cancel to abort the operation.
   - If you have selected the attribute type Normal or Set, you can define values. To define a new value, click New Value. Enter the value into the Value field on the New Value dialog box and click OK. The new value is then listed in the Value table, where you can edit it by clicking the name of the value, or delete it by clicking the Delete icon.
7. Click OK to save the new attribute, or click Cancel to abort the operation. You will be returned to the Attributes list; the new attribute is now listed.
Related Concepts Attributes Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Related Reference Attributes tab
In the menu tree, select Test Manager Settings.
1. If you have not already selected a project, a warning message displays, asking you to select a project. Select the project for which you are defining global settings.
2. Select the Attributes tab to view the list of current attributes.
3. Before you can delete an attribute, you must first deactivate it. In the Status column, click the Active link or icon, and then click Yes on the confirmation dialog box to deactivate the attribute.
4. Once the attribute is inactive, click the delete icon of the attribute to remove it. A confirmation dialog box displays, asking you to confirm the deletion.
5. Click Yes to remove the selected attribute, or click No to abort the operation. If you select Yes, you will be returned to the Attributes list, where the removed attribute will no longer be displayed.
6. If an error displays, ensure that the selected attribute is not applied to any test definitions or used in any global filters. You can only delete unused attributes.
Related Concepts Attributes Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Related Reference Attributes tab
In the menu tree, select Test Manager Settings.
1. If you have not already selected a project, a warning message displays, asking you to select a project. Select the project for which you are defining custom attributes.
2. Select the Attributes tab to view the list of current attributes.
3. Click the name of the attribute that you are editing. The Edit Attribute dialog box displays.
4. You can edit the Name of the attribute. The name will be displayed in list boxes when the attribute is available for use:
   - Filters: Attributes can be used in global filters for filtering by test definition attributes (see Global Filters).
   - Test Plan unit: Attributes can be applied to test definitions (see Understanding Test Definition Attributes).
5. You can edit the Description of the attribute.
6. Depending on the attribute type, you can continue as follows:
   - If the attribute type is Edit, click OK to save the custom attribute, or click Cancel to abort the operation.
   - If the attribute type is Normal or Set, you can add, edit, or remove values. To define a new value, click New Value. Enter the value into the Value field on the New Value dialog box and click OK. The new value is now listed in the Value table, where you can edit it by clicking the name of the value, or delete it by clicking the delete icon.
7. Once you are satisfied with your attribute settings, click OK to save the changes, or click Cancel to abort the operation.
8. You will be returned to the Attributes list.
Related Concepts Attributes Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Related Reference Attributes tab
In the menu tree, select Test Manager Settings.
Select the Step Properties tab. Click New Property to display the New Custom Step Property dialog box. Enter a name for the new property in the Name field. Note: Custom step property fields are always declared as type string.
Click OK to make your custom property available to all manual test steps in the selected Test Manager project.
Related Concepts Settings Configuration Related Procedures Configuring Projects - Quick Start Task Custom Step Properties Configuring Test Manager Settings Related Reference Step Properties Page
In the menu tree, select Test Manager Settings.
Select the Step Properties tab. Click the delete icon of the custom property you want to delete. A confirmation dialog box displays, asking you to confirm the deletion. Click Yes to complete the operation, or No to abort.
Related Concepts Settings Configuration Related Procedures Configuring Projects - Quick Start Task Custom Step Properties Configuring Test Manager Settings Related Reference Step Properties Page
In the menu tree, select Test Manager Settings.
Select the Step Properties tab. Click the name of the custom property that you are editing. The Edit Custom Step Property dialog opens. Edit the name of the property in the Name field. Click OK to save your changes, or click Cancel to abort the operation without saving.
Related Concepts Settings Configuration Related Procedures Configuring Projects - Quick Start Task Custom Step Properties Configuring Test Manager Settings Related Reference Step Properties Page
1. Select the Data Sources tab from Test Manager Settings in the menu tree. The Data Sources page displays, listing all of the data sources that have been created for the system.
2. Click New Data Source to open the New Data Source dialog box.
3. Specify a Name for the data source.
4. From the Data source type list box, select JDBC. Note: If you are setting up an ODBC data source, you need to manually insert your ODBC driver class and URL (for example, Driver class: sun.jdbc.odbc.JdbcOdbcDriver, URL: jdbc:odbc:MyDatabaseName). You must also set up an ODBC data source in Microsoft Windows under Administrative Tools (refer to Microsoft Windows Help for more information). If your front-end server and your application server are on different machines, make sure that the name of the system data source set up in Microsoft Windows is the same as the ODBC data source. These names are case-sensitive.
5. The Driver class field is populated automatically when you select JDBC as the Data source type. In the URL field, replace the host name value (<hostname>) with the name of the computer that is hosting the data source, and replace the database name value (<databasename>) with the name of the target database.
6. In the Username and Password fields, enter valid database credentials.
7. (Optional) If you are working with a database that includes multiple tables, and you want to narrow down the data source to specific tables, you can browse to and select specific tables for inclusion:
   a. Click [...] next to the Table filter field. The Select Table Filter dialog box displays.
   b. Select the tables that you want included as your data source.
   c. Click OK.
8. (Optional) Key column selection is used by test definitions to define which columns within a data source are used as the primary key. This is helpful if your data source will undergo edits (for example, adding or removing rows). Even if your data source is edited, test definitions will still be able to identify which columns and rows should be used. Test definitions created from data-driven data sources use key column values in their names, rather than column numbers. To configure a key column:
   a. Click [...] next to the Key column field. The Select Key Column dialog box displays.
   b. Select a column from the column list that is to act as the key column.
   c. Click OK.
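For example, after substituting hypothetical host and database names into the URL field, a completed JDBC URL might look like one of the following. The exact format depends on the JDBC driver in use, and the host and database names below are placeholders for illustration only:

```
jdbc:sqlserver://dbserver01:1433;databaseName=TestData    (Microsoft SQL Server driver)
jdbc:oracle:thin:@dbserver01:1521:TestData                (Oracle thin driver)
jdbc:odbc:MyDatabaseName                                  (ODBC bridge, as noted above)
```

If the connection fails, verify that the driver class matches the URL scheme and that the database name is spelled exactly as it appears on the database server.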
Related Concepts Data Sources for Data-Driven Tests Related Procedures Configuring Microsoft Excel or CSV Data Sources Synchronizing Data Sources Uploading Updated Excel Files to a Data Source Downloading Excel Files from a Data Source Deleting Data Sources Related Reference Data Sources Configuration Page
1. Select the Data Sources tab from Test Manager Settings in the menu tree. The Data Sources page displays, listing all of the data sources that have been created for the system.
2. Click New Data Source to open the New Data Source dialog box.
3. Specify a Name for the data source.
4. From the Data source type list box, select MS Excel to configure a Microsoft Excel data source, or select CSV to configure a CSV data source.
5. From the Source control profile list box, select the pre-configured source control profile that hosts your data file. See the related Source Control Profiles topic for detailed information regarding the configuration of source control profiles.
6. Click Browse to open the Select Source Control Path dialog box. Browse to and select a data source file of the selected type in your source control path.
7. MS Excel only: (Optional) If you are working with an Excel spreadsheet that includes multiple worksheets, and you want to narrow down the data source to specific worksheets, you can browse to and select specific worksheets for inclusion. To do this:
   a. Click [...] next to the Worksheet filter field. The Select Worksheet Filter dialog box displays.
   b. Select the worksheets that you want included as your data source.
   c. Click OK.
(Optional) Key column selection is used by test definitions to define which worksheet columns within a data source are used as the primary key. This is helpful if your data source will undergo edits (for example, adding or removing rows within a worksheet). Even if your data source is edited, test definitions will still be able to identify which columns and rows should be used. Test definitions created from data-driven data sources use key column values in their names, rather than column numbers. Note: MS Excel only: If the data source includes multiple worksheets, only columns with identical names are available to be defined as key columns. To configure a key column:
   a. Click [...] next to the Key column field. The Select Key Column dialog box displays.
   b. Select a column from the column list that is to act as the key column.
   c. Click OK.
Related Concepts Data Sources for Data-Driven Tests Related Procedures Configuring JDBC Data Sources Synchronizing Data Sources Uploading Updated Excel Files to a Data Source Downloading Excel Files from a Data Source Deleting Data Sources Related Reference Data Sources Configuration Page
Select the Data Sources tab from Test Manager Settings in the menu tree. The Data Sources page displays, listing all of the data sources that have been created for the system. Click the Delete icon in the Actions column that corresponds to your data source. A confirmation dialog box displays. Click Yes to remove the data source, or click No to abort the deletion.
Related Concepts Data Sources for Data-Driven Tests Related Procedures Configuring Microsoft Excel or CSV Data Sources Configuring JDBC Data Sources Synchronizing Data Sources Uploading Updated Excel Files to a Data Source Downloading Excel Files from a Data Source Related Reference Data Sources Configuration Page
Select the Data Sources tab from Test Manager Settings in the menu tree. The Data Sources page displays, listing all of the data sources that have been created for the system. Click the Download icon in the Actions column that corresponds to your data source. The File Download dialog box displays. Click Open to open the file immediately, or click Save to specify where on your local system you want to save the file.
Related Concepts Data Sources for Data-Driven Tests Related Procedures Configuring Microsoft Excel or CSV Data Sources Configuring JDBC Data Sources Synchronizing Data Sources Uploading Updated Excel Files to a Data Source Deleting Data Sources Related Reference Data Sources Configuration Page
Select the Data Sources tab from Test Manager Settings in the menu tree. The Data Sources page displays, listing all of the data sources that have been created for the system. Click the Synchronize icon in the Actions column that corresponds to your data source to propagate the updated file to the associated test definitions. A confirmation dialog box displays, asking you to confirm the synchronization. Click Yes to synchronize all test definitions with the updated data source, or click No to abort the synchronization. Warning: All running executions that depend on this data source will be aborted. Results of incomplete test definitions within these executions will be lost.
Related Concepts Data Sources for Data-Driven Tests Related Procedures Configuring Microsoft Excel or CSV Data Sources Configuring JDBC Data Sources Uploading Updated Excel Files to a Data Source Downloading Excel Files from a Data Source Deleting Data Sources Related Reference Data Sources Configuration Page
Select the Data Sources tab from Test Manager Settings in the menu tree. The Data Sources page displays, listing all of the data sources that have been created for the system. Click the Upload icon in the Actions column that corresponds to your data source. Click Browse... on the Upload File dialog box. Browse to and select the updated Excel file that is to replace the currently uploaded Excel file. Click Open. Click OK on the Upload File dialog box. A confirmation dialog box displays, asking you to confirm the overwriting of the existing file. Click Yes to continue. After uploading the updated data source file, another dialog box displays, asking you if you want to synchronize the test definitions with the updated data source. Click Yes to synchronize immediately, or click No if you want to synchronize later. Note: After uploading an updated data source file, you must synchronize the data source so that associated test definitions are updated. See the related Synchronizing data sources procedure for details.
Related Concepts Data Sources for Data-Driven Tests Related Procedures Configuring Microsoft Excel or CSV Data Sources Configuring JDBC Data Sources Synchronizing Data Sources Downloading Excel Files from a Data Source Deleting Data Sources Related Reference Data Sources Configuration Page
In the menu tree, select Test Manager Settings.
1. If you have not already selected a project, a warning message will appear, asking you to select a project. Select the project for which you are defining global settings.
2. Select the Filters tab to view the list of available filters.
3. Click New Filter. The New Filter dialog box displays.
4. Enter a Name for the new filter. This name will be displayed in list boxes when the filter becomes available.
5. Select a Category for the new filter from the list box to make the filter available in a specific Test Manager unit:
   - Requirement Filter: The filter will be available in the Requirements Management unit.
   - Test Definition Filter: The filter will be available in the Test Plan Management unit.
   - Execution: The filter will be available in the Test Execution Management unit.
6. Enter a Description of the new filter.
7. Select a category of filter criteria (Selection criteria). The available categories depend on the general filter category you have selected. You can also combine filters by selecting Nested Test Definition Filter or Nested Requirements Filter. Selecting one of these categories allows you to include an existing test definition filter (for example, an existing requirements filter) in your new filter.
8. Select a Property, Operator, and Value for the new filter from the respective list boxes.
Property: Defines the property for which you are defining a filter setting. Available properties depend on the filter category that you have selected in the previous step. If you have selected an attribute category, the property list includes custom attributes to query against.

Operator: Specifies the filter operator. The available operators depend on the property type you have selected. For example, if you have selected a property that is based on a string field type, the available operators are = (equals the defined value), not (differs from the defined value), contains (contains the defined value somewhere in the string), and not contains (does not contain the defined value in the string).

Value: Enter the value that you want to filter on. Depending on the property type that you have selected, values will either be strings that you can enter into the text box, or a selection of predefined values that you can select from the list box.
Click More if you want to add more than one filter category to the new filter. Repeat this procedure to define additional categories. If you define more than one filter category, you must define whether the categories must be fulfilled in addition to the existing categories (AND relationship), or whether the filter returns true when any of the filter categories is fulfilled (OR relationship). Select either AND or OR to define the filter category relationship. You cannot define nested AND/OR relationships. To remove filter categories, click Fewer. This removes the last filter category. When you are done, click OK to save the new filter, or click Cancel to abort the operation.
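For example, a filter that combines two categories with an AND relationship might be configured as follows (the property names and values here are hypothetical, chosen only to illustrate the layout):

```
Category 1:  Property = Priority   Operator = "="         Value = High
             AND
Category 2:  Property = Name       Operator = "contains"  Value = Login
```

This filter returns only items whose Priority is High and whose name contains "Login"; with OR selected instead, it would return items matching either category.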
Related Concepts Global Filters Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Creating Filters Related Reference Filters tab
In the menu tree, select Test Manager Settings.
1. If you have not already selected a project, a warning message displays, asking you to select a project. Select the project for which you are defining global settings.
2. Select the Filters tab to view the list of current filters.
3. Click the delete icon of the filter that you want to remove. A confirmation dialog box displays, asking you to confirm the deletion.
4. Click Yes to remove the selected filter, or No to abort the operation. If you select Yes, you will be returned to the filters list; the removed filter will no longer be displayed.
Related Concepts Global Filters Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Deleting Filters Related Reference Filters tab
In the menu tree, select Test Manager Settings.
1. If you have not selected a project, a warning message will appear, asking you to select a project. Select the project for which you want to define global settings.
2. Select the Filters tab to view the list of current filters.
3. Click the name of the filter you are editing. The Edit Filter dialog box displays.
4. You can edit the following filter properties:
   - Name of the filter: This name will be displayed in list boxes when the filter is available.
   - Description of the filter: This provides a meaningful way to identify what the filter does.
   - Categories for filter criteria: You can change, add, or remove categories for filter criteria. The available categories depend on the general category of the filter.
5. You can also combine filters by selecting Nested Test Definition Filter or Nested Requirements Filter in the Selection criteria categories. Selecting either of those categories allows you to include an existing test definition filter (for example, an existing requirements filter) in your filter.
6. Select a Property, Operator, and Value for the filter from the respective list boxes.
   - Property: Defines the property for which you are defining a filter setting. Available properties depend on the filter category you have selected in the previous step. If you have selected an attribute category, the property list includes custom attributes to query against. See Attributes for detailed information about defining custom attributes.
   - Operator: Select the filter operator. The available operators depend on the property type that you have selected. For example, if you have selected a property that is based on a string field type, the available operators are: = (equals the defined value), not (differs from the defined value), contains (contains the defined value somewhere in the string), and not contains (does not contain the defined value in the string).
   - Value: Enter the value that you want to filter on. Depending on the property type that you have selected, values will either be strings that you can enter into the text box, or a selection of predefined values that you can select from the list box.
Click More if you want to add more than one filter category to the filter. Proceed by defining new categories. If you define more than one filter category, you must define whether the categories must be fulfilled in addition to the existing categories (AND relationship), or whether the filter returns true when any of the filter categories is fulfilled (OR relationship). Select either AND or OR to define the filter category relationship. Note: If you define more than two filter categories, selecting AND (or OR) defines the relationship between all categories. You cannot define nested AND/OR relationships. To remove filter categories, click Fewer. This removes the last filter category.
Related Concepts Global Filters Settings Configuration Related Procedures Configuring Projects - Quick Start Task Configuring Test Manager Settings Editing Filters Related Reference Filters tab
Select the Issue Tracking tab from Administration Settings in the menu tree. The Issue Tracking page displays, listing all of the issue tracking profiles that have been created for the system. Click the Delete icon of the issue tracking profile you want to delete. A confirmation dialog box displays. Click Yes to remove the selected profile. You are returned to the Issue Tracking page.
Related Concepts Issue Tracking Profiles Related Reference Issue Tracking Profiles Page
In This Section
- Adding a new Issue Manager issue tracking profile: Adding SilkCentral Issue Manager Issue Tracking Profiles
- Mapping the existing issue states of Issue Manager to the states of Test Manager: Mapping Issue States
- Editing an existing Issue Manager issue tracking profile: Editing SilkCentral Issue Manager Issue Tracking Profiles
- Deleting an existing Issue Manager issue tracking profile: Deleting Issue Tracking Profiles
Related Concepts Issue Tracking Profiles Related Reference Issue Tracking Profiles Page
To launch the New Issue Tracking Profile dialog box:
1. Select Issue Manager from the Type list box, or select Issue Manager 3.3 to connect to an Issue Manager version 3.3 installation.
2. Type a valid Issue Manager Username and Password. These credentials will be used to access your Issue Manager system.
3. Type the Issue Manager URL of your Issue Manager installation. This is the URL you use to log in to Issue Manager, without the login extension at the end. For example, if your standard Issue Manager URL is http://IssueManager/login, then the correct service URL is http://IssueManager.
4. If you selected Issue Manager 3.3 from the Type list box, proceed with the next step. If you selected Issue Manager from the Type list box, proceed as follows:
   a. Click Load Projects. This action populates the Project list box with all initialized Issue Manager projects to which the specified user has access. Note that only those projects display for which Issue Manager user groups have been defined and the defined user is a member of at least one user group.
   b. Select the Project where Issue Manager issues are maintained. Warning: Borland recommends against using the same project for Issue Manager and Test Manager, as this limits flexibility in working with both tools on different future projects.
5. Click OK. Test Manager attempts a trial connection to Issue Manager using the information you have provided. Note: If an error occurs, review the login credentials and the service URL that you have supplied, or consult your Issue Manager administrator.
6. If the trial connection to Issue Manager is successful, a confirmation dialog box displays, asking you if you want to map internal issue states to the states of the newly defined profile.
7. Click Yes to proceed with the related Mapping Issue States procedure, or click No to map issue states later.
Related Concepts Issue Tracking Profiles Related Procedures Managing SilkCentral Issue Manager Issue Tracking Profiles Mapping Issue States Related Reference Issue Tracking Profiles Page
Click the Issue Tracking tab in Test Manager Settings. The Issue Tracking page opens, listing all of the issue tracking profiles that have been created for the system. Click the Edit state mapping icon of the issue tracking profile you want to edit. The Edit Status Mapping dialog box opens, listing all existing issue states of the external issue tracking software. These states are listed in the External column. The internal issue states of Test Manager are available in the list boxes in the Internal column. Map internal issue states to corresponding external issue states by selecting the respective entries from the list boxes. Once you have mapped each external issue state to an internal state, click OK to save your settings, or click Cancel to abort the operation.
Related Concepts Issue Tracking Profiles Related Reference Issue Tracking Profiles Page
To launch the Edit Issue Tracking Profile dialog: Edit the Name of the profile. This is the name that will be displayed in issue-tracking profile lists. Edit the Description of the profile. Edit the Issue Manager Username and Password. These credentials are used to access your Issue Manager system. Edit the Issue Manager URL of your Issue Manager installation if the location has changed. This is the URL you use to log in to Issue Manager, though without the login extension at the end. Example: If your standard Issue Manager URL is http://IssueManager/login, then the correct service URL would be http://IssueManager.
Click OK. Test Manager attempts a trial connection to SilkCentral Issue Manager using the information you have provided. Note: If an error occurs, review the login credentials and the service URL that you have supplied, or consult your SilkCentral Issue Manager administrator.
If the trial connection to SilkCentral Issue Manager is successful, you are returned to the Issue Tracking page.
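The relationship between the standard login URL and the service URL described above can be sketched as a small helper. This is illustrative only and not part of Test Manager; it simply strips the trailing /login extension:

```python
# Hypothetical helper: derive the Issue Manager service URL by removing
# the trailing "/login" extension from the standard login URL.
def service_url(login_url: str) -> str:
    suffix = "/login"
    if login_url.endswith(suffix):
        return login_url[: -len(suffix)]
    return login_url
```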
Related Concepts Issue Tracking Profiles Related Procedures Managing SilkCentral Issue Manager Issue Tracking Profiles Mapping Issue States Related Reference Issue Tracking Profiles Page
Select the Issue Tracking tab from Administration Settings in the menu tree. The Issue Tracking page displays, listing all of the issue tracking profiles that have been created for the system. Click the Delete icon of the issue tracking profile you want to delete. A confirmation dialog box displays. Click Yes to remove the selected profile. You are returned to the Issue Tracking page.
Related Concepts Issue Tracking Profiles Related Reference Issue Tracking Profiles Page
In This Section
Adding a new StarTeam issue tracking profile: Adding Borland StarTeam Issue Tracking Profiles Mapping the existing issue states of StarTeam to the states of Test Manager: Mapping Issue States Editing an existing StarTeam issue tracking profile: Editing Borland StarTeam Issue Tracking Profiles Deleting an existing StarTeam issue tracking profile: Deleting Issue Tracking Profiles
Related Concepts Issue Tracking Profiles Related Reference Issue Tracking Profiles Page
To launch the New Issue Tracking Profile dialog: Select StarTeam from the Type list box. Type a valid StarTeam Username and Password. These credentials are used to retrieve the status of existing StarTeam change requests and information required for entering new issues. Type the Hostname of your StarTeam server and the Port that is used to connect to the StarTeam server. If this setting has not been changed, use the default port 49201. Specify the type of Encryption that the profile supports. Click Load Project to load all projects from the server and populate the Project list box, then select a project from the Project list box. Click Load View to load all views for the selected project and populate the View list box, then select a view from the View list box. Click Load Status Field to load all enumeration fields for change requests and populate the Status Field list box, then select a status field from the Status Field list box. If you are using a custom workflow in StarTeam, this field is the workflow driver field in StarTeam that maps to the Test Manager issue state. Click OK. Test Manager attempts a trial connection to Borland StarTeam using the information you have provided. Note: If an error occurs, please review the login credentials and other StarTeam information you have supplied, or consult your StarTeam administrator.
If the trial connection to StarTeam is successful, a confirmation dialog box displays, asking you if you want to map internal issue states to the states of the newly defined profile.
Click Yes to proceed with the related Mapping Issue States procedure. Click No to map issue states later.
Related Concepts Issue Tracking Profiles Related Procedures Managing Borland StarTeam Issue Tracking Profiles Mapping Issue States Related Reference Issue Tracking Profiles Page
Click the Issue Tracking tab in Test Manager Settings . The Issue Tracking page opens, listing all of the issue tracking profiles that have been created for the system. Click the Edit state mapping icon of the issue tracking profile you want to edit. The Edit Status Mapping dialog box opens, listing all existing issue states of the external issue tracking software. These states are listed in the External column. The internal issue states of Test Manager are available in the list boxes in the Internal column. Map internal issue states to corresponding external issue states by selecting the respective entries from the list boxes. Once you have mapped each external issue state to an internal state, click OK to save your settings, or click Cancel to abort the operation.
Related Concepts Issue Tracking Profiles Related Reference Issue Tracking Profiles Page
Edit the Name and the Description of the profile. Edit the StarTeam Username and Password. These credentials are used to access your StarTeam system.
Edit the Hostname of your StarTeam server and the Port that is used to connect to the StarTeam server. Modify the type of Encryption that the profile supports. To change the StarTeam project, click Load Project to load all projects from the server and update the Project list box, then select a project from the Project list box. Note: Reload the View list box to display the updated list of views when you change a project. Click Load View to load all views for the selected project and populate the View list box, then select a view from the View list box.
To change the workflow driver field, click Load Status Field to load all enumeration fields for change requests and populate the Status Field list box, then select a status field from the Status Field list box. If you are using a custom workflow in StarTeam, this field is the workflow driver field in StarTeam that maps to the Test Manager issue state. Click OK. Test Manager attempts a trial connection to Borland StarTeam using the information you have provided. Note: If an error occurs, review the login credentials and the repository info that you have supplied, or consult your Borland StarTeam administrator.
If the trial connection to Borland StarTeam is successful, you are returned to the Issue Tracking page.
Related Concepts Issue Tracking Profiles Related Procedures Managing Borland StarTeam Issue Tracking Profiles Mapping Issue States Related Reference Issue Tracking Profiles Page
Select the Issue Tracking tab from Administration Settings in the menu tree. The Issue Tracking page displays, listing all of the issue tracking profiles that have been created for the system. Click the Delete icon of the issue tracking profile you want to delete. A confirmation dialog box displays. Click Yes to remove the selected profile. You are returned to the Issue Tracking page.
Related Concepts Issue Tracking Profiles Related Reference Issue Tracking Profiles Page
Adding a new Bugzilla issue tracking profile: Adding Bugzilla Issue Tracking Profiles Mapping the existing issue states of Bugzilla to the states of Test Manager: Mapping Issue States Editing an existing Bugzilla issue tracking profile: Editing Bugzilla Issue Tracking Profiles Deleting an existing Bugzilla issue tracking profile: Deleting Issue Tracking Profiles
Related Concepts Issue Tracking Profiles Related Reference Issue Tracking Profiles Page
To launch the New Issue Tracking Profile dialog: Select Bugzilla from the Type list box. Type a valid Bugzilla Username and Password. These credentials are used to retrieve the status of existing Bugzilla issues and information required for entering new issues. Enter the URL of your Bugzilla installation. For example, http://bugzillaserver/cgi-bin/bugzilla/. Note: To establish a connection to Bugzilla, the URL must end with a slash (/).
Click OK. Test Manager attempts a trial connection to Bugzilla using the information you have provided. Note: If an error occurs, review the login credentials and other Bugzilla information you have supplied, or consult your Bugzilla administrator.
If the trial connection to Bugzilla is successful, a confirmation dialog box displays, asking you if you want to map internal issue states to the states of the newly defined profile.
Click Yes to proceed with the related Mapping Issue States procedure. Click No to map issue states later.
Note: Mapping the existing issue states of Bugzilla to the states of Test Manager enables Test Manager to list issues correctly when querying internal and external issues.
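The trailing-slash requirement on the Bugzilla URL is easy to overlook. A minimal sketch of the rule, as a hypothetical helper rather than anything Test Manager provides:

```python
# Hypothetical helper: ensure a Bugzilla profile URL carries the trailing
# slash that the connection requires.
def normalize_bugzilla_url(url: str) -> str:
    return url if url.endswith("/") else url + "/"
```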
Related Concepts Issue Tracking Profiles Related Procedures Managing Bugzilla Issue Tracking Profiles Mapping Issue States Related Reference Issue Tracking Profiles Page
Click the Issue Tracking tab in Test Manager Settings . The Issue Tracking page opens, listing all of the issue tracking profiles that have been created for the system. Click the Edit state mapping icon of the issue tracking profile you want to edit. The Edit Status Mapping dialog box opens, listing all existing issue states of the external issue tracking software. These states are listed in the External column. The internal issue states of Test Manager are available in the list boxes in the Internal column. Map internal issue states to corresponding external issue states by selecting the respective entries from the list boxes. Once you have mapped each external issue state to an internal state, click OK to save your settings, or click Cancel to abort the operation.
Related Concepts Issue Tracking Profiles Related Reference Issue Tracking Profiles Page
To open the Edit Issue Tracking Profile dialog box: Edit the Name and the Description of the profile. Edit the Bugzilla Username and Password. These credentials are used to access your Bugzilla system. Edit the URL of your Bugzilla installation. Click OK. Test Manager attempts a trial connection to Bugzilla using the information you have provided. Note: If an error occurs, review the login credentials and the repository info that you have supplied, or consult your Bugzilla administrator.
If the trial connection to Bugzilla is successful, you are returned to the Issue Tracking page.
Related Concepts Issue Tracking Profiles Related Procedures Managing Bugzilla Issue Tracking Profiles Mapping Issue States Related Reference Issue Tracking Profiles Page
Select the Issue Tracking tab from Administration Settings in the menu tree. The Issue Tracking page displays, listing all of the issue tracking profiles that have been created for the system. Click the Delete icon of the issue tracking profile you want to delete. A confirmation dialog box displays. Click Yes to remove the selected profile. You are returned to the Issue Tracking page.
Related Concepts Issue Tracking Profiles Related Reference Issue Tracking Profiles Page
In This Section
Adding a new IBM Rational ClearQuest issue tracking profile: Adding IBM Rational ClearQuest Issue Tracking Profiles Mapping the existing issue states of IBM Rational ClearQuest to the states of Test Manager: Mapping Issue States Editing an existing IBM Rational ClearQuest issue tracking profile: Editing IBM Rational ClearQuest Issue Tracking Profiles Deleting an existing IBM Rational ClearQuest issue tracking profile: Deleting Issue Tracking Profiles
Related Concepts Issue Tracking Profiles Related Reference Issue Tracking Profiles Page
To launch the New Issue Tracking Profile dialog: Select IBM Rational ClearQuest from the Type list box. Type a valid Rational ClearQuest Username and Password. These credentials will be used to access your IBM Rational ClearQuest system. Enter the Repository Info of your Rational ClearQuest installation. This is the database name that is defined in the Rational ClearQuest client software. Specify the Record Type (the issue type of Rational ClearQuest). When entering an issue in Test Manager, Rational ClearQuest will save the issue with the issue type you define in this text box. The default issue type is Defect. Click OK. Test Manager attempts a trial connection to Rational ClearQuest using the information you have provided. Note: If an error occurs, review the login credentials and the repository info that you have supplied, or consult your Rational ClearQuest administrator.
If the trial connection to Rational ClearQuest is successful, a confirmation dialog box displays, asking you if you want to map internal issue states to the states of the newly defined profile.
Click Yes to proceed with the related Mapping Issue States procedure. Click No to map issue states later.
Related Concepts Issue Tracking Profiles Related Procedures Managing IBM Rational ClearQuest Issue Tracking Profiles Mapping Issue States Related Reference Issue Tracking Profiles Page
Click the Issue Tracking tab in Test Manager Settings . The Issue Tracking page opens, listing all of the issue tracking profiles that have been created for the system. Click the Edit state mapping icon of the issue tracking profile you want to edit. The Edit Status Mapping dialog box opens, listing all existing issue states of the external issue tracking software. These states are listed in the External column. The internal issue states of Test Manager are available in the list boxes in the Internal column. Map internal issue states to corresponding external issue states by selecting the respective entries from the list boxes. Once you have mapped each external issue state to an internal state, click OK to save your settings, or click Cancel to abort the operation.
Related Concepts Issue Tracking Profiles Related Reference Issue Tracking Profiles Page
To launch the Edit Issue Tracking Profile dialog: Edit the Name of the profile. This is the name that is displayed in issue-tracking profile lists. Edit the Description of the profile. Edit the Rational ClearQuest Username and Password. These credentials are used to access your IBM Rational ClearQuest system. Edit the Repository Info of your Rational ClearQuest installation. This is the database name that is defined in the Rational ClearQuest client software. Change the Record Type (the issue type of Rational ClearQuest). When entering an issue in Test Manager, Rational ClearQuest saves the issue with the issue type you define in this field. Click OK. Test Manager attempts a trial connection to Rational ClearQuest using the information you have provided. Note: If an error occurs, review the login credentials and the repository info that you have supplied, or consult your Rational ClearQuest administrator.
If the trial connection to Rational ClearQuest is successful, you are returned to the Issue Tracking page.
Related Concepts Issue Tracking Profiles Related Procedures Managing IBM Rational ClearQuest Issue Tracking Profiles Mapping Issue States Related Reference Issue Tracking Profiles Page
Select the Issue Tracking tab from Administration Settings in the menu tree. The Issue Tracking page displays, listing all of the issue tracking profiles that have been created for the system. Click the Delete icon of the issue tracking profile you want to delete. A confirmation dialog box displays. Click Yes to remove the selected profile. You are returned to the Issue Tracking page.
Related Concepts Issue Tracking Profiles Related Reference Issue Tracking Profiles Page
Choose the Source Control tab from Test Manager Settings in the menu tree. The Source Control page displays, listing all of the source control profiles that have been created for the system. Click the Delete icon of the source control profile you wish to delete. A confirmation dialog box displays. Click Yes.
You are returned to the Source Control page. Related Concepts Source Control Profiles Related Reference Source Control Profiles Page
In This Section
Adding a new StarTeam profile: Adding StarTeam Source Control Profiles Editing an existing StarTeam profile: Editing StarTeam Source Control Profiles Deleting a StarTeam profile: Deleting Source Control Profiles
Related Concepts Source Control Profiles Related Reference Source Control Profiles Page
To open the New Source Control Profile dialog box: Select StarTeam from the Source control system list box. Type the Hostname of your StarTeam server. Type the port that is to be used to connect to the StarTeam server. If the port is not changed, use the default port 49201.
Type a valid StarTeam Username and Password. Specify if the profile supports Encryption. Type the Working folder to which the Test Manager execution server should copy the source files. The working folder must be a local path. Type the Project path you want this profile to use. Alternative: Click Browse next to the Project path text box to connect to the StarTeam system using the credentials you have entered.
Click OK.
Note: If an error occurs, review the repository path and the StarTeam login credentials you have supplied. Or contact your StarTeam administrator. Test Manager attempts a trial connection to StarTeam using the information you have provided. If the trial connection to StarTeam is successful, you are returned to the Source Control page. Related Concepts Source Control Profiles Related Procedures Managing Borland StarTeam Source Control Profiles Editing StarTeam Source Control Profiles Deleting Source Control Profiles Related Reference Source Control Profiles Page
To open the Edit Source Control Profile dialog box: Choose from the following options:
Edit the Name of the profile. This is the name that will be displayed in the Test Manager GUI. Edit the Hostname of your StarTeam server. Edit the port that is to be used to connect to the StarTeam server. If the port is not changed, use the default port 49201. Edit the StarTeam Username and Password. Specify if the profile supports Encryption. Edit the Working folder to which the Test Manager execution server is to copy the source files as required. The working folder must be a local path.
Click OK.
Note: If an error occurs, review the repository path and the StarTeam login credentials you have supplied. Or contact your StarTeam administrator. Test Manager attempts a trial connection to StarTeam using the information you have provided. If the trial connection to StarTeam is successful, you are returned to the Source Control page. Related Concepts Source Control Profiles Related Procedures Managing Borland StarTeam Source Control Profiles Adding StarTeam Source Control Profiles Deleting Source Control Profiles Related Reference Source Control Profiles Page
Choose the Source Control tab from Test Manager Settings in the menu tree. The Source Control page displays, listing all of the source control profiles that have been created for the system. Click the Delete icon of the source control profile you wish to delete. A confirmation dialog box displays. Click Yes.
You are returned to the Source Control page. Related Concepts Source Control Profiles Related Reference Source Control Profiles Page
In This Section
Adding a new PVCS profile: Adding PVCS Source Control Profiles Editing an existing PVCS profile: Editing PVCS Source Control Profiles Deleting a PVCS profile: Deleting Source Control Profiles
Related Concepts Source Control Profiles Related Reference Source Control Profiles Page
To open the New Source Control Profile dialog box: Select PVCS from the Source control system list box. Type the UNC path of the PVCS Repository you want to access. If you do not know the UNC path of the repository, consult your PVCS administrator. Type a valid UNC username and UNC password. These credentials are required to access the UNC path of your repository. Type the Working folder to which the Test Manager execution server should copy the source files. The working folder must be a local path, for example C:\TempSources\. Type the Execution path. This is the local path of the PVCS installation, where the command line tool pcli.exe is located. The default path is C:\Program Files\Serena\vm\win32\bin. Note: The PVCS client software must be installed on the front-end server and each execution server. PVCS must be installed in identical paths on each machine. For example, if you install PVCS on the Test Manager front-end server to C:\Program Files\Serena\, you must install PVCS in the same path on the execution servers.
Type a valid PVCS Username and Password. These credentials will be used to access your PVCS repository. Type the Project path you want this profile to use. Alternative: Click Browse next to the Project path text box to connect to the PVCS system that uses the credentials you have entered. The Select Project Path dialog box opens. Select the desired project path in the tree view and click OK. Leaving this text box empty sets the project path to the root directory.
Click OK.
Note: If an error occurs, review the UNC repository path, the UNC login credentials, the execution path info, and the PVCS login credentials you have supplied. Or contact your PVCS administrator. Test Manager attempts a trial connection to PVCS using the information you have provided. If the trial connection to PVCS is successful, you are returned to the Source Control page.
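Because PVCS must be installed at the same execution path on the front-end server and on every execution server, it can help to verify each machine before saving the profile. The check below is an illustrative sketch, not a Test Manager feature; the default path is the one documented above:

```python
import os

# Documented default execution path for the PVCS command line tool pcli.exe.
DEFAULT_EXECUTION_PATH = r"C:\Program Files\Serena\vm\win32\bin"

def pcli_present(execution_path: str = DEFAULT_EXECUTION_PATH) -> bool:
    """Return True if pcli.exe exists in the given execution path."""
    return os.path.isfile(os.path.join(execution_path, "pcli.exe"))
```

Running this on each server with the same path confirms the identical-installation requirement.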
Related Concepts Source Control Profiles Related Procedures Managing Serena Version Manager (PVCS) Profiles Editing PVCS Source Control Profiles Deleting Source Control Profiles Related Reference Source Control Profiles Page
To launch the Edit Source Control Profile dialog: Choose from the following options:
Edit the Name of the profile. This is the name that will be displayed in the Test Manager GUI. Edit the UNC path of the PVCS Repository. If you do not know the UNC path of the repository, consult your PVCS administrator. Edit the UNC username and UNC password as required. These credentials are required to access the repository UNC path you specified above. Edit the Working folder to which the Test Manager execution server should copy the source files. The working folder must be a local path, for example C:\TempSources\.
Edit the Execution path. This is the local path of the PVCS installation, where the command line tool pcli.exe is located. The default path is C:\Program Files\Merant\vm\win32\bin. Note: The PVCS client software must be installed on the front-end server and each execution server. PVCS must be installed in identical paths on each machine. For example, if you install PVCS on the Test Manager front-end server to C:\Program Files\Merant\, you must install PVCS in the same path on the execution servers.
Edit the PVCS Username and Password. These credentials will be used to access your PVCS repository. Edit the Project path. Click OK.
Note: If an error occurs, review the path to the UNC, the UNC login credentials, the PVCS login credentials, and the execution path info that you have supplied. Or contact your PVCS administrator. Test Manager attempts a trial connection to PVCS using the information you have provided. If the trial connection to PVCS is successful, you are returned to the Source Control page. Related Concepts Source Control Profiles Related Procedures Managing Serena Version Manager (PVCS) Profiles Adding PVCS Source Control Profiles Deleting Source Control Profiles Related Reference Source Control Profiles Page
Choose the Source Control tab from Test Manager Settings in the menu tree. The Source Control page displays, listing all of the source control profiles that have been created for the system. Click the Delete icon of the source control profile you wish to delete. A confirmation dialog box displays. Click Yes.
You are returned to the Source Control page. Related Concepts Source Control Profiles Related Reference Source Control Profiles Page
In This Section
Adding a new CVS profile: Adding CVS Source Control Profiles Editing an existing CVS profile: Editing CVS Source Control Profiles Deleting a CVS profile: Deleting Source Control Profiles
Related Concepts Source Control Profiles Related Reference Source Control Profiles Page
To open the New Source Control Profile dialog box: Choose CVS from the Source control system list box. Type the CVS server name or IP address in the Hostname text box. Type the port that is to be connected to in the Port text box. Specify the connection method in the Method text box. Currently, the ext, pserver, and local connection methods are supported. This makes the Port setting optional. Specify the URL of the CVS Repository you want to access. For example, /var/lib/cvs. If you do not know the URL of the repository, please consult your CVS administrator. Type a valid CVS Username and Password. These credentials will be used to access your CVS repository. Note that these settings are optional when using the ext connection method. Specify the CVS Module that is to be used, then enter the Working folder to which the Test Manager execution server should copy the source files. The working folder must be a local path, e.g., C:\TempSources\. Optional: Type the Project path that you want this profile to use. If the connection is successful, the Select Project Path dialog box will display. Leaving this field empty sets the project path to the root directory. Alternative: Click Browse next to the Project path text box to connect to the CVS system that uses the credentials you have entered. The Select Project Path dialog box opens. Select the desired project path in the tree view and click OK. Leaving this text box empty sets the project path to the root directory.
Click OK. Note: If an error occurs, review the repository path and the CVS login credentials you have supplied. Or contact your CVS administrator.
Test Manager attempts a trial connection to CVS using the information you have provided. If the trial connection to CVS is successful, you are returned to the Source Control page.
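The three supported connection methods correspond to differently shaped CVSROOT connection strings, which is why the Port, Username, and Password settings are optional for some methods. The sketch below shows their general shape; user, host, and repository values are placeholders:

```python
# Illustrative only: the general shape of a CVSROOT string for each
# supported connection method. All values are placeholders.
def cvsroot(method: str, repository: str, user: str = "", host: str = "") -> str:
    if method == "local":
        return repository  # e.g. /var/lib/cvs, no network settings needed
    if method in ("pserver", "ext"):
        return f":{method}:{user}@{host}:{repository}"
    raise ValueError(f"unsupported method: {method}")
```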
Related Concepts Source Control Profiles Related Procedures Managing CVS Profiles Editing CVS Source Control Profiles Deleting Source Control Profiles Related Reference Source Control Profiles Page
To open the Edit Source Control Profile dialog box: Choose from the following options:
Edit the Name of the profile. This is the name that is displayed in the Test Manager GUI. Edit the CVS server name or IP address in the Hostname text box. Edit the port that is to be connected to in the Port text box. Edit the connection method in the Method text box. Currently, the ext, pserver, and local connection methods are supported. This makes the Port setting optional. Edit the URL of the CVS Repository you want to access. If you do not know the URL of the repository, consult your CVS administrator. Edit your CVS Username and Password. Edit the CVS Module that is to be used. Edit the Working folder to which the Test Manager execution server should copy the source files. The working folder must be a local path. For example, C:\TempSources\. Edit the Project path that you want this profile to use. Note: The CVS Username and Password are optional when using the ext connection method.
Click OK. Note: If an error occurs, review the repository path and the CVS login credentials you have supplied. Or contact your CVS administrator.
Test Manager attempts a trial connection to CVS using the information you have provided. If the trial connection to CVS is successful, you are returned to the Source Control page.
Related Concepts Source Control Profiles Related Procedures Managing CVS Profiles Adding CVS Source Control Profiles Deleting Source Control Profiles Related Reference Source Control Profiles Page
Choose the Source Control tab from Test Manager Settings in the menu tree. The Source Control page displays, listing all of the source control profiles that have been created for the system. Click the Delete icon of the source control profile you wish to delete. A confirmation dialog box displays. Click Yes.
You are returned to the Source Control page. Related Concepts Source Control Profiles Related Reference Source Control Profiles Page
In This Section
Adding a new MSVSS profile: Adding MSVSS Source Control Profiles Editing an existing MSVSS profile: Editing MSVSS Source Control Profiles Deleting an MSVSS profile: Deleting Source Control Profiles
Related Concepts Source Control Profiles Related Reference Source Control Profiles Page
To open the New Source Control Profile dialog box: Select MSVSS or MSVSS (cmd line) from the Source control system list box. MSVSS (cmd line) utilizes the MSVSS command line plug-in, which works exactly like MSVSS, except that SilkCentral users are automatically logged out of MSVSS when the user logs out from SilkCentral. When selecting MSVSS, SilkCentral users remain logged in to MSVSS for an indefinite time.
If you selected MSVSS (cmd line), specify the location of the SourceSafe executable ss.exe. SourceSafe must be installed identically on all execution servers and the front-end server. This allows you to specify a definite path. For example, C:\Program Files\Microsoft Visual Studio\VSS\win32\ss.exe. If SourceSafe is installed in different locations, proceed as explained in sub task To configure the location of a SourceSafe client below.
In the SourceSafe database (srcsafe.ini) text box, type the UNC path and file name of the SourceSafe configuration file you want to access. Alternative: Click Browse to locate the SourceSafe configuration file. Note: SourceSafe configuration files use the name srcsafe.ini.
Type a valid UNC username and UNC password. These credentials are required to access the UNC path of the configuration file. Type the Working folder to which the Test Manager execution server should copy the source files. The working folder must be a local path, for example C:\TempSources\. Type a valid SourceSafe Username and Password. These credentials will be used to access your MSVSS database. Type the Project path you want this profile to use. Alternative: Click Browse next to the Project path text box to connect to the MSVSS system using the credentials you have entered.
Click OK.
Note: If an error occurs, review the UNC path of the configuration file, the UNC login credentials, and the MSVSS login credentials you have supplied. Or contact your MSVSS administrator. Test Manager attempts a trial connection to MSVSS using the information you have provided. If the trial connection to MSVSS is successful, you are returned to the Source Control page.
In the SourceSafe executable text box, type ss.exe without any path information. On each execution server and on the front-end server, add the local path of the SourceSafe executable ss.exe to the Windows system path. To do this, click Start > Settings > Control Panel > System.
The System Properties dialog box displays. Select the Advanced tab and click Environment Variables. The Environment Variables dialog box displays. Select the Path variable in the System variables section and click Edit. Add the local path of the SourceSafe executable to the list of existing Variable values. You can append a new variable value to existing values by entering a semicolon (;) followed by the path information. Repeat this procedure for each execution server and for the front-end server.
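Appending a value to the Path variable follows the usual Windows convention of semicolon-separated entries. A minimal sketch of that rule, illustrative only and using a placeholder directory:

```python
# Illustrative only: Windows PATH entries are separated by semicolons, so
# a new entry is appended as ";<path>". The directory is a placeholder.
def append_path_entry(path_value: str, new_entry: str) -> str:
    entries = path_value.split(";")
    if new_entry in entries:
        return path_value  # already present; leave the value unchanged
    return path_value + ";" + new_entry
```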
Related Concepts Source Control Profiles Related Procedures Managing Microsoft Visual SourceSafe (MSVSS) Profiles Editing MSVSS Source Control Profiles Deleting Source Control Profiles Related Reference Source Control Profiles Page
To open the Edit Source Control Profile dialog box: Edit the Name of the profile. This is the name that will be displayed in the Test Manager GUI. If your selected Source control system is MSVSS (cmd line), you can change the location of the SourceSafe executable ss.exe. Borland recommends installing SourceSafe identically on all execution servers and the front-end server. This enables you to specify a definite path. For example, C:\Program Files\Microsoft Visual Studio\VSS\win32\ss.exe. If SourceSafe is installed in different locations, proceed as explained in sub task To configure the location of a SourceSafe client in the related procedure Adding MSVSS Source Control Profiles.
In the SourceSafe database (srcsafe.ini) text box, edit the UNC path and file name of the SourceSafe configuration file, or click Browse to locate the file. If you do not know the location of the configuration file, consult your SourceSafe administrator. Note: SourceSafe configuration files use the name srcsafe.ini.
Edit the UNC username and UNC password. These credentials are required to access your configuration file's UNC path. Edit the Working folder to which the Test Manager execution server should copy the source files. The working folder must be a local path, for example C:\TempSources\. Edit the Username and Password. These credentials will be used to access your MSVSS database. Edit the Project path you want this profile to use. Alternative: Click Browse next to the Project path text box to connect to the MSVSS system using the credentials you have entered.
Click OK.
Note: If an error occurs, review the UNC path of the configuration file, the UNC login credentials, and the MSVSS login credentials that you have supplied, or contact your MSVSS administrator. Test Manager attempts a trial connection to MSVSS using the information you have provided. If the trial connection to MSVSS is successful, you are returned to the Source Control page.
Related Concepts: Source Control Profiles
Related Procedures: Managing Microsoft Visual SourceSafe (MSVSS) Profiles; Adding MSVSS Source Control Profiles; Deleting Source Control Profiles
Related Reference: Source Control Profiles Page
1. Choose the Source Control tab from Test Manager ➤ Settings in the menu tree. The Source Control page displays, listing all of the source control profiles that have been created for the system.
2. Click the Delete icon of the source control profile you wish to delete. A confirmation dialog box displays.
3. Click Yes.
You are returned to the Source Control page.
Related Concepts: Source Control Profiles
Related Reference: Source Control Profiles Page
In This Section
Adding a new SVN profile: Adding Subversion Source Control Profiles
Editing an existing SVN profile: Editing Subversion Source Control Profiles
Deleting an SVN profile: Deleting Source Control Profiles
Related Concepts: Source Control Profiles
Related Reference: Source Control Profiles Page
To open the New Source Control Profile dialog box:
1. Choose Subversion from the Source control system list box.
2. Type the URL of the Subversion repository you want to access. If you do not know the URL of the repository, consult your Subversion administrator.
3. Type a valid Subversion Username and Password. These credentials will be used to access your Subversion repository.
4. Type the Working folder to which the Test Manager execution server should copy the source files. The working folder must be a local path, for example C:\TempSources\.
5. Optional: Type the Project path that you want this profile to use. Leaving this text box empty sets the project path to the root directory. Alternative: Click Browse next to the Project path text box to connect to the Subversion system using the credentials you have entered. If the connection is successful, the Select Project Path dialog box displays. Select the desired project path in the tree view and click OK.
Click OK. Note: If an error occurs, review the repository path and the Subversion login credentials you have supplied. Or contact your Subversion administrator.
Test Manager attempts a trial connection to Subversion using the information you have provided. If the trial connection to Subversion is successful, you are returned to the Source Control page.
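For reference, the same three fields the trial connection verifies (repository URL, username, password) can be checked by hand with the Subversion command-line client. The sketch below only builds the command, it does not run it; the repository URL and credentials shown are hypothetical:

```python
def svn_list_command(url, username, password):
    """Build an 'svn list' invocation that exercises the same profile
    fields Test Manager checks during its trial connection. The command
    is returned as an argument list, not executed."""
    return [
        "svn", "list",
        "--username", username,
        "--password", password,
        "--non-interactive",  # fail instead of prompting, as a server would
        url,
    ]

# Hypothetical repository and credentials, for illustration only:
cmd = svn_list_command("http://svn.example.com/repos/tests", "tester", "secret")
```

Running such a command succeeds only if all three values are valid, which is exactly what a successful trial connection indicates.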
Related Concepts: Source Control Profiles
Related Procedures: Managing Subversion Profiles; Editing Subversion Source Control Profiles; Deleting Source Control Profiles
Related Reference: Source Control Profiles Page
To launch the New Source Control Profile dialog: Choose from the following options:
Edit the Name of the profile. This is the name that will be displayed in the Test Manager GUI. Edit the URL of the Subversion repository you want to access. If you do not know the URL of the repository, consult your Subversion administrator. Edit your Subversion Username and Password. Edit the Working folder to which the Test Manager execution server should copy the source files. The working folder must be a local path, for example C:\TempSources\.
Click OK. Note: If an error occurs, review the repository path and the Subversion login credentials you have supplied.
Test Manager attempts a trial connection to Subversion using the information you have provided. If the trial connection to Subversion is successful, you are returned to the Source Control page, where the new profile is listed.
Related Concepts: Source Control Profiles
Related Procedures: Managing Subversion Profiles; Adding Subversion Source Control Profiles; Deleting Source Control Profiles
Related Reference: Source Control Profiles Page
1. Choose the Source Control tab from Test Manager ➤ Settings in the menu tree. The Source Control page displays, listing all of the source control profiles that have been created for the system.
2. Click the Delete icon of the source control profile you wish to delete. A confirmation dialog box displays.
3. Click Yes.
You are returned to the Source Control page.
Related Concepts: Source Control Profiles
Related Reference: Source Control Profiles Page
In This Section
Adding a new UNC profile: Adding UNC Source Control Profiles
Editing an existing UNC profile: Editing UNC Source Control Profiles
Deleting a UNC profile: Deleting Source Control Profiles
Related Concepts: Source Control Profiles
Related Reference: Source Control Profiles Page
To open the New Source Control Profile dialog box:
1. Select UNC from the Source control system list box.
2. Type the UNC path that you want to access. This is the path to the location where your test definition sources are located.
3. Type the Working folder to which the Test Manager execution server should copy the source files. The working folder must be a local path, for example C:\TempSources\.
4. Type a valid UNC Username and Password. These credentials will be used to access your UNC repository.
5. Click OK.
Note: If an error occurs, review the repository path and the UNC login credentials you have supplied, or contact your UNC administrator. Test Manager attempts a trial connection to UNC using the information you have provided. If the trial connection to UNC is successful, you are returned to the Source Control page.
Related Concepts: Source Control Profiles
Related Procedures: Managing UNC Profiles; Editing UNC Source Control Profiles; Deleting Source Control Profiles
Related Reference: Source Control Profiles Page
To launch the New Source Control Profile dialog: Choose from the following options:
Edit the Name of the profile. This is the name that will be displayed in the Test Manager GUI. Edit the UNC path. This is the path to where your test definition sources are located. Edit the Working folder to which the Test Manager execution server should copy the source files. The working folder must be a local path, for example C:\TempSources\. Edit the UNC Username and Password. These credentials are required to access your UNC repository.
Click OK.
Note: If an error occurs, review the repository path and the UNC login credentials you have supplied. Test Manager attempts a trial connection to UNC using the information you have provided. If the trial connection to UNC is successful, you are returned to the Source Control page, where the new profile is listed.
Related Concepts: Source Control Profiles
Related Procedures: Managing UNC Profiles; Adding UNC Source Control Profiles; Deleting Source Control Profiles
Related Reference: Source Control Profiles Page
1. Choose the Source Control tab from Test Manager ➤ Settings in the menu tree. The Source Control page displays, listing all of the source control profiles that have been created for the system.
2. Click the Delete icon of the source control profile you wish to delete. A confirmation dialog box displays.
3. Click Yes.
You are returned to the Source Control page.
Related Concepts: Source Control Profiles
Related Reference: Source Control Profiles Page
In This Section
Adding a new VFS profile: Adding VFS Source Control Profiles
Editing an existing VFS profile: Editing VFS Source Control Profiles
Deleting a VFS profile: Deleting Source Control Profiles
Related Concepts: Source Control Profiles
Related Reference: Source Control Profiles Page
To open the New Source Control Profile dialog box:
1. Select VFS from the Source control system list box.
2. Type the URL of the VFS repository you want to access. Specify the appropriate protocol type in the URL:
FTP: ftp://<ftp server URL>
HTTP: http://<http server URL>
SMB: smb://<Samba server URL>
Note: HTTP, FTP, and SMB are also supported for zipped files. In order to point to a zipped file, the URL must be adjusted to <zipped file type>:<protocol>://<server URL pointing to zipped file> to include the type of the zipped file. For example, zip:http://193.80.200.135/<path>/archive.zip or jar:http://193.80.200.135/<path>/archive.jar.
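As a sketch of the URL pattern in the note above, the zipped-file type is simply prepended to the plain protocol URL. The helper function and the /files/ path segment are hypothetical, shown only to illustrate the format:

```python
def zipped_vfs_url(archive_type, inner_url):
    """Compose a VFS URL for a zipped file per the pattern in the note:
    <zipped file type>:<protocol>://<server URL pointing to zipped file>.
    archive_type is e.g. 'zip' or 'jar'; inner_url is the plain
    HTTP/FTP/SMB URL of the archive."""
    return f"{archive_type}:{inner_url}"

# Mirrors the note's example; the 'files' path segment is hypothetical.
url = zipped_vfs_url("zip", "http://193.80.200.135/files/archive.zip")
```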
3. Type a valid VFS Username and Password. These credentials will be used to access your VFS repository. The SMB protocol allows including the domain name in the username in the following form: domain/username.
4. Enter the Working folder to which the Test Manager execution server should copy the source files. The working folder must be a local path, for example C:\TempSources\.
5. Optional: Type the Project path that you want this profile to use. Leaving this text box empty sets the project path to the root directory. Alternative: Click Browse next to the Project path text box to connect to the VFS system using the credentials you have entered. If the connection is successful, the Select Project Path dialog box displays. Select the desired project path in the tree view and click OK.
Click OK.
Note: If an error occurs, review the repository path and the VFS login credentials you have supplied. Or contact your VFS administrator. Test Manager attempts a trial connection to VFS using the information you have provided. If the trial connection to VFS is successful, you are returned to the Source Control page.
Related Concepts: Source Control Profiles
Related Procedures: Managing VFS Profiles; Editing VFS Source Control Profiles; Deleting Source Control Profiles
Related Reference: Source Control Profiles Page
To launch the New Source Control Profile dialog: Choose from the following options:
Edit the Name of the profile. This is the name that will be displayed in the Test Manager GUI. Edit the URL of the VFS repository you want to access. Edit the VFS Username and Password. These credentials will be used to access your VFS repository. Type the Working folder to which the Test Manager execution server should copy the source files. The working folder must be a local path, for example C:\TempSources\.
Click OK.
Note: If an error occurs, review the repository path and the VFS login credentials you have supplied, or contact your VFS administrator. Test Manager attempts a trial connection to VFS using the information you have provided. If the trial connection to VFS is successful, you are returned to the Source Control page.
Related Concepts: Source Control Profiles
Related Procedures: Managing VFS Profiles; Adding VFS Source Control Profiles; Deleting Source Control Profiles
Related Reference: Source Control Profiles Page
1. Choose the Source Control tab from Test Manager ➤ Settings in the menu tree. The Source Control page displays, listing all of the source control profiles that have been created for the system.
2. Click the Delete icon of the source control profile you wish to delete. A confirmation dialog box displays.
3. Click Yes.
You are returned to the Source Control page.
Related Concepts: Source Control Profiles
Related Reference: Source Control Profiles Page
Choose Test Manager ➤ Settings in the menu tree.
If you have not already selected a project, a warning message will appear, asking you to select a project. Select the project for which you want to define global settings.
Select the Project Settings tab to view the current settings. The Project Settings page displays the current project settings. Click Edit to modify the current project settings. The Edit Project Settings dialog box displays. You can specify the following information:
Build Information File Name: Build information files contain project information, including build number, build log location, error log location, and build location. Enter the name of your project's build information file in this field. All test executions will read the build information from this specified file.
Project Release Date: Enter your project's planned release date in the format MM/DD/YYYY.
File Extensions to ignore in Results: Specify result file types or other file types that should not be saved as results for test executions. Note: File extensions must be separated by commas (for example, xlg, *_, res). Changes made in the Build Information File Name and File Extensions to ignore in Results fields will not affect scheduled test definitions. To redistribute tasks to execution servers, you must reschedule test definitions, or disconnect from and reconnect to the database.
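As an illustration of how a comma-separated ignore list like xlg, *_, res might be applied to a set of result files (a hypothetical model, not Test Manager's actual implementation):

```python
def filter_result_files(files, ignore_spec):
    """Drop files whose extensions appear in a comma-separated ignore list,
    as entered in 'File Extensions to ignore in Results'. Leading '*' and
    '.' wildcard characters in each entry are stripped before comparison."""
    ignored = {ext.strip().lstrip("*.") for ext in ignore_spec.split(",")}
    return [f for f in files if f.rsplit(".", 1)[-1] not in ignored]

# Only run.log survives the ignore list from the example above.
kept = filter_result_files(["run.log", "trace.xlg", "data.res"], "xlg, *_, res")
```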
Related Concepts: Settings Configuration
Related Procedures: Configuring Projects - Quick Start Task; Configuring Test Manager Settings
Related Reference: Settings Unit Interface
Managing Requirements
This section explains how to work with requirements in Test Manager.
In This Section:
Creating Requirements - This section explains how to create requirements with Test Manager.
Customizing Requirement Properties - This section explains how to customize requirement properties with Test Manager.
Integrating External RM Tools - This section explains how to integrate an external requirements management tool with Test Manager.
Collapsing or Expanding the Requirements tree - Describes how to consolidate and display levels of the hierarchy based on your viewing needs.
Switching Between Full and Direct Coverage Modes - Describes how to switch between full and direct coverage modes.
Creating Requirements
This section explains how to create requirements with Test Manager.
In This Section:
Managing Requirement Attachments - This section explains how to manage requirement attachments with Test Manager.
Configuring Requirement Types - Describes how to configure a requirement type.
Creating Requirements - Describes how to create requirements directly in Test Manager.
Assigning Test Definitions from Grid View to Requirements - Describes how to assign test definitions from Grid View to requirements.
Assigning Test Definitions to Requirements Manually - Describes how to manually assign test definitions to requirements.
Creating Child Requirements - Describes how to create child requirements.
Editing Requirements - Describes how to edit requirements.
Finding Requirement Properties - How to find requirement properties.
Generating Test Plans from Requirements View - How to generate a new test plan from Requirements View.
Locating Assigned Test Definitions in the Test Plan Tree - How to locate assigned test definitions in the Test Plan tree.
Marking Requirements as Obsolete - How to mark a requirement as obsolete, rather than deleting it.
Removing Test Definition Assignments - How to remove a test-definition assignment from a requirement.
Replacing Requirement Properties - How to replace a requirement property.
Sorting the Assigned Test Definitions Tab - How to sort test definitions on the Assigned Test Definitions tab.
Tracking the History of a Requirement - How to track the history of a requirement.
Click Requirements on the workflow bar. Select a requirement in the Requirement tree view. Select the Attachments tab. When requirements management integration has been enabled between a Test Manager project and a CaliberRM project, the Attachments tab (Requirements Attachments) includes an Open CaliberRM button, which enables you to manage requirement attachments directly in CaliberRM.
Click Upload File to open the Upload File dialog box. Using Browse, select a file from your local file system. Enter a meaningful Description for the attachment. Click OK to upload the attachment to the server and associate it with the selected requirement.
Note: Attaching files to a test plan element may not work in Mozilla Firefox. Firefox requires usage of three slashes (for example, "file:///") for a file link, while other browsers require only two (for example, "file://"). Additionally, Firefox includes a security feature that blocks links from remote files to local files and directories. For more information, see http://kb.mozillazine.org/Firefox_:_Issues_:_Links_to_Local_Pages_Don't_Work
Related Concepts: Attachments; Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Managing Requirements
Related Reference: Requirement Attachments tab
Click Requirements on the workflow bar. Select a requirement in the requirement tree view. Select the Attachments tab. When requirements management integration has been enabled between a Test Manager project and a CaliberRM project, the Attachments tab (Requirements Attachments) includes an Open CaliberRM button, which enables you to manage requirement attachments directly in CaliberRM.
Click Attach Link to open the Attach Link dialog box. Enter a URL in the Name field. Enter a meaningful description for the attached link. Click OK to associate the link with the selected requirement.
Related Concepts: Attachments; Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Managing Requirements
Related Reference: Requirement Attachments tab
Click Requirements on the workflow bar. Select the requirement in the Requirement tree view for which you want to delete an attachment. Select the Attachments tab to see a list of all attachments that are associated with the requirement. When requirements management integration has been enabled between a Test Manager project and a CaliberRM project, the Attachments tab (Requirements Attachments) includes an Open CaliberRM button, which enables you to manage requirement attachments directly in CaliberRM.
Click the Delete icon of the attachment you want to delete. Click Yes on the confirmation dialog to delete the attachment from the project. Note: Only one attachment at a time can be deleted.
Related Concepts: Attachments; Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Managing Requirements
Related Reference: Requirement Attachments tab
Click Requirements on the workflow bar. Select the requirement in the Requirement tree view for which you want to edit a requirement attachment description. Select the Attachments tab to see the list of attachments that are associated with the requirement. When requirements management integration has been enabled between a Test Manager project and a CaliberRM project, the Attachments tab (Requirements Attachments) includes an Open CaliberRM button, which enables you to manage requirement attachments directly in CaliberRM.
Select the attachment for which you want to edit the description and click Edit. Edit the description on the Edit File Attachment dialog box. Click OK to save your changes.
Related Concepts: Attachments; Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Managing Requirements
Related Reference: Requirement Attachments tab
Click Requirements on the workflow bar. From the Requirements tree view, select the requirement for which you want to view an attachment. Select the Attachments tab to see a list of all attachments that are associated with the requirement. Each attachment name serves as a link. File-attachment links open Save As dialog boxes, enabling you to download attachments to your local file system. Link attachments open their destinations directly in newly spawned browser windows. Note: When requirements management integration has been enabled between a Test Manager project and a CaliberRM project, the Attachments tab (Requirements Attachments) includes an Open CaliberRM button, which enables you to manage requirement attachments directly in CaliberRM.
Related Concepts: Attachments; Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Managing Requirements
Related Reference: Requirement Attachments tab
Click Requirements on the workflow bar. Note: Configuration of the requirement type for CaliberRM, Requisite Pro, and DOORS is only enabled for top-level requirements in the tree (requirements that are a direct child of the project node). All other requirements share the requirement type of their parents. A requirement without a configured requirement type is not available for upload. Importing requirements automatically configures the appropriate requirement type.
From Requirements View, at the requirement level, select the Properties tab. Click Map Requirement to select a requirement type from the list. Requirement type is a categorization used by CaliberRM, Requisite Pro, and DOORS and is required for synchronization. Note: Map Requirement is only enabled when external requirements integration is enabled in the Settings unit (Integrations Configuration tab) and the requirement has not yet been uploaded to the external requirements management tool. Additionally, the option Enable upload of requirements to... must be enabled.
Related Concepts: Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Test Coverage Status; Managing Requirements
Related Reference: Requirement Properties tab
Creating Requirements
Test Manager allows you to create new requirements, edit and delete existing requirements, and add custom property fields to requirements. Newly created Test Manager projects do not contain requirements.
Click Requirements on the workflow bar.
Click New Requirement on the toolbar. Note: If the project you are working with does not yet have any requirements associated with it, click the <Click here to add Requirements> link in the Requirements tree to open the New Requirement dialog box.
On the New Requirement dialog box, enter a meaningful Name and Description for the requirement. Note: Test Manager supports HTML formatting and cutting and pasting of HTML content for description fields.
Select the appropriate Priority, Risk, and Reviewed status from the list boxes. If custom requirement properties have been defined, enter any custom property data that you want tracked with this requirement in the Custom Property text box. Note: The Priority, Risk, Reviewed, and any Custom Property fields will be configured automatically with the corresponding properties of the parent requirement if you check the Inherit from parent check boxes for these properties.
Click OK to create a new top-level requirement. Note: Alternatively, you can click OK and New Requirement to both save the newly created requirement and open the New Requirement dialog box to create an additional top-level requirement. Or, you can click OK and New Child Requirement to have the New Child Requirement dialog box open after the new top-level requirement is created.
Related Concepts: Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Creating Child Requirements; Managing Requirements
Related Reference: Requirements Unit Interface; HTML Support for Description Text Boxes
Test Definition Name
Test Definition Status
Last Execution of the test definition
To assign one or more test definitions from the test plan Grid View to one or more requirements:
1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar.
3. Select the test definitions you want to assign to requirements. You can use your keyboard's Ctrl and Shift keys to select multiple test definitions using standard browser multi-select functions.
4. Right-click the selected test definitions and choose Save Selection.
5. Click Requirements on the workflow bar.
6. Select the requirement to which you want to assign the selected test definitions.
7. Choose Assigned Test Definitions.
8. Click Assign Saved Selection.
Note: Only test definitions that reside in the requirement's test container are assigned. You can assign the selected test definitions to more than one requirement, but you cannot assign them to requirements in a different project. The selection persists until you make a different selection or close Test Manager.
Related Concepts: Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Creating Requirements; Assigning Test Definitions to Requirements Manually
Related Reference: Assigned Test Definitions tab
Click Requirements on the workflow bar. In the Requirements tree view, select the requirement to which you want to assign test definitions. In Requirements View, select the Assigned Test Definitions tab. Note: The Available Test Definitions window can be expanded/collapsed by clicking the black triangular button on the window splitter (the left-hand edge of the window).
Click the arrow of any test definition you want to assign to the currently selected requirement. Clicking the arrow of a test container or test folder assigns the test definitions that are located in those containers or folders to the selected requirement (test definitions that are located within sub-folders of those containers and folders are also assigned).
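The recursive behavior described above (test definitions inside sub-folders of a container or folder are assigned as well) can be modelled as a simple tree walk. The node structure used here is hypothetical:

```python
def collect_test_definitions(node):
    """Return all test definitions under a container/folder node, including
    those nested in sub-folders, modelling how clicking the arrow of a
    container assigns its nested test definitions too. The node shape
    {'tests': [...], 'children': [...]} is an assumption for illustration."""
    found = list(node.get("tests", []))
    for child in node.get("children", []):
        found.extend(collect_test_definitions(child))
    return found

container = {"tests": ["t1"], "children": [{"tests": ["t2", "t3"], "children": []}]}
```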
Related Concepts: Test Coverage Status; Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Managing Requirements; Assigning Test Definitions from Grid View to Requirements
Related Reference: Assigned Test Definitions tab
Click Requirements on the workflow bar. In the tree view, select the requirement under which you would like to create a child requirement. Select New Child Requirement to open the New Child Requirement dialog box. For each of the available requirement property fields, you have the option of having the child requirement inherit its values from its parent. By default, all Inherit from parent check boxes are checked, so all parent traits are inherited. To specify a property value other than that held by the parent, uncheck the corresponding Inherit from parent check box to unlock that property's list box or edit field. Then select the specific value that the child requirement is to have. In Document View, asterisks (*) are placed next to requirement properties for which values have been inherited from parent requirements. Note: Child requirements can be created at any level in the tree view hierarchy other than the top level. There is virtually no limit to the number of child requirements that can be inserted at a single hierarchy level.
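The Inherit from parent behavior can be pictured as a per-property resolution step; this sketch is illustrative only, not Test Manager's code:

```python
def resolve_property(child_value, inherit_from_parent, parent_value):
    """Resolve one requirement property for a child requirement: if the
    'Inherit from parent' check box is checked, the parent's value wins;
    otherwise the explicitly chosen child value is used."""
    return parent_value if inherit_from_parent else child_value

# Priority is inherited, Risk is overridden (values are hypothetical):
priority = resolve_property(None, True, "High")
risk = resolve_property("Low", False, "Medium")
```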
Related Concepts: Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Creating Requirements; Managing Requirements
Related Reference: Requirements Unit Interface
Editing Requirements
To edit requirement properties:
Click Requirements on the workflow bar. Select a requirement in the Requirements tree. The properties of the selected requirement are displayed on the Properties tab. Click Edit Properties on the Properties tab to open the Edit Requirement dialog box. Note: The Edit Requirement dialog box can also be accessed through the Edit command on the toolbar and by right-clicking a requirement in the Requirements tree and selecting Edit.
Edit the values displayed on the Edit Requirement dialog box as required. The default behavior is to inherit values from the parent requirement; uncheck the Inherit from parent check boxes to disable value inheritance. Click OK to save your changes. Note: For details regarding creating, editing, and deleting custom requirement properties, see Custom Requirement Properties.
Related Concepts: Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Managing Requirements
Related Reference: Requirements Unit Interface
To find a requirement:
Click Requirements on the workflow bar. Click Find on the toolbar to open the Find dialog box. Note: This command can also be executed by right-clicking a requirement and selecting Find.
From the Find in list box, select a requirement property to be searched. This list is automatically populated with all standard property fields and any custom property fields you may have created. Define your search criteria in the Find what portion of the dialog box. Note: The Find what portion of the dialog box offers fields and list boxes with pre-populated values that are based on the property you select in the Find in list box above. The UI controls available in the Find what portion of the dialog box also vary based on the property type selected in the Find in list box. For example, selecting a custom date-type property enables one or more date fields. To specify an exact date, select exactly from the list box, then click the calendar button next to the date field to specify a date using the calendar tool. Alternatively, you can select before or after from the list box. To select a date range, select between from the list box, then click the calendar buttons next to the date fields to specify start and end dates.
Selecting the Reviewed property enables a list box from which you can select either Yes or No. Selecting the Requirement name or Description property enables a text box in which you can enter a text string.
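The date criteria described above (exactly, before, after, between) amount to simple comparisons. A sketch, assuming the between range is inclusive of its bounds (the dialog does not say):

```python
from datetime import date

def matches_date_criterion(value, mode, start, end=None):
    """Evaluate a date-type search criterion: 'exactly', 'before',
    'after', or 'between' (inclusive bounds assumed). Illustrative only."""
    if mode == "exactly":
        return value == start
    if mode == "before":
        return value < start
    if mode == "after":
        return value > start
    if mode == "between":
        return start <= value <= end
    raise ValueError(f"unknown mode: {mode}")

hit = matches_date_criterion(date(2009, 6, 15), "between",
                             date(2009, 6, 1), date(2009, 6, 30))
```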
Click OK to begin your search. The first requirement that meets the search criteria will be highlighted in the tree view. Click Find Next on the Find dialog box to advance to the next requirement in the list that meets your search criteria. Click Find Previous on the Find dialog box to return to the previous requirement in the list that meets your search criteria.
Related Concepts: Custom Requirement Properties; Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Customizing Requirement Properties; Managing Requirements
Related Reference: Requirement Properties Page
1. Click Requirements on the workflow bar.
2. From Requirements View, with at least one requirement available in the Requirements tree, right-click the requirement or project node that is to be converted into a Test Plan tree.
3. Select Generate Test Plan to display the Generate Test Plan from Requirements dialog box. This dialog box enables you to specify whether the leaves (lowest-level nodes) of the selected requirement's subtree should be converted into test definitions or test folders, and whether the tree should be generated into a new test container or an existing container.
4. Enter a name for the new test container in the Enter Name field and select a product from the Select Product list box to create the container within the active Test Manager project. The Select Product list box is populated with the products that are configured by a project manager. See the SilkCentral Administration Module documentation or ask your project manager for detailed information.
5. If you have defined a source control profile (see the SilkCentral Administration Module documentation or ask your Test Manager administrator for detailed information), select the source control profile you want to use for managing the test definition sources from the Select Source Control Profile list box.
6. To include all child requirements of the selected requirement in the test plan, check the Include child requirements check box (the default).
7. To have the new test definitions that you generate automatically assigned to the requirements from which they are created, check the Assign newly generated Test Definitions to Requirements check box. If this option is not selected, test definitions must be manually associated with requirements. Note: This option is not available when checking Generate Test Folders from Requirement Tree leaves.
8. Click OK to create the test plan, which has the same structure as the Requirements tree. A message displays, asking if you want to switch directly to the Test Plan unit. Click Yes to view the test plan in Test Manager's Test Plan unit, or click No to remain in the Requirements unit.
Related Concepts: Test Plan Generation; Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Managing Requirements
Related Reference: Requirements Unit Interface
1. Click Requirements on the workflow bar.
2. Select a requirement in the Requirements tree that has at least one test definition assigned to it.
3. Select the Assigned Test Definitions tab.
4. In the Actions column of a test definition, click the icon to find out in which test folder or container the test definition is stored.
The corresponding test folder/container is then highlighted in the Test Plan window.
Related Concepts: Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Test Coverage Status; Managing Requirements
Related Reference: Assigned Test Definitions tab
To delete a requirement (without destroying it permanently):

1. Right-click the requirement you want to delete.
2. Select Delete.
3. Make sure that the Destroy permanently check box is not checked and click Yes.

To recover a deleted requirement:

1. Click Requirements on the workflow bar.
2. Right-click a requirement in the tree view.
3. Select Recover.

To permanently destroy a requirement:

1. Click Requirements on the workflow bar.
2. Select a requirement in the tree view.
3. Select Delete.
4. Make sure that the Destroy permanently check box is checked and click Yes.
5. Click Yes on the Delete Requirement dialog box.
Related Concepts: Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Managing Requirements
Related Reference: Requirements Unit Interface
1. Click Requirements on the workflow bar.
2. Select a requirement in the Requirements tree that has at least one test definition assigned to it.
3. In the Actions column of the test definition you want to remove, click Delete.
4. Click Yes on the confirmation dialog box to confirm deletion of the assignment. Note: To remove all test-definition assignments from the selected requirement, click Remove All.
Related Concepts: Test Coverage Status; Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Managing Requirements
Related Reference: Assigned Test Definitions tab
1. Click Requirements on the workflow bar.
2. Select a requirement in the Requirements tree.
3. Click Replace on the toolbar to open the Replace dialog box. Note: This command can also be executed by right-clicking a requirement and selecting Replace.
4. From the Find in list box, select a requirement property to be searched. This list is automatically populated with all standard property fields and any custom property fields you may have created.
5. Define your search criteria in the Find what portion of the dialog box. Note: The Find what portion of the dialog box offers fields and list boxes with pre-populated values that are based on the property you select in the Find in list box. The UI controls available in the Find what portion of the dialog box also vary based on the property type selected in the Find in list box. For example, selecting a custom date-type property enables one or more date fields. To specify an exact date, select exactly from the list box, then click the calendar button next to the date field to specify a date using the calendar tool. Alternatively, you can select before or after from the list box. To select a date range, select between from the list box, then click the calendar buttons next to the date fields to specify start and end dates. Selecting the Reviewed property enables a list box from which you can select either Yes or No. Selecting the Requirement name or Description property enables a text box in which you can enter a text string.
6. In the Replace with text box, enter the alternate property data that you want to have replace the identified data.
7. Click OK to find the first instance of the property you want to replace. The first requirement that meets the search criteria will be highlighted in the tree view.
8. Click Replace to replace only the selected instance of the property data, or click Replace all to replace all instances of the property data throughout all requirements in the project. Note: Using the Replace all option will overwrite inherited requirement properties with the new value, thus removing the inheritance setting of a child requirement. Use the Replace option only on a parent requirement if you want the child requirements to inherit the new value.
Click Find Next on the Find dialog box to advance to the next requirement in the list that meets your search criteria. Or click Find Previous on the Find dialog box to return to the previous requirement in the list that meets your search criteria.
Related Concepts: Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Managing Requirements
Related Reference: Requirement Properties Page
1. Click Requirements on the workflow bar.
2. Select a requirement in the Requirements tree that has more than one test definition assigned to it.
3. Click the column header of the property by which you want to sort the test definitions. A small upward- or downward-pointing arrow indicates both the column upon which the sort is based and the direction of the sort (ascending or descending).
4. If required, click the column header again to reverse the direction of the sort.
Related Concepts: Test Coverage Status; Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Managing Requirements
Related Reference: Requirement Coverage tab; Assigned Test Definitions tab
1. Click Requirements on the workflow bar.
2. Select a requirement in the Requirements tree.
3. Select the History tab of the Requirements view. Note: When requirements management integration has been enabled between a Test Manager project and a CaliberRM project, the History tab (Requirements History) includes an Open CaliberRM button, which enables you to view the history of synchronized requirements directly in CaliberRM.
The properties of all revisions that have been logged by Test Manager are displayed in tabular format.
Related Concepts: Requirement History; Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Viewing Recent Changes; Managing Requirements
Related Reference: Requirement History tab
1. Navigate to Settings.
2. Select the Requirement Properties tab.
3. Click New Property to display the New Custom Requirement Property dialog box.
4. Enter a name for the new property in the Name field.
5. Select the data Type of the new property (integer, string, Boolean, or Date) from the Type list box.
6. Click OK to make your custom property available to all requirements in the active Test Manager project.
Related Concepts: External Requirements Management Tools; Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Working with External Properties; Managing Requirements; Customizing Requirement Properties; Finding Requirement Properties
Related Reference: Requirement Properties tab
1. Navigate to Settings.
2. Select the Requirement Properties tab.
3. Click the Delete icon to display the Delete Custom Requirement Property confirmation dialog box.
4. Click Yes to confirm the deletion.
Related Concepts: External Requirements Management Tools; Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Working with External Properties; Managing Requirements; Customizing Requirement Properties
Related Reference: Requirement Properties tab
1. Navigate to Settings.
2. Select the Requirement Properties tab.
3. Click the name of the property you want to edit. The Edit Custom Requirement Property dialog box displays.
4. Edit the name of the property in the Name field.
5. Click OK to save your changes.
Related Concepts: External Requirements Management Tools; Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Working with External Properties; Managing Requirements; Customizing Requirement Properties
Related Reference: Requirement Properties tab
1. From the project to which you want to establish integration, select the Settings link on the menu tree.
2. Select the Integrations Configuration tab.
3. Click the Borland CaliberRM Configure button to display the Edit Configuration dialog box.
4. Enter the Hostname of the machine where the external server is installed.
5. Enter valid Username and Password credentials for the requirements management server.
6. Click Test Connection to confirm that the host and user credentials you have entered are correct. You will receive a Test connection was successful message if the settings are correct. Click OK. Note: Consult your system administrator if you are not able to establish a connection.
7. From the Project field, select the external project with which the Test Manager project is to be integrated. The requirement types that are available with the selected project are automatically populated into the Requirement Types field, and the available baselines are automatically populated into the Baseline field.
8. Select a baseline from the external project that should be integrated with the Test Manager project.
9. Select one or more requirement types from the external project that should be integrated with the Test Manager project (hold down the Ctrl key to select multiple requirement types). Your selections are displayed on the Edit Configuration dialog box. Click OK.
10. Back on the Edit Configuration dialog box, check the Enable creation of unassigned requirements check box to enable creation and editing of unmapped requirements in Test Manager projects that are configured for integration with CaliberRM.
11. Check the Enable upload of requirements to CaliberRM check box to enable the upload of unmapped/unassigned requirements from Test Manager to CaliberRM. This allows you to upload additional, previously unmapped requirement trees to CaliberRM and then have those requirements mapped within Test Manager. When this option is enabled, the Map Requirement button becomes enabled (Requirements Properties), enabling configuration of top-level requirements for external requirement types, which is required when uploading unmapped requirements.
12. Click OK to save your settings.
Related Concepts: External Requirements Management Tools; Baseline Support for CaliberRM Integration; Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Copying CaliberRM-Integrated Projects; Working with External Properties; Managing Requirements; Customizing Requirement Properties
Related Reference: Requirement Properties tab
1. From the project to which you want to establish integration, select the Settings link on the menu tree.
2. Select the Integrations Configuration tab.
3. Click the IBM Rational RequisitePro Configure button to display the Edit Configuration dialog box.
4. Enter (or click Browse and select) the UNC project path of the machine where the external server is installed.
5. Enter the UNC Username and UNC Password of the machine where the external server is installed.
6. Enter valid User name and Password credentials for the requirements management server.
7. Click Test Connection to confirm that the host and user credentials you have entered are correct. You will receive a Test connection was successful message if the settings are correct. Click OK. Note: Consult your system administrator if you are not able to establish a connection.
8. Click Edit Packages and Requirement Types to open the Browse Packages & Requirement Types dialog box. The packages and requirement types that are available with the selected project are automatically populated into the Packages and Requirement Types fields.
9. In the Packages field, select one or more packages from the external project that should be integrated with the Test Manager project (hold down the Ctrl key to select multiple packages).
10. In the Requirement types field, select one or more requirement types from the external project that should be integrated with the Test Manager project (hold down the Ctrl key to select multiple requirement types).
11. Click OK. Your selections are then displayed on the Edit Configuration dialog box. Note: Only requirements of explicitly selected packages will be synchronized. Selecting a parent package does not select the child packages of the parent.
12. Back on the Edit Configuration dialog box, check the Enable creation of unassigned requirements check box to enable creation and editing of unmapped requirements in Test Manager projects that are configured for integration with RequisitePro.
13. Check the Enable upload of requirements to RequisitePro check box to enable the upload of unmapped/unassigned requirements from Test Manager to RequisitePro. This allows you to upload additional, previously unmapped requirement trees to RequisitePro and then have those requirements mapped within Test Manager. When this option is enabled, the Map Requirement button becomes enabled (Requirements Properties), enabling configuration of top-level requirements for external requirement types, which is required when uploading unmapped requirements.
14. Click OK to save your settings.
Related Concepts: External Requirements Management Tools; Baseline Support for CaliberRM Integration; Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Working with External Properties; Managing Requirements; Customizing Requirement Properties
Related Reference: Requirement Properties tab
To install the DOORS client on the Test Manager front-end server machine:

1. Download the DOORS plug-in package (two zip archives: DoorsRMPlugin.zip and DoorsClientLibs.zip) from Help > Tools.
2. Create a new folder with the name testmanager in the \lib\dxl folder of your Telelogic DOORS client installation. The default pathname for this folder is C:\Program Files\Telelogic\DOORS_8.20\lib\dxl\testmanager.
3. Extract all DOORS script files from DoorsClientLibs.zip to this folder.
4. The plug-in package DoorsRMPlugin.zip is automatically installed to the Plugins folder of your Test Manager application server installation during the setup process. During startup of the application server, this plug-in is published to all front-end servers.
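Steps 2 and 3 above amount to a plain folder-create-and-extract operation. A minimal sketch in Python; the function name is ours for illustration, and the paths shown are the defaults named in the steps, not values the product requires:

```python
import zipfile
from pathlib import Path

def install_doors_client_libs(zip_path, doors_install_dir):
    """Extract the DOORS script files from DoorsClientLibs.zip into
    <DOORS install>/lib/dxl/testmanager, creating the folder if needed."""
    target = Path(doors_install_dir) / "lib" / "dxl" / "testmanager"
    target.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(zip_path) as archive:
        archive.extractall(target)
    return target

# Example with the default installation path from step 2:
# install_doors_client_libs(
#     r"C:\Downloads\DoorsClientLibs.zip",
#     r"C:\Program Files\Telelogic\DOORS_8.20",
# )
```

The same effect can of course be achieved with any archive tool; the point is only that the script files must end up in the testmanager subfolder of \lib\dxl.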
1. From the Test Manager project to which you want to establish integration, navigate to Settings > Integrations Configuration.
2. Click the Telelogic DOORS Integration Configure button to display the Edit Configuration dialog box.
3. In the RM service URL field, enter the URL of Test Manager's DOORS requirement Web Service. The default value should already point to the correct location. Example: http://MySCTMHost:19120/services/doorsrequirementsmanagement
4. Enter valid Username and Password credentials for the requirements management server.
5. The default DOORS client installation path is displayed in the DOORS Installation Path field on the Edit Configuration dialog box. If this path is not correct, click Browse to browse to and select the correct destination in the front-end server directory structure.
6. Click Test Connection to confirm that the host and user credentials you have entered are correct. You will receive a Connection to Telelogic DOORS was successful message if the settings are correct. Click OK to proceed. Note: Consult your system administrator if you are not able to establish a connection.
7. Click the second Browse button (alongside the Project name field) to advance to the Browse Requirement Types dialog box.
8. From the Project field, select the external project with which the Test Manager project is to be synchronized. The requirement types that are available with the selected project are automatically populated into the Requirement types field.
9. Select the requirement types that are to be synchronized (hold down the Ctrl key to select multiple requirement types) and click OK. Your selections are now displayed on the Edit Configuration dialog box. Click OK.
10. Back on the Edit Configuration dialog box, check the Enable creation of unassigned requirements check box to enable creation and editing of unmapped requirements in Test Manager projects that are configured for integration with DOORS.
11. Check the Enable upload of requirements to Telelogic DOORS check box to enable the upload of unmapped/unassigned requirements from Test Manager to DOORS. This allows you to upload additional, previously unmapped requirement trees to DOORS and then have those requirements mapped within Test Manager. When this option is enabled, the Map Requirement button becomes enabled (Requirements Properties), enabling configuration of top-level requirements for external requirement types, which is required when uploading unmapped requirements.
12. Click OK to save the configuration data to the database.

Warning: As the DOORS application object is used for communication (and this object does not support login data, but rather requires a running DOORS client), Test Manager starts each DOORS client process with the provided login data and then uses that same data for all subsequent application objects. Therefore only one set of DOORS login credentials is supported for communication at one time. It is recommended that you use the same DOORS credentials for all configurations so that integration tasks can be performed on the front-end server for all projects at the same time. When a second set of credentials is used, the second set only works after all sessions using the first set of credentials have timed out.
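If Test Connection fails after the pre-filled RM service URL has been edited, it can help to verify that the URL still follows the documented pattern (only the host and port should normally vary) and that anything answers HTTP there at all. A small sketch under those assumptions; the helper names are ours, not part of the product:

```python
from urllib.request import urlopen
from urllib.error import URLError

def rm_service_url(front_end_host, port=19120):
    """Build the default DOORS requirement Web Service URL
    (pattern taken from the example in step 3)."""
    return f"http://{front_end_host}:{port}/services/doorsrequirementsmanagement"

def is_reachable(url, timeout=5):
    """Return True if anything answers an HTTP request at the given URL."""
    try:
        urlopen(url, timeout=timeout)
        return True
    except URLError:
        return False

print(rm_service_url("MySCTMHost"))
# prints http://MySCTMHost:19120/services/doorsrequirementsmanagement
```

A reachability check like this only rules out basic network problems; credential or DOORS-client issues still have to be diagnosed through the dialog box and your administrator.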
Related Concepts: External Requirements Management Tools; Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Working with External Properties; Managing Requirements; Customizing Requirement Properties
Related Reference: Requirement Properties tab
1. Navigate to Settings > Integrations Configuration and verify that the baseline you want to save is selected.
2. If the correct baseline is not selected, click Edit Configuration. The Edit Configuration dialog box displays. Click Browse next to the Project name field. On the Browse Projects dialog box, select the baseline you want to save, then confirm your selection. Note: When a baseline is changed, a synchronization must be performed before an associated Test Manager project can be copied, to update the project requirements with the baseline changes. The integration configuration is only copied if a baseline other than the current baseline is selected. If the current baseline is selected, the user is prompted to specify whether to keep the integration configuration in the original project or move it to the copied project.
3. Navigate to Projects and click Copy Project in the Actions column of the project you want to copy. The Copy Project dialog box displays. Note: This is a SilkCentral Administration Module task. See the SilkCentral Administration Module documentation for full details regarding copying projects.
4. Select the items you want to copy into the new project, then confirm your selection.
5. After the project has been successfully copied, apply the baseline you want to continue working with to the project you are working on. Note: It does not matter whether you continue working with the original project or a copy of it. After copying, the original and the copy are identical; by applying the correct baseline you determine which project you are working on.
Related Concepts: Baseline Support for CaliberRM Integration; External Requirements Management Tools; Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Working with External Properties; Managing Requirements; Customizing Requirement Properties
Related Reference: Requirement Properties tab
1. Click Requirements on the workflow bar.
2. Select the requirement for which you intend to edit external properties.
3. Select the Properties tab.
4. Click Edit External Properties to display the Edit External Properties dialog box. All properties of the external requirement are displayed here.
5. Edit all properties as required. Note: Editable properties on this dialog box offer input fields and controls with which you can edit the properties. If a mapping rule exists for an attribute, the attribute is tagged with a trailing asterisk (*).
Related Concepts: External Requirements Management Tools; Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Working with External Properties; Managing Requirements; Customizing Requirement Properties
Related Reference: Requirement Properties tab
1. Click Requirements on the workflow bar.
2. Select the requirement for which you intend to view external properties.
3. Select the Properties tab.
4. Click View External Properties to display the View External Properties dialog box. All properties of the external requirement are displayed.
5. Close the dialog box.
Related Concepts: External Requirements Management Tools; Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Working with External Properties; Managing Requirements; Customizing Requirement Properties
Related Reference: Requirement Properties tab
1. From the project for which you are deleting property mapping, select the Settings link on the menu tree.
2. Click Edit Property Mapping for the configured external tool.
3. Select the property-mapping value pair in the Custom property mapping select box.
4. Click Remove Mapping.
5. Click OK on the Edit Property Mapping dialog box to save your changes.
Related Concepts: Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Working with External Properties; Managing Requirements
Related Reference: Requirement Properties tab
1. Select the project, and then select the Settings link on the menu tree.
2. Click the Disable Configuration button of the requirements-management tool for which you want to disable integration. All integration data and functionality is then disabled, but not deleted from the project.
Related Concepts: External Requirements Management Tools; Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Working with External Properties; Managing Requirements; Customizing Requirement Properties
Related Reference: Requirement Properties tab
1. Select the project, and then select the Settings link on the menu tree.
2. Click Edit Property Mapping for the configured external tool.
3. Select an external requirement type from the Requirement types list. All custom requirements of that type are then displayed below in the selection box.
4. Select the custom requirement property for which you are establishing mapping.
5. From the list box on the right, select the Test Manager custom property to establish mapping to the external custom property you have selected.
6. Click Add Mapping to map the requirements. The results are displayed in the Custom property mapping box. The System property mapping box displays the two pre-configured mappings for requirement name and description, which cannot be removed.
7. Click OK on the Edit Property Mapping dialog box to save your changes.
Related Concepts: Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Working with External Properties; Managing Requirements; Customizing Requirement Properties
Related Reference: Requirement Properties tab
1. Select the project, and then select the Settings link on the menu tree.
2. Click Remove Configuration for the requirements-management tool for which you want to remove integration (this button is only enabled if the integration configuration has been disabled).
3. Click Yes on the Remove External Integration dialog box to delete the configuration. All related data is then deleted from the database.
Related Concepts: External Requirements Management Tools; Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Working with External Properties; Managing Requirements; Customizing Requirement Properties
Related Reference: Requirement Properties tab
1. Click Requirements on the workflow bar.
2. Select the Project node in the Requirements tree view.
3. Select the Properties tab.
4. Click Synchronize Requirements.
5. Click Yes on the Synchronize Requirements confirmation dialog box to begin synchronization. A dialog box opens when synchronization is complete, displaying synchronization statistics, including the number of requirements that have been created, updated, and deleted.
6. Click OK to complete the synchronization. Any updates that were made to mapped requirements in your externally configured requirements management tool are now reflected in the Requirements tree in Test Manager.
Automatic synchronization of requirements between Test Manager and external requirements management tools can be configured to occur based on global schedules.
1. Click Settings on the workflow bar.
2. Select the Integrations Configuration tab.
3. Click Edit Schedule. The Edit Schedule dialog box opens.
4. Click the Global option button.
5. Select a pre-defined global schedule from the selection list. Note: See the SilkCentral Administration Module documentation for details about configuring global schedules.
6. Click OK.
Notification settings can be defined to alert users through email when errors occur during automated synchronization of requirements between Test Manager and external requirements management tools. Notification recipients receive copies of synchronization log files.
1. Click Settings on the workflow bar.
2. Select the Integrations Configuration tab.
3. Click Edit Notification. The Edit Notification dialog box displays.
4. Check the Enable notification check box.
5. Select a user name from the Username list. Note: See the SilkCentral Administration Module documentation for details about defining users.
6. If required, add additional email addresses for other recipients in the Other email addresses text-entry box. Use semicolons to separate multiple email addresses.
7. Click OK.
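The Other email addresses box expects semicolons between addresses. A one-line sketch of how such a list is interpreted (the helper is ours for illustration, not a Test Manager API):

```python
def parse_recipients(raw):
    """Split a semicolon-separated address list, dropping blanks
    and surrounding whitespace around each address."""
    return [addr.strip() for addr in raw.split(";") if addr.strip()]

print(parse_recipients("qa.lead@example.com; build@example.com ;"))
# prints ['qa.lead@example.com', 'build@example.com']
```

Commas or other separators are not part of the documented format; stick to semicolons when entering multiple recipients.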
Related Concepts: Synchronizing Requirements; Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Working with External Properties; Managing Requirements; Customizing Requirement Properties
Related Reference: Requirement Properties tab
1. Click Requirements on the workflow bar.
2. Right-click a requirement folder within the Requirements tree and select a collapse or expand option.
Related Concepts: Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Requirements Tree; Managing Requirements
Related Reference: Requirements Unit Interface
1. Click Requirements on the workflow bar.
2. Click Full/Direct Coverage on the toolbar to switch to the alternative view.
3. Click Full/Direct Coverage again to return to the previous view.
Related Concepts: Requirements Management
Related Procedures: Managing Requirements - Quick Start Task; Full Coverage and Direct Coverage Modes; Managing Requirements
Related Reference: Requirements Unit Interface
1. Click Test Plan on the workflow bar.
2. Select the test definition to which you are assigning requirements.
3. In Test Plan View, select the Assigned Requirements tab. All requirements that are available for assignment are displayed in the Available Requirements window. Note: The Available Requirements window can be broadened or narrowed by dragging the window splitter (the left-hand edge of the window) to the left or right.
4. Click the arrow of any requirement to assign it to the currently selected test definition. Note: Newly generated test definitions can automatically be assigned to the requirements from which they are generated by checking the Assign newly generated test definitions to requirements check box on the Generate Test Plan from Requirements dialog box (the default behavior).
Related Concepts: Test Plan Management; Upload Manager
Related Procedures: Managing Test Plans - Quick Start Task; Working with Data-Driven Tests; Managing Test Plans; Generating Test Plans from Requirements View
Related Reference: Test Plan Assigned Requirements tab
1. Click Test Plan on the workflow bar.
2. Select a test definition.
3. Select the Assigned Requirements tab.
4. In the Actions column of a requirement, click the icon to find out in which node of the Available Requirements tree the requirement is stored.
The corresponding parent-requirement node is then expanded and the assigned requirement is highlighted.
Related Concepts: Test Plan Management; Upload Manager
Related Procedures: Managing Test Plans - Quick Start Task; Managing Test Plans; Generating Test Plans from Requirements View
Related Reference: Test Plan Assigned Requirements tab
1. Click Test Plan on the workflow bar.
2. Select a test definition in the Test Plan tree that has at least one requirement assigned to it.
3. Select the Assigned Requirements tab.
4. In the Actions column, click the delete button of the assigned requirement.
5. Click Yes on the confirmation dialog box to confirm deletion of the assignment. Note: To remove all requirement assignments from the selected test definition, click Remove All.
Related Concepts: Test Plan Management; Upload Manager
Related Procedures: Managing Test Plans - Quick Start Task; Managing Test Plans; Generating Test Plans from Requirements View
Related Reference: Test Plan Assigned Requirements tab
Sorting Requirements

To sort requirements on the Assigned Requirements tab:

1. Click the column header of the property by which you want to sort the requirements. A small upward- or downward-pointing arrow indicates both the column upon which the sort is based and the direction of the sort (ascending or descending).
2. If required, click the column header again to reverse the direction of the sort.
Related Concepts: Test Plan Management; Upload Manager
Related Procedures: Managing Test Plans - Quick Start Task; Managing Test Plans; Generating Test Plans from Requirements View
Related Reference: Test Plan Assigned Requirements tab
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select the test definition to which you are assigning an attribute.
4. Select the Attributes tab.
5. Click Add Attribute to display the Add Attributes dialog box.
6. Click the plus symbol (+) of the attribute that you are assigning. Based on the attribute type you have selected (set or normal), you are presented with an Edit Attribute dialog box, which allows you to specify which of the available attribute values you would like to assign to the test definition.
7. Select the required value and click OK to assign the attribute. Note: A Set type attribute allows you to assign a set of values to an attribute; a Normal type attribute allows you to assign only a single value.
Related Concepts: Test Plan Management; Upload Manager
Related Procedures: Managing Test Plans - Quick Start Task; Managing Test Plans; Generating Test Plans from Requirements View
Related Reference: Test Plan Attributes tab
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select the test definition for which you wish to delete an assigned attribute.
4. Select the Attributes tab.
5. Click the delete icon of the attribute you are deleting. The Delete Attribute confirmation dialog box displays.
6. Click Yes to delete the attribute. Note: Inherited attributes cannot be deleted.
Related Concepts: Test Plan Management; Upload Manager
Related Procedures: Managing Test Plans - Quick Start Task; Managing Test Plans; Generating Test Plans from Requirements View
Related Reference: Test Plan Attributes tab
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select the test definition for which you are editing an assigned attribute.
4. Select the Attributes tab.
5. Click the Edit Attribute button of the attribute you are editing. The Edit Attribute dialog box displays (the options available on the Edit Attribute dialog box vary depending on the attribute type that you have selected).
6. Select the required value and click OK to save your settings.
Related Concepts: Test Plan Management; Upload Manager
Related Procedures: Managing Test Plans - Quick Start Task; Managing Test Plans; Generating Test Plans from Requirements View
Related Reference: Test Plan Attributes tab
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select the test definition node for which you are editing an existing parameter.
4. Select the Parameters tab.
5. In the row of the parameter you want to edit, click Edit. The Add Custom Parameter dialog box displays.
6. Edit the parameter values as required. Note: Inherited parameters cannot be edited. Uncheck the Inherit from parent check box to enable editing of the parameter's Value setting. Parameter Name and Type settings cannot be edited.
Related Concepts Test Definition Parameters Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Generating Test Plans from Requirements View Related Reference Test Plan Parameters tab
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select the test definition node for which you are adding a predefined parameter.
4. Select the Parameters tab.
5. Click Add Predefined Parameter to display the Add Predefined Parameter dialog box, which lists all of the project attributes that are available in the project file.
Note: The Add Predefined Parameter button is only available for SilkPerformer test definitions for which the Project property has already been defined.
6. To add any of the listed parameters, click the corresponding add icon.
7. On the dialog box that displays, specify the actual value for the parameter.
8. Click Save to add the parameter to the active Test Plan tree node.
Related Concepts Test Definition Parameters Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Generating Test Plans from Requirements View Related Reference Test Plan Parameters tab
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select the test definition node for which you are clearing the assignment of an existing parameter.
4. Select the Parameters tab.
5. Click the clear button that corresponds to the parameter that is being cleared.
Note: Inherited parameters cannot be cleared. Uncheck the Inherit from parent check box on the Set Parameter dialog box to enable clearing of a parameter.
6. Click Yes on the Clear Parameter dialog box to clear the parameter.
Related Concepts Test Definition Parameters Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Generating Test Plans from Requirements View Related Reference Test Plan Parameters tab
1. On the Test Definition dialog box, select SilkTest plan from the Type list box and then click Next. The SilkTest Plan Properties dialog box opens.
2. In the Plan File text box, type the fully qualified name of the test plan file to be executed. Click Browse to browse for the file.
3. In the SilkTest Project File text box, type the name of the SilkTest project file containing the file and environment settings. Click Browse to browse for the project file.
4. In the Option Set text box, type the fully qualified name of the option set file containing environment settings. Click Browse to browse for the option set file.
5. In the Data file for attributes and queries text box, type the default path of the test plan initialization file. Click Browse to browse for the test plan initialization file.
6. In the Test plan query name text box, type the fully qualified name of the saved test plan query.
7. Click Finish.
Related Concepts Test Definition Parameters Related Procedures Creating Test Definitions Editing Test Definitions
1. On the Test Definition dialog box, select .NET Explorer Test from the Type list box and then click Next. The .NET Explorer Test Properties dialog box opens.
2. Browse to and select the .NET Explorer script (.nef file) to apply to the test definition.
3. Browse to and select the executable that executes the selected script file (NetExplorer.exe), for example C:\Program Files\MyCustomSPFolder\DotNET Explorer\NetExplorer.exe.
4. In the Test case text box, type the name of the test case to execute. If this text box is left blank, all test cases within the script are executed.
Note: The test cases InitTestCase and EndTestCase are always executed.
5. Click Finish.
Related Concepts Test Definition Parameters Related Procedures Creating Test Definitions Editing Test Definitions
1. On the Test Definition dialog box, select JUnit Test from the Type list box and then click Next. The JUnit Test Properties dialog box opens.
2. In the Test class text box, type the fully qualified name of the JUnit test class.
3. In the Test method text box, type the name of the appropriate test method. The method must be available in the test class. If the Test method text box is left blank, all tests that are included in the suite are executed.
4. Set the Java home directory to the installation path of the Java Runtime Environment (JRE). The path must be valid on the execution server on which the test definition runs.
5. Specify a valid Java Classpath to use on the execution server. Borland recommends using a relative classpath, which is expanded to the full classpath on the execution server; with a relative classpath, changes to the location of the source control profile do not require additional changes to the classpath. The relative classpath must point to the root node of the test container containing the JUnit test definition, for example JUnit_tests. On the execution server, the relative classpath is expanded to include the source control profile's working folder, for example C:\temp, and the test file names, for example JUnit4Test.jar. The relative path to junit.jar, with the appropriate JUnit version, must also be added to the classpath. For example, the relative entries junit-4.4.jar;JUnit4Test.jar are expanded to C:\temp\JUnit_tests\junit-4.4.jar;C:\temp\JUnit_tests\JUnit4Test.jar on the execution server. You can also use a fully qualified classpath. The fully qualified classpath must point to the archive or folder in which the test classes reside. Further, junit.jar must be added to the classpath, with the appropriate JUnit version, as the following examples show:
C:\Java\junit3.8.1\junit.jar;C:\MyApps\main.jar;C:\MyApps\utils.jar
${junit_home}\junit.jar;${apps_home}\main.jar;${apps_home}\utils.jar
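The relative-classpath expansion described above can be sketched in a few lines of Java. This is an illustrative assumption, not Test Manager's actual implementation: the working folder (C:\temp) and container root node (JUnit_tests) are the examples from the text, and each relative entry is simply prefixed with both.

```java
import java.util.Arrays;
import java.util.stream.Collectors;

public class ClasspathExpansion {
    // Hypothetical sketch of how an execution server might expand a relative
    // classpath: prefix each semicolon-separated entry with the source control
    // profile's working folder and the test container root node.
    static String expand(String workingFolder, String containerRoot, String relativeClasspath) {
        return Arrays.stream(relativeClasspath.split(";"))
                .map(entry -> workingFolder + "\\" + containerRoot + "\\" + entry)
                .collect(Collectors.joining(";"));
    }

    public static void main(String[] args) {
        // Example values from the text above.
        System.out.println(expand("C:\\temp", "JUnit_tests", "junit-4.4.jar;JUnit4Test.jar"));
    }
}
```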
6. In the Coverage path text box, type the JAR libraries or the specific class files to monitor for code coverage information. Borland recommends using the relative coverage path from the test container root node, which is then expanded on the execution server. You can also use a fully qualified path. Use semicolons to separate multiple JAR files, as the following examples show:
C:\MyApps\main.jar;C:\MyApps\utils.jar
${apps_home}\main.jar;${apps_home}\utils.jar
Note: The coverage path setting is disregarded if the Record external AUT Coverage check box is checked.
7. Check the Record external AUT Coverage check box to get code coverage for the application under test that is defined for the execution definition in the Code Analysis Settings portion of Execution Deployment. If the check box is not checked, code coverage is recorded from the executing virtual machine. The check box is not checked by default.
8. Click Finish.
Note: Parameters are passed to the Java process as system properties, for example -Dhost_under_test=10.5.2.133. Use the System.getProperty() method to access the system properties. For example, to access the previously passed host_under_test property, use System.getProperty("host_under_test");.
Related Concepts Test Definition Parameters Related Procedures Creating Test Definitions Editing Test Definitions
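A JUnit test class can read such a parameter as sketched below. The property name host_under_test is the example used in the text; the fallback default value is an assumption added for illustration, not something Test Manager provides.

```java
import java.util.Objects;

public class ParameterAccess {
    // Test definition parameters arrive as JVM system properties
    // (e.g. -Dhost_under_test=10.5.2.133); read one with a fallback
    // default so the test also runs outside Test Manager.
    static String hostUnderTest() {
        return Objects.requireNonNullElse(
                System.getProperty("host_under_test"), "localhost");
    }

    public static void main(String[] args) {
        System.out.println("host under test: " + hostUnderTest());
    }
}
```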
1. On the Test Definition dialog box, select Manual Test from the Type list box.
2. In the Planned time text box, type the expected amount of time for this manual step to execute and then click Next. The Add Manual Test Definition Step dialog box displays.
Note: Manual test steps are automatically timed in seconds from the moment you begin execution. These values are available in Detail view, not Step-by-Step view.
3. Specify a name, an action description, and the expected results for the first step of the manual test.
Note: Test Manager supports HTML formatting as well as the cutting and pasting of HTML content for text boxes.
4. Click OK.
5. Optional: Click New Step to add additional steps to your manual test.
Related Concepts Manual Tests Manual Test Definitions Test Definition Parameters Related Procedures Working with Manual Tests Creating Test Definitions Editing Test Definitions
1. On the Test Definition dialog box, select NUnit Test from the Type list box and then click Next. The NUnit Properties dialog box displays.
2. Click Browse to locate and select the NUnit assembly from which you want to pull a test definition.
3. Type the working directory in the NUnit Directory text box. This directory is the local path to the file nunit-console.exe, for example C:\Program Files\NUnit 2.2\bin.
Note: If only one version of NUnit is installed on your computer, you can leave the NUnit Directory text box blank. If multiple versions are installed, you must provide a valid path.
4. Click Finish.
Related Concepts Test Definition Parameters Related Procedures Creating Test Definitions Editing Test Definitions
1. On the Test Definition dialog box, select SilkPerformer Test from the Type list box and then click Next. The SilkPerformer Test Properties - Select Project dialog box opens.
2. Perform one of the following steps to define the SilkPerformer project from which your test case is taken:
- Click Browse to select a SilkPerformer project that has been saved to your local file system.
- Click Import to import a SilkPerformer project that is saved to the file pool. A file in the file pool can be used anywhere in the Test Plan tree. On the Import Project From File Pool dialog box, either select a saved SilkPerformer project package (.ltz) from the File pool entry list box or click Browse to select a SilkPerformer project package that has been saved to the source-control system. If you check the Remove file from file pool check box before you click Finish, the selected SilkPerformer file is deleted from the file pool.
Perform one of the following steps to upload SilkPerformer projects to the file pool:
- From the Administration module, click Upload and browse to the appropriate LTZ file. For more information, refer to the SilkCentral Administration Module documentation.
- Use the upload mechanism offered by SilkPerformer.
- Use the Upload Manager in SilkCentral.
- Use an existing project directory by way of a UNC path. (Create a new test definition, click Browse to select the appropriate LTP file, and then select a workload.)
3. On the SilkPerformer Test Properties - Select Project dialog box, click Next.
4. On the SilkPerformer Test Properties - Select Workload dialog box, select one of the workload profiles that has been defined for the project from the Workload list box.
5. Click Finish to create the test case. Test Manager is fully integrated with SilkPerformer.
Related Concepts Working With SilkPerformer Projects Test Definition Parameters Related Procedures Creating Test Definitions Editing Test Definitions
1. On the SilkTest Test Properties - Select Test Script dialog box, click Browse and select the test script file from either the defined SilkTest project or the source control directory. Express the source control directory as a relative path to the root node defined in the test container.
2. Click Next. The SilkTest Test Properties - Select Testcase dialog box opens.
Note: If the SilkTest script is a data-driven .g.t file, for example SilkTestScript1.g.t, then data sources are completely controlled within the script file and not through Test Manager's data-driven properties. The Data-driven check box is checked by default when you use a data-driven script file. For more information about data-driven SilkTest tests, refer to the SilkTest documentation.
3. Select a test case from the available test cases in the defined script file or specify a custom test case.
4. If required, specify an option set file.
5. Click Finish to create the SilkTest test definition.
Note: If you possess SilkTest test cases that require more than one hour to complete, adjust Test Manager's time-out settings. Otherwise, Test Manager assumes an error has occurred and terminates the execution. For details about time-out settings, refer to the SilkCentral Administration Module documentation. You can use the Test Properties - Select Test Script dialog box to import multiple test cases. To access the Test Properties - Select Test Script dialog box from the Test Definition dialog box, select SilkTest Multi-testcase import from the Type list box and click Next. Follow the steps described above to complete the task.
Related Concepts SilkTest Test Definitions Test Definition Parameters Related Procedures Creating Test Definitions Editing Test Definitions
1. From the Windows Scripting Properties dialog box, click Browse and select a Windows scripting test script.
2. Specify any required additional parameters in the Switches field.
Note: You may add other switches to be passed to the script. For more details on the switches that can be used, see the Windows Script Host Tests topic and consult the Microsoft Scripting Host documentation.
3. Click Finish.
Related Concepts Windows Script Host Tests Test Definition Parameters Related Procedures Creating Test Definitions Editing Test Definitions
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select the test definition node for which you are creating a new parameter.
4. Select the Parameters tab.
5. Click Add Custom Parameter to display the Add Custom Parameter dialog box.
6. Provide a name for the parameter.
7. Select the parameter type (String, Number, Float, Boolean, Password, or Character).
8. Define the parameter value that is to be assigned to the selected test definition.
Note: Values for parameters of type String must be set in quotation marks ("") if you want to use the parameter in SilkTest executions.
9. Click OK. The parameter now displays in the Parameters list for the selected node.
Note: Parameters are automatically assigned to all sub-folders and child test definitions of the nodes to which they've been assigned.
Related Concepts Test Definition Parameters Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Generating Test Plans from Requirements View Related Reference Test Plan Parameters tab
1. Run the test definition once to create the output.xml file, which contains the structure of the test package.
2. In the Test Plan tree, right-click the name of the test definition and choose Convert to Test Package. The selected test definition is converted to a hierarchy representing the structure of the last execution result.
1. Click Test Plan on the workflow bar.
2. Select a container or folder node in the Test Plan tree where you want to insert a new test definition.
3. Click New Test Definition on the toolbar or right-click within the tree and choose New Test Definition. A new test definition node is appended to the tree view, and the Test Definition dialog box opens.
4. Specify a name and meaningful description for the test definition.
Note: Test Manager supports HTML formatting as well as the cutting and pasting of HTML content for text boxes.
5. Select one of the following test definition types from the Type list box:
- SilkTest test
- SilkPerformer test
- Manual test
- SilkTest Multi-testcase import
- NUnit test
- Windows scripting test
- JUnit test
- SilkTest plan
6. Proceed to the configuration procedure for the selected type:
- If you are configuring a SilkTest test, proceed to Configuring a SilkTest Test.
- If you are configuring a SilkPerformer test, proceed to Configuring a SilkPerformer Test.
- If you are configuring a manual test, proceed to Configuring a Manual Test.
- If you are configuring a SilkTest multi-testcase import, proceed to Configuring SilkTest Multi-Testcase Import.
- If you are configuring an NUnit test, proceed to Configuring an NUnit Test.
- If you are configuring a Windows scripting test, proceed to Configuring a Windows Scripting Test.
- If you are configuring a JUnit test, proceed to Configuring a JUnit Test.
- If you are configuring a SilkTest plan test, proceed to Configuring a SilkTest plan Test.
- If you are configuring a .NET Explorer test, proceed to Configuring a .NET Explorer Test.
Note: Test Manager's well-defined public API allows you to implement a proprietary solution that meets your automated test needs. Test Manager is open and extensible to any external tool that can be invoked from a Java implementation or through a command-line call.
Note: Throughout the test-definition configuration process and across all test definition types, Inherit from parent check box options are provided where applicable, enabling you to accept settings of any existing parent entity.
Related Concepts Upload Manager Test Plan Management Test Definition Parameters Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Configuring SilkTest Test Properties Configuring SilkPerformer Test Properties Configuring Manual Test Properties Configuring JUnit Test Properties Configuring SilkTest Plan Properties Configuring NUnit Test Properties Configuring Windows Scripting Test Properties Configuring .Net Explorer Test Properties Editing Test Definitions Related Reference Test Plan Unit Interface APIs HTML Support for Description Text Boxes
1. Click Test Plan on the workflow bar.
2. Select the test definition or the test package that you want to edit.
Note: Test Manager supports HTML formatting as well as the cutting and pasting of HTML content for text boxes.
3. Click Edit on the toolbar or under the General Properties section in the tab view. The Edit Test Definition dialog box displays.
4. Specify the name and description of the selected test definition.
5. If the selected test definition is a test package, the Update Package Structure on Result check box is available. Check this check box if you want to update the structure of the test package according to the results of the test execution run.
6. Configure the properties of the test definition or the test package according to the test definition type.
Related Concepts Upload Manager Test Plan Management Test Definition Parameters Test Packages Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Configuring Test Definition Parameters Related Reference Test Plan Unit Interface APIs HTML Support for Description Text Boxes
1. Click Test Plan on the workflow bar.
2. Right-click a test definition that you want to try out in the Test Plan tree.
3. Select Try Run Test Definition. The test definition is executed immediately, and the Go To Activities dialog box displays.
4. Click Yes if you want to view the Activities page (see also Activities Overview), or click No if you want to remain on the current Web page.
Note: Check the Don't show this dialog again (during this login session) check box if you don't want to be asked about switching to the Activities page again in the future. Note that this setting is discarded when you log out of Test Manager.
You can analyze the results on the Activities page (Test Manager/Projects/Activities). See Activities Overview for detailed information about the Activities page.
Related Concepts Manual Tests Manual Test Definitions Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Working with Manual Tests Managing Test Plans Related Reference Test Plan Unit Interface
1. Create a test plan in SilkTest. See the SilkTest documentation for details.
2. From the SilkTest Testplan menu, select Upload to Test Manager. The SilkCentral Administration Module Login screen displays.
3. Enter your user name and password.
4. From the Project list box on the Upload Testplan file to Test Manager dialog box, select the Test Manager project to which you are uploading the file.
5. Click OK.
6. Click OK on the Upload Testplan Complete confirmation dialog box.
7. Open the Test Manager Test Plan unit. You will see the uploaded project listed as a test container in the Test Plan tree with the same name as the imported SilkTest test plan.
Note: To work with the new test container, you may have to edit source control profile settings or other settings.
8. To edit the test container, select the container in the Test Plan tree. Click Edit on the Properties tab to open the Edit Test Container dialog box.
9. Edit the criteria for the test container as required.
Note: You can find the inherited SilkTest symbols on the Parameters tab in the Test Plan View. Inherited test definition attributes can be found on the Attributes tab in both Test Plan View and in the Settings module. Inherited SilkTest queries can be found on the Filters tab in the Settings unit.
Related Concepts SilkTest Test Plans Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Generating Test Plans from Requirements View Related Reference Test Plan Unit Interface
1. Click Test Plan on the workflow bar.
2. Right-click a node in the Test Plan tree menu where you want to have a linked test container appear.
3. Choose New Link if you want to link a test container at the hierarchy level of the selected node, or choose New Child Link to link a test container a hierarchy level below the selected node. The Select Test Container For Linking dialog box displays, where you can select the test container you want to link to the selected node.
4. Click OK to confirm your selection.
Note: If the target test container and the container to link have different Source Control values, a confirmation dialog box displays, asking you if you really want to create the link. Linking a test container with differing source control values can lead to problems when downloading or executing a test definition within the linked container. Click No if you want to change the Custom include directory of the target or of the linked container first, or click Yes to create the link anyway.
The linked container is placed within the selected container as a read-only entity. Any changes to the original test container are reflected in the linked container.
Related Concepts Test Plan Tree Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Related Reference Test Plan Toolbar Functions
1. Click Test Plan on the workflow bar.
2. Click New Test Container on the toolbar (or right-click within the tree menu and choose New Test Container). A new container root node is appended to the tree menu and the New Test Container dialog box displays.
3. Define a Name (or accept the default name) and a meaningful Description for the container.
Note: Test Manager supports HTML formatting and cutting and pasting of HTML content for Description fields.
4. Select any pre-defined Product that is to be associated with this test container from the list box. See SilkCentral Administration Module Help for details regarding adding product profiles.
5. To configure source control settings for this container, select a pre-defined source-control profile from the Source Control profile list box.
Note: Defining source control profiles allows you to define where Test Manager's execution servers should retrieve program sources for test execution. See SilkCentral Administration Module Help for details regarding source control settings.
6. Check the Clear working folder before each test execution check box to have the source control profile working folder cleared before each test execution is performed (that is, the sources are checked out before each execution). This check box is not checked by default.
Note: If you use an external source control system, consider that using the Clear working folder before each test execution option in conjunction with MS VSS can lead to longer wait times than when used in conjunction with CVS or Subversion.
7. To specify the default root path where the container is to be saved, click Browse... and navigate to the location.
Note: The Custom Data Directory and Custom Include Directory fields facilitate the integration of Test Manager with functionality available with SilkPerformer 7.1 or higher. In SilkPerformer, the Include directory is divided into a System Include directory and a Custom Include directory; the Data directory is divided into a System Data directory and a Custom Data directory. See the SilkPerformer documentation for details.
8. The Hidden Test Properties portion of the dialog box allows you to specify the test property types that are to be displayed on the test container's Properties tab (and the Properties tab of all test folders within the container). These settings do not affect the display of individual test definitions. To adjust hidden test property settings, click the Edit button associated with the Hidden Test Properties field. On the Hidden Test Properties dialog box, uncheck the check boxes of all test types for which you want to have properties displayed (SilkPerformer, SilkTest, NUnit, Windows Scripting, JUnit, and .NET Explorer). Click OK to save your settings.
9. Check the Use SilkTest interface to launch tests check box to specify that the SilkTest interface be used to open SilkTest in the execution of tests (rather than the command line).
Note: This setting supports execution of tests created with Test Manager 8.0 or higher. When a Test Manager installation is updated to version 8.0 or higher, this check box is not checked for existing test containers. When a new test container is created, the check box is checked by default. It is not recommended that you check this option for test definitions created with versions of Test Manager earlier than 8.0.
Click Save to save your settings and update the tree view with the new container.
Related Concepts Test Plan Tree Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Related Reference Test Plan Toolbar Functions HTML Support for Description Text Boxes
1. Click Test Plan on the workflow bar.
2. Select an existing container or folder node in the Test Plan tree menu where you want to insert a new test folder.
3. Click New Test Folder on the toolbar (or right-click within the tree and choose New Test Folder). A new folder node is appended to the tree view and the New Test Folder dialog box displays.
4. Provide a name and meaningful description for the folder.
Note: Test Manager supports HTML formatting and cutting and pasting of HTML content for Description fields.
Click OK to save your settings and update the tree view with the new test folder.
Related Concepts Test Plan Tree Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Related Reference Test Plan Toolbar Functions HTML Support for Description Text Boxes
1. Click Test Plan on the workflow bar.
2. Select the test plan element (container, folder, or test definition) in the tree view to which the edit is to be applied.
3. Click the appropriate toolbar button:
- Delete: Deletes the selected element from the tree.
- Cut: Cuts the selected element from the tree and moves it to the clipboard.
- Copy: Copies the selected element to the clipboard (containers cannot be copied).
- Paste: Pastes a copy of the element held on the clipboard to the same level as the currently selected element (containers cannot be pasted).
Note: These commands are also available through context menus in the Test Plan tree.
To move an element to another project, cut or copy the element to the clipboard, select the destination project through Test Manager/Projects, select the destination container and/or folder, and click Paste.
You can easily reorder test containers, folders, and test definitions that are listed in the Test Plan tree using the up and down arrow buttons on the toolbar:
1. Select the test definition that you want to move up or down in the tree.
2. Click the up arrow to move the test definition up one step in the Test Plan tree, or click the down arrow to move it down one step.
Note: This process also applies to changing the order of listed test containers and test folders.
Related Concepts Test Plan Tree Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Related Reference Test Plan Toolbar Functions HTML Support for Description Text Boxes Multi-Select Functionality for Test Plan Elements Test Plan Contents Tab
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select a test-container node, test-folder node, or SilkTest test-definition node.
4. Select the Properties tab.
5. In the SilkTest Test Properties area, click Edit (or click Edit on the toolbar) and proceed with the Creating Test Definitions procedure.
Related Concepts SilkTest Test Definitions SilkTest Tests Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Editing Test Definitions Managing Test Plans Related Reference Test Plan Unit Interface
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select a test-container node, test-folder node, or SilkPerformer test-definition node.
4. Select the Properties tab.
5. In the SilkPerformer Test Properties area, click Edit (or click Edit on the toolbar) and proceed with the Creating Test Definitions procedure.
Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Working with SilkPerformer Projects Editing Test Definitions Managing Test Plans Related Reference Test Plan Unit Interface
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select a test-container node, test-folder node, or JUnit test-definition node.
4. Select the Properties tab.
5. In the JUnit Test Properties area, click Edit (or click Edit on the toolbar), and proceed with the Creating Test Definitions procedure.
Note: Manual test types do not have properties associated with them.
Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Editing Test Definitions Managing Test Plans Creating Test Definitions Related Reference Test Plan Unit Interface
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select a test-container node, test-folder node, or NUnit test-definition node.
4. Select the Properties tab.
5. In the NUnit Test Properties area, click Edit (or click Edit on the toolbar) and proceed with the Creating Test Definitions procedure.
Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Editing Test Definitions Managing Test Plans Related Reference Test Plan Unit Interface
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select a test-container node, test-folder node, or test-definition node.
4. Select the Properties tab.
5. Click Edit to display the Edit Success Conditions dialog box.
6. Uncheck the Inherit from parent check box of any success condition you are editing.
7. Edit values as required.
8. Specify whether conditions should be active or inactive by checking or unchecking their Active check boxes.
9. Click OK to save your settings.
Related Concepts Test Plan Tree Test Plan Management Success Conditions Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Related Reference Test Plan Properties tab
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select a test-container node, test-folder node, or WSH test-definition node.
4. Select the Properties tab in Test Plan View.
5. In the WSH Test Properties area, click Edit (or click Edit on the toolbar) and proceed with the Creating Test Definitions procedure.
Related Concepts Windows Script Host Tests Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Editing Test Definitions Managing Test Plans Related Reference Test Plan Unit Interface
1. Click Test Plan on the workflow bar.
2. Click Find (binoculars icon) on the toolbar to open the Find dialog box.
3. From the Category list box, select the functional category or Test Manager plug-in across which you want to search:
- General test definition properties
- Manual steps
- Test definition parameters
- Test definition attributes
- SilkTest test properties
- SilkPerformer test properties
- .NET Explorer test properties
- JUnit test properties
- NUnit test properties
- Windows Scripting test properties
- Custom plug-in properties
4. From the Find in list box, specify the property within which the query should search for the value. The properties available in this list vary based on the selected category.
5. In the Find what portion of the dialog box, enter an alphanumeric string to be submitted for the query. Optional settings are available for qualifying the query further. Check the check boxes of those that are appropriate:
- Start from selection: Specifies that the search begin from the currently selected test plan element.
- Start from top: Specifies that the search begin from the root of the Test Plan tree.
- Find in subtree only: Specifies that the search only be run in the currently selected segment of the Test Plan tree (the portion of the Test Plan tree that is available on the Contents tab).
- Case sensitive: Specifies that the string be searched case-sensitively.
- Match whole word only: Specifies that search results only include complete standalone instances of the query string.
424
Include read-only values: Specifies that search results include text strings that can not be directly
edited because they are inherited from another test definition, referenced from a linked test container, or called from a data source in the course of data-driven testing. Note: When using a case sensitive SQL Server, case-insensitive searching is not supported for the following fields: test definition description, manual step description, manual step action description, and manual step expected results.
6 7
Click Find to begin the search and advance to the first test plan element returned by the query (test container, test folder, or test definition). If your query returns multiple test plan elements, you will be presented with the option to advance through the elements using the following buttons on the Find menu:
Next: Advances the view to the next returned element. Previous: Advances the view to the last viewed element. First: Advances the view to the first returned element. Last: Advances the view to the last returned element. New Find: Cancels the current search and returns the view to the Find dialog box. Close: Closes the Find dialog box.
Note: The Find command allows you to search test plan elements where the search string is an inherited value. This option is not allowed with the Replace command.
1. Click Test Plan on the workflow bar.
2. Click Replace on the toolbar to open the Replace dialog box.
3. From the Category list box, select the functional category or Test Manager plug-in across which you want to search:
General test definition properties
Manual steps
Test definition parameters
Test definition attributes
SilkTest test properties
SilkPerformer test properties
.NET Explorer test properties
JUnit test properties
NUnit test properties
Windows Scripting test properties
Custom plug-in properties
4. From the Find in list box, specify the property within which the query should search for the value. The properties available in this list vary based on the selected category.
5. In the Find what portion of the dialog box, enter an alphanumeric string to be submitted for the query. Optional settings are available for qualifying the query further. Check the check boxes of those that are appropriate:
Start from selection: Specifies that the search begin from the currently selected test plan element.
Start from top: Specifies that the search begin from the root of the Test Plan tree.
Find in subtree only: Specifies that the search only be run in the currently selected segment of the Test Plan tree (the portion of the Test Plan tree that is available on the Contents tab).
Case sensitive: Specifies that the string be searched case-sensitively.
Match whole word only: Specifies that search results only include complete standalone instances of the query string.
Note: When using a case-sensitive SQL Server, case-insensitive find/replace is not supported for the following fields: test definition description, manual step description, manual step action description, and manual step expected results.
6. In the Replace with text box, enter the alphanumeric string that is to replace instances of the queried string.
7. Click Find to begin the search and advance to the first test plan element returned by the query (test container, test folder, or test definition), or click Replace all to replace all instances of the queried string with the replacement string.
8. If you select Find and the query returns multiple test plan elements, you can advance through the elements using the following buttons on the Replace dialog box:
Find Next: Advances the view to the next returned element.
Find Previous: Returns the view to the last viewed element.
Replace: Replaces the displayed instance of the queried string with the replacement string.
Replace All: Replaces all instances of the queried string with the replacement string.
Close: Closes the Replace dialog box.
Warning: Data-driven settings and properties cannot be replaced. Related Concepts Test Plan Tree Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Related Reference Test Plan Toolbar Functions
1. Click Test Plan on the workflow bar.
2. Select the test container that you are editing.
3. Select the Properties tab in Test Plan View.
4. Beneath the container's property fields, click Edit to open the Edit Test Container dialog box. You can also click Edit on the toolbar to open the Edit Test Container dialog box.
5. Edit the test container as required.
6. Click OK to accept your changes.
Related Concepts Test Plan Tree Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Adding Test Containers Managing Test Plans Related Reference Test Plan Toolbar Functions
1. Click Test Plan on the workflow bar.
2. Select the test folder that you want to edit.
3. Select the Properties tab in Test Plan View.
4. Beneath the folder Name/Description details, click Edit to open the Edit Test Folder dialog box. You can also click Edit on the toolbar to open the Edit Test Folder dialog box.
5. Edit the name and description of the folder as required.
6. Click OK to accept your changes.
Related Concepts Test Plan Tree Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Adding Test Folders Managing Test Plans Related Reference Test Plan Toolbar Functions
Set a Test Plan Node as Integration Default for External Agile Planning Tools
To use the Web service calls to create tests in Test Manager through an external agile planning tool, you have to set a folder or container in the test plan tree as the integration default node, where the Web service will create the test. If you do not specify the integration default node, an error message box displays.
1. Click Test Plan on the workflow bar.
2. Right-click the folder or container in the test plan tree that you want to set as the integration default node.
3. Choose Set as Integration Default.
Note: If an integration default node already exists, the default node is changed to the new node.
The integration default node is set to the selected node, enabling the agile planning tool to create tests at this location. Note: The integration default node is shown in the Properties page of the project, in which the node is located.
Related Concepts Test Plan Tree Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Related Reference Test Plan Toolbar Functions
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select the element for which you are deleting an attachment.
4. Select the Attachments tab to see a list of all attachments that are associated with the element.
5. Click the delete icon of the attachment you want to delete.
6. Click Yes on the confirmation dialog box to delete the attachment from the project.
Note: Only one attachment at a time can be deleted.
Related Concepts Attachments Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Working with Attachments Managing Test Plans Related Reference Test Plan Attachments tab
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select a container, folder, or test definition.
4. Select the Attachments tab.
5. Click Upload File to open the Upload File dialog box.
6. Using Browse, select a file from your local file system.
7. Enter a meaningful description for the attachment.
8. Click Upload File to upload the attachment to the server and associate it with the selected element.
Note: Attaching files to a test plan element may not work in Mozilla Firefox. Firefox requires usage of three slashes (for example: "file:///") for a file link, while other browsers require only two (for example: "file://"). Additionally, Firefox includes a security feature that blocks links from remote files to local files and directories. For more information, see http://kb.mozillazine.org/Firefox_:_Issues_:_Links_to_Local_Pages_Don't_Work
Related Concepts Attachments Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Working with Attachments Managing Test Plans Related Reference Test Plan Attachments tab
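The two-slash versus three-slash distinction the note describes can be seen with Python's standard-library URL parser. This is an illustrative sketch only; the file path is invented:

```python
from urllib.parse import urlparse

# With three slashes the authority (host) part is empty and the whole
# remainder is the local path -- the form Firefox expects for local files.
three = urlparse("file:///C:/attachments/readme.txt")
# three.netloc == ''            three.path == '/C:/attachments/readme.txt'

# With only two slashes the first segment is parsed as a host name, which
# is why a two-slash link can silently point at the wrong location.
two = urlparse("file://C:/attachments/readme.txt")
# two.netloc == 'C:'            two.path == '/attachments/readme.txt'
```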
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select a container, folder, or test definition.
4. Select the Attachments tab.
5. Click Attach Link to open the Attach URL dialog box.
6. Enter a URL in the URL field.
7. Enter a meaningful description for the attached link.
8. Click Attach URL to associate the link with the selected element.
Related Concepts Attachments Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Working with Attachments Managing Test Plans Related Reference Test Plan Attachments tab
1. Click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select the element for which you want to edit an attachment description.
4. Select the Attachments tab to see a list of all attachments that are associated with the element.
5. Click the edit icon of the attachment for which you want to edit the description.
6. Edit the description on the Edit File Attachment dialog box.
7. Click OK to save your changes.
Related Concepts Attachments Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Working with Attachments Managing Test Plans Related Reference Test Plan Attachments tab
1. From Test Plan View, select the element for which you want to view an attachment.
2. Select the Attachments tab to see a list of all attachments that are associated with the element.
Each attachment name serves as a link. File-attachment links open a Save As dialog box, enabling you to download the attachment to your local file system. Link attachments open their destinations directly in a new browser window.
Related Concepts Attachments Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Working with Attachments Managing Test Plans Related Reference Test Plan Attachments tab
1. Click Test Plan on the workflow bar.
2. Create a new data-driven test definition (select Manual as the test type and configure test steps).
Note: To view the values included in your data source, click the Data Set tab of your test definition.
3. Select the Steps tab of your test definition.
4. Click the Edit Test Step icon in the Actions column of the test step that is to reference the data source value.
5. In the Action description text box, enter a parameter that references the relevant column in your data source, using the syntax ${<column name>}. For example, if you want a test step to retrieve password parameters from a spreadsheet that has a column called Password, you would write the parameter as ${Password}. When you execute the manual test step, the parameter is replaced by an actual value from the corresponding data source.
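Conceptually, the substitution resembles Python's string.Template, which happens to use the same ${name} placeholder syntax. The step text, column name, and row value below are invented for illustration, not taken from a real data source:

```python
from string import Template

# Hypothetical step action text using the ${<column name>} syntax.
step_action = Template("Log in with the password ${Password}.")

# One row from the data source, keyed by column name (illustrative values).
data_row = {"Password": "secret123"}

# At execution time, the parameter is replaced by the row's actual value.
resolved = step_action.substitute(data_row)
print(resolved)  # Log in with the password secret123.
```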
Related Concepts Manual Tests Manual Test Definitions Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Creating Data-Driven Test Definitions Working with Manual Tests Working with Data-Driven Tests Working with Manual Tests Managing Test Plans Related Reference Test Plan Unit Interface
1. Click Test Plan on the workflow bar.
2. Create a new test definition. See the topic Creating a Test Definition for information about creating a test definition.
3. Select the newly created test definition's Properties tab.
4. Scroll down to the Data-driven Properties section of the Properties tab and select the Edit icon to open the Data-driven Properties dialog box.
5. Select a pre-configured data source from the Data Source list box. See the SilkCentral Administration Module documentation for information on configuring data sources.
6. Click Next to continue.
7. Select a data set from the Data Set list box (for Excel data sources, this is a worksheet name; for database data sources, this is a table name).
8. Check the Each data row is a single test definition check box to have each row in your data set treated as a separate test definition, or leave it unchecked to create a single test definition for all data rows of your data set.
9. (Optional) Enter a SQL query into the Filter query field to filter your data set based on a SQL-syntax query.
Note: Only simple WHERE clause queries are supported.
10. Check the Enable data-driven properties check box to enable data-driven functionality.
11. Click Finish to save your settings.
Note: Data-driven property settings are visible in the lower portion of each test definition's Properties tab. To use Test Manager's data-driven test functionality with SilkPerformer scripts, data sources with column names matching the corresponding SilkPerformer project attributes must be used in conjunction with "AttributeGet" methods.
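To illustrate what a simple WHERE-clause filter does to a data set, here is a hedged sketch that uses an in-memory SQLite table as a stand-in for a configured data source. The table name, columns, and rows are invented for illustration:

```python
import sqlite3

# In-memory stand-in for an Excel worksheet or database table data set.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Credentials (Username TEXT, Password TEXT, Active INTEGER)")
conn.executemany(
    "INSERT INTO Credentials VALUES (?, ?, ?)",
    [("alice", "a1", 1), ("bob", "b2", 0), ("carol", "c3", 1)],
)

# A simple WHERE-clause expression, the kind the Filter query field expects.
filter_query = "Active = 1"
rows = conn.execute(f"SELECT Username FROM Credentials WHERE {filter_query}").fetchall()
print(rows)  # [('alice',), ('carol',)]
```

Only the rows matching the filter expression would then drive test runs.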
Related Concepts Manual Tests SilkTest Test Definitions Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Working with Data-Driven Tests Managing Test Plans Related Reference Test Plan Data Set tab
1. Click Test Plan on the workflow bar.
2. Select a test definition that relies on the data source from which you want to download data.
3. Select the Properties tab.
4. Click the Download button (in the Actions column) of either the data source or the data set, depending on which entity contains the data you want to download.
5. Specify the location on your local system to which the data is to be downloaded.
6. Click OK to download the data in CSV format.
Related Concepts Manual Tests Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Creating Data-Driven Test Definitions Working with Data-Driven Tests Managing Test Plans Related Reference Test Plan Data Set tab
1. Click Test Plan on the workflow bar.
2. Select the test definition that has the property you want to edit.
3. Select the Properties tab.
4. Select the Edit icon that corresponds to the property you are editing (in the Actions column).
5. Edit the property as required.
6. Click OK to save your changes.
Related Concepts SilkTest Test Definitions Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Creating Data-Driven Test Definitions Working with Data-Driven Tests Managing Test Plans Related Reference Test Plan Properties tab
1. Click Test Plan on the workflow bar.
2. Right-click a manual test definition in the Test Plan tree and select Automate with... .
3. Select one of the following test types from the list:
SilkTest Test
SilkPerformer Test
NUnit Test
Windows Scripting Test
JUnit Test
SilkTest Plan
.NET Explorer Test
ProcessExecutor Test
Depending on the test type you select, the appropriate properties dialog box opens.
4. Proceed to the appropriate topic in Help for information on filling out the dialog box:
If you are converting to a SilkTest test, proceed to Configuring a SilkTest Test.
If you are converting to a SilkPerformer test, proceed to Configuring a SilkPerformer Test.
If you are converting to an NUnit test, proceed to Configuring an NUnit Test.
If you are converting to a Windows scripting test, proceed to Configuring a Windows Scripting Test.
If you are converting to a JUnit test, proceed to Configuring a JUnit Test.
If you are converting to a SilkTest plan test, proceed to Configuring a SilkTest Plan Test.
If you are converting to a .NET Explorer test, proceed to Configuring a .NET Explorer Test.
Related Procedures Managing Test Plans - Quick Start Task Configuring SilkTest Test Properties Configuring SilkPerformer Test Properties Configuring NUnit Test Properties Configuring Windows Scripting Test Properties Configuring JUnit Test Properties Configuring SilkTest Plan Properties Configuring .Net Explorer Test Properties
1. Within Test Manager, click Test Plan on the workflow bar.
2. Click Test Plan View on the toolbar.
3. Select the node of the manual test for which you are editing a test step.
4. Select the Steps tab.
5. Do one of the following to open the Edit Manual Test Definition Step dialog box:
Press F2.
Press ALT and double-click the step you want to edit.
In the Actions column of the step you want to edit, click Edit Test Step.
6. Edit step details as required.
Note: Values from data sources can be inserted into manual test steps in the form of parameters. Test Manager supports HTML formatting and cutting and pasting of HTML content for Description fields.
Related Concepts Manual Tests Test Definitions in the Manual Testing Client Test Plan Management Manual Testing Client Related Procedures Managing Test Plans - Quick Start Task Creating Test Definitions Adding a Data Source Value to a Manual Test Step Working with Manual Tests Managing Test Plans Related Reference Current Run Page HTML Support for Description Text Boxes Multi-Select Functionality for Test Plan Elements
1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Select the test definitions you want to assign to your execution definition, using the multi-select feature of Grid View.
4. Right-click the test definitions and choose Create Execution Definition. The New Execution Definition dialog box displays.
5. Enter the specifications of your new execution definition.
Note: All selected test definitions must be in the same container. If they are not, the execution definition is not created and an error message displays.
Note: The test container is preselected in the New Execution Definition dialog box and cannot be altered.
Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Working With Test Definitions in Grid View Working with Filters Adding Execution Definitions Related Reference Test Plan Grid View Test Plan Unit Interface
1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Right-click a column header.
4. Expand the Columns submenus to view all the columns that are available in the project.
5. Check the check boxes of all the columns you want to have displayed in Grid View.
Your column-display preferences will be saved and displayed each time you open the active project.
Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Working With Test Definitions in Grid View Working with Filters Related Reference Test Plan Grid View Test Plan Unit Interface
1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Right-click the header of the text-based column that the filter is to be based on.
4. Expand the Filters submenu on the context menu to display the Filters text box.
5. Enter a text string into the text box.
6. Press ENTER. All test definitions that match the filter criteria (for example, in the case of test definition names, all test-definition names that include the specified string) are then dynamically displayed in the filtered list.

1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Right-click the header of the date-based column that the filter is to be based on.
4. Hold your cursor over Filter on the context menu to display the Before, After, and On submenu.
5. Hold your cursor over After to define a date before which (and including) all test definitions should be excluded, hold your cursor over Before to define a date after which (and including) all test definitions should be excluded, or hold your cursor over On to exclude all test definitions except those that have the specified date. The calendar tool displays.
6. Select a date using the calendar tool (or click Today to specify today's date).
Tip: You must explicitly click a date on the calendar tool or press ENTER to activate date-based filtering changes.
All test definitions that match the filter criteria are then dynamically displayed in the filtered list.
1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Right-click the header of the number-based column that the filter is to be based on.
4. Expand the Filters submenu on the context menu to display the > (greater than), < (less than), and = (equals) operators.
5. Enter a number in the > text box to define a number less than which (and including) all test definitions should be excluded, enter a number in the < text box to define a number greater than which (and including) all test definitions should be excluded, or enter a number in the = text box to exclude all test definitions except those that have the specified number.
6. Press ENTER. All test definitions that match the filter criteria are then dynamically displayed in the filtered list.
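The three numeric operators described above behave, conceptually, like simple list filters. The following sketch is illustrative; the "durations" column and its values are invented:

```python
# Hypothetical run durations (seconds) standing in for a numeric column.
durations = [10, 25, 25, 40, 60]

greater_than = [d for d in durations if d > 25]   # '>' filter: 25 and below excluded
less_than    = [d for d in durations if d < 25]   # '<' filter: 25 and above excluded
equals       = [d for d in durations if d == 25]  # '=' filter: only exact matches kept

print(greater_than, less_than, equals)  # [40, 60] [10] [25, 25]
```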
1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Right-click the header of the Boolean-based column that the filter is to be based on.
4. Expand the Filters submenu on the context menu to display the available values.
5. Click one of the Yes or No option buttons. All test definitions that match the filter criteria are then dynamically displayed in the filtered list.
1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Right-click the header of the column that has a predefined filter value (for example, NodeType) that the filter is to be based on.
4. Expand the Filters submenu on the context menu to display the available values.
5. Check the check boxes of the filter values that you are interested in. All test definitions having one of the selected criteria will be displayed.
Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Working With Test Definitions in Grid View Working with Filters Related Reference Test Plan Grid View Test Plan Unit Interface
1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Right-click the header of the column that the grouping is to be based on.
4. Select Group by This Field. Test definitions are then organized into groups based on commonly shared values within the column you have selected.

1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Right-click any column.
4. Uncheck the Show in Groups check box.
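Conceptually, Group by This Field collects rows that share a value in the chosen column. A hedged sketch using Python's itertools.groupby; the rows and the "Type" column are invented for illustration:

```python
from itertools import groupby

# Hypothetical grid rows; "Type" stands in for the grouping column.
rows = [
    {"Name": "Login",    "Type": "Manual"},
    {"Name": "Checkout", "Type": "JUnit"},
    {"Name": "Search",   "Type": "Manual"},
]

# Like SQL GROUP BY, itertools.groupby needs the rows sorted by the key first.
key = lambda row: row["Type"]
groups = {t: [r["Name"] for r in g] for t, g in groupby(sorted(rows, key=key), key=key)}
print(groups)  # {'JUnit': ['Checkout'], 'Manual': ['Login', 'Search']}
```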
Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Working With Test Definitions in Grid View Working with Filters Related Reference Test Plan Grid View Test Plan Unit Interface
1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Right-click a test definition row.
4. Select Go to test to advance to the node of the test definition in Test Plan View.
Note: Alternatively, you can click a test's ID link in Grid View to advance to the associated test definition.
Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Working With Test Definitions in Grid View Working with Filters Related Reference Test Plan Grid View Test Plan Unit Interface
1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
Note: You can identify filtered columns by their titles, which are displayed in bold, italic text.
3. Right-click the header of the column that has the filter you want to remove.
4. Uncheck the Filters check box.

1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Right-click any column header.
4. Select Reset Filters.
Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Working With Test Definitions in Grid View Working with Filters Creating a Filter for a Folder or Container Related Reference Test Plan Grid View Test Plan Unit Interface
1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Select the column header of the column you want to move.
4. Drag the column to the desired position and release it.
Your column-order preferences will be saved and displayed each time you open the active project.
Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Working With Test Definitions in Grid View Working with Filters Related Reference Test Plan Grid View Test Plan Unit Interface
1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Select the vertical column-header divider of the column you want to adjust.
4. Drag the column boundary to the desired position and release it.
Your column-width preferences will be saved and displayed each time you open the active project.
Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Working With Test Definitions in Grid View Working with Filters Related Reference Test Plan Grid View Test Plan Unit Interface
1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Right-click any column header.
4. Select Reset View.
Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Working With Test Definitions in Grid View Working with Filters Related Reference Test Plan Grid View Test Plan Unit Interface
1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar to display Grid View.
3. Right-click the header of the column you want the test definitions to be sorted by.
4. Select Sort Ascending to have the test definitions sorted in ascending order (or select Sort Descending to have the test definitions sorted in descending order).
Your sort preferences will be saved and displayed each time you open the active project.
Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Working With Test Definitions in Grid View Working with Filters Related Reference Test Plan Grid View Test Plan Unit Interface
1. Click Test Plan on the workflow bar.
2. Select Document View or Test Plan View from the toolbar.
3. Right-click the folder or container you want to filter and choose Filter Subtree.
Note: To remove filtering and display all elements, select <No Filter> from the Filter list box on the toolbar. Note: Empty folders are not shown in the filtered subtree. Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Related Reference Test Plan Toolbar Functions Test Plan Unit Interface
1. Click Test Plan on the workflow bar.
2. Right-click within the Test Plan tree.
3. Select a collapse or expand option.
Related Concepts Test Plan Tree Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Creating Test Definitions Managing Test Plans Related Reference Test Plan Unit Interface
1. Click Test Plan on the workflow bar.
2. Select a container, folder, or test definition in the test plan tree.
3. Click Test Plan View on the toolbar.
4. Select the History tab. The properties of all elements are then displayed in tabular format.
Related Concepts Recent Changes Change-Notification Emails Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Managing Test Plans Related Reference Requirement History tab Test Plan Toolbar Functions Test Plan Unit Interface
1. Click Test Plan on the workflow bar.
2. Select the project node or a test container node in the Test Plan tree.
3. Click Update Execution on the toolbar.
Note: Alternatively, you can right-click a test container node or the project node and select Update Executions.
Related Concepts Execution Definitions Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Managing Test Plans Related Reference Test Plan Toolbar Functions Test Plan Assigned Executions tab
1. Go to Help > Tools.
2. Click the Upload Manager link and save SetupUploadManager.exe to your local system using your browser's download dialog.
3. Double-click SetupUploadManager.exe to start the InstallShield Wizard for Upload Manager.
4. Follow the InstallShield Wizard's prompts, entering your name, company name, and target destination for the installation.
5. Click Finish to complete the installation.
1. Start Upload Manager by double-clicking the application's executable file (UploadManager.exe). The Select Target Location dialog box displays.
2. Select the option to upload the file to the SilkCentral server file pool.
3. Click Next.
4. Click Add (unless the file you want to upload is already visible in the Filename field) to browse to and select the file that you are uploading.
Note: It is not possible to add file descriptions when uploading files to the file pool.
5. When the file that you want to upload displays in the Filename field, click Next.
6. Enter the connection parameters for your SilkCentral installation, beginning with the Hostname of the computer that hosts your Test Manager installation (note that the name should not include a protocol designation).
7. Enter the installation's Port.
8. Check the Secure check box if the connection is a secure connection (such as HTTPS).
9. Enter the Username and Password login credentials that are required for your SilkCentral server.
10. (Optional) Click Set as Default to have these parameters presented to you automatically the next time you run Upload Manager.
11. (Optional) Click Internet Options to configure Internet settings for your connection (for example, proxy server settings).
12. Click Next.
13. Verify all of the information you have entered in the Upload files field. Click Back if you need to make any changes on a previous page.
14. Accept the default setting for the Close this window when the upload is complete check box.
15. Click Finish to begin the upload process. When the upload is complete, Upload Manager will close and the uploaded file will be available in the SilkCentral file pool.
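The Hostname, Port, and Secure settings combine into a server URL roughly as follows. This is an illustrative sketch only; the helper function, host name, and port number are invented, not values from this document:

```python
# Hypothetical helper: the host name carries no protocol designation;
# the Secure check box selects https over http.
def build_server_url(hostname: str, port: int, secure: bool) -> str:
    scheme = "https" if secure else "http"
    return f"{scheme}://{hostname}:{port}"

print(build_server_url("scserver.example.com", 8080, False))
# http://scserver.example.com:8080
print(build_server_url("scserver.example.com", 443, True))
# https://scserver.example.com:443
```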
Related Concepts Upload Manager Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Creating New Issues Creating Test Definitions Managing Test Plans
Click Test Plan on the workflow bar. Select the test plan for which you want to view the assigned executions. Select the Assigned Executions tab to view the complete list of executions that are assigned to the selected test plan.
Related Concepts Execution Definitions Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Managing Test Plans Related Reference Test Plan Assigned Executions tab
1. Click Test Plan (or Requirements) on the workflow bar.
2. Click Show Changes to filter out all requirements, test definitions, folders, and containers except those that have been changed since your last change acknowledgement (the recent changes filter is selected automatically in the Filter list box). Recent-change filtering is active across the Test Plan View tabs and Document View.
Note: Once the recent changes filter has been activated, clicking Show Changes again toggles Show All mode, removing the filtering so that all test definitions are visible in the tree view.
3. When you have reviewed the changes, you can accept them by clicking Acknowledge. The acknowledge function resets the recent changes filter.
Note: All test-plan changes generate time-stamped entries in the test plan history.
Related Concepts Recent Changes Change-Notification Emails Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Related Reference Requirement History tab Test Plan Toolbar Functions Test Plan Unit Interface
Click Execution on the workflow bar. Select an execution definition in the Execution tree view. Click the Runs tab. Select the execution definition run. The test definition section of the Runs page lists the test definition runs. Click on the Run ID of the test definition. The Test Definition Run Results dialog box displays. On the Details page, click Change Status to open the Change Status dialog box. Select the new status for the test definition run from the New Status list box. Type an explanation for the manual status change in the Comment text box. Note that inserting a comment is mandatory. Click OK to confirm the status change.
Note: Status changes produce history changes. To view the history of all status changes for the test definition execution run, click the Messages tab in the Test Definition Run Results dialog box. Related Concepts Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Creating Test Definitions Working with Data-Driven Tests Executing Test Definitions Related Reference Test Definition Run Results Dialog Execution Runs Tab
Click Execution on the workflow bar. Select an execution definition in the Execution tree view. Select the Runs tab. Click Delete (in the Actions column) of the execution run for which you want to delete results. Click Yes on the subsequent confirmation dialog box to complete the deletion.
Note: To delete some or all of the results of an entire execution definition, click Delete Results in the lower pane. Related Concepts Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Creating Test Definitions Deleting the Results of an Execution Definition Working with Data-Driven Tests Executing Test Definitions Related Reference Test Definition Run Results Dialog Execution Runs Tab
Click Execution on the workflow bar. Select an execution definition in the Execution tree view. Select the Runs tab. Click Delete Results in the lower pane (note that this button is only available when results are available for deletion). The Delete Results dialog box displays. Specify which results you want to delete:

- All runs except the last run deletes the results of all execution definition runs except the results of the most recent run.
- All runs within the time span allows you to define a specific time span during which run results are to be deleted. With this option selected, click the calendar tool.
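The two deletion options amount to simple selections over the set of stored runs. A hedged Python sketch of the selection logic as described (illustrative only; the run tuples and mode names are invented for the example):

```python
from datetime import datetime

def runs_to_delete(runs, mode, span=None):
    """Select execution definition runs whose results should be deleted.

    runs: list of (run_id, finished_at) tuples (hypothetical shape)
    mode: "all_except_last" keeps only the most recent run's results;
          "time_span" deletes results finished inside span=(start, end).
    """
    if mode == "all_except_last":
        latest = max(runs, key=lambda run: run[1])
        return [run for run in runs if run is not latest]
    if mode == "time_span":
        start, end = span
        return [run for run in runs if start <= run[1] <= end]
    raise ValueError(f"unknown mode: {mode}")

runs = [
    (101, datetime(2009, 6, 1, 8, 0)),
    (102, datetime(2009, 6, 2, 8, 0)),
    (103, datetime(2009, 6, 3, 8, 0)),
]
runs_to_delete(runs, "all_except_last")   # deletes runs 101 and 102
runs_to_delete(runs, "time_span",
               (datetime(2009, 6, 2), datetime(2009, 6, 4)))  # deletes 102 and 103
```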
Note: To delete the results of an individual execution run instead, use the icon in that run's Actions column. Related Concepts Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Creating Test Definitions Deleting Individual Test Run Results Working with Data-Driven Tests Executing Test Definitions Related Reference Test Definition Run Results Dialog Execution Runs Tab
Click Execution on the workflow bar. Select an execution definition in the Execution tree view. Select the Runs tab. Click the Run ID of the execution for which you want to see details. Detailed information about the results of the execution definition is displayed.
Related Concepts Test Definition Execution Execution Definition Run Results Dialog Related Procedures Managing Test Executions - Quick Start Task Creating Test Definitions Working with Data-Driven Tests Executing Test Definitions Related Reference Test Definition Run Results Dialog Execution Runs Tab
Click Execution on the workflow bar. Select an execution definition. Select the Assigned Test Definitions tab. In the Actions column of a test definition, click the locate icon to find out in which test folder or container the test definition is stored.
The corresponding parent folder is then expanded and the assigned test definition is highlighted in blue.
Related Concepts Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Executing Test Definitions Related Reference Execution Assigned Test Definitions Tab
Click Execution on the workflow bar. Select an execution definition. Select the Assigned Test Definitions tab. In the Actions column, click the delete button (resembles an X mark) of the assigned test definition you are deleting. Repeat this step for all assignments that you want to delete. Tip: To remove all assigned test definitions, click Remove All.
Click Apply to save the assigned test-definition list. Note: If you do not click Apply, changes you make to the Assigned Test Definitions will be lost.
Related Concepts Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Executing Test Definitions Related Reference Execution Assigned Test Definitions Tab
- Test Definition Name
- Test Definition Status
- Last Execution of the test definition
To assign one or more test definitions from the test plan Grid View to one or more execution definitions:

1. Click Test Plan on the workflow bar.
2. Click Grid View on the toolbar.
3. Select the test definitions you want to assign to execution definitions. You can use your keyboard's Ctrl and Shift keys to select multiple test definitions using standard browser multi-select functions.
4. Right-click the selected test definitions and choose Save Selection.
5. Click Execution on the workflow bar.
6. Select the execution definition to which you want to assign the selected test definitions.
7. Choose Assigned Test Definitions.
8. Click Assign Saved Selection.

Note: Only test definitions that reside in the execution definition's test container are inserted. You can insert the selected test definitions into more than one execution definition. You cannot insert them into requirements in a different project. The selection persists until you make a different selection or close Test Manager.
Related Concepts Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Executing Test Definitions Using a Filter to Assign Test Definitions to Execution Definitions Manually Assigning Test Definitions to Execution Definitions Related Reference Execution Assigned Test Definitions Tab
Click Test Plan on the workflow bar. Click Grid View on the toolbar to display Grid View. Select the test definitions you want to assign to your execution definition, by using the multi-select feature of the Grid View. Right-click the test definitions and choose Create Execution Definition.
The New Execution Definition dialog box displays. Enter the specifications of your new execution definition. Note: All selected test definitions must be in the same container. If not, the execution definition is not created and an error message displays. Note: The test container is preselected in the New Execution Definition dialog box and cannot be altered. Related Concepts Test Plan Management Related Procedures Managing Test Plans - Quick Start Task Managing Test Plans Working With Test Definitions in Grid View Working with Filters Adding Execution Definitions Related Reference Test Plan Grid View Test Plan Unit Interface
- Test Definition Name
- Test Definition Status
- Last Execution of the test definition
Click Execution on the workflow bar. Select the execution definition to which you are assigning test definitions. Select the Assigned Test Definitions tab. All test definitions of the test container which is associated with the selected execution are displayed in the Available Test Definitions window. Click the assign arrow of any test definition that you want to assign to the currently selected execution definition. Clicking the assign arrow of a folder or the top-level container assigns all child test definitions of that parent to the selected execution definition. Click Apply to save the assigned test-definition list. Note: If you do not click Apply, changes you make to the Assigned Test Definitions will be lost.
Related Concepts Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Executing Test Definitions Assign Test Definitions from Grid View to Execution Definitions Using a Filter to Assign Test Definitions to Execution Definitions Related Reference Execution Assigned Test Definitions Tab
- Test Definition Name
- Test Definition Status
- Last Execution of the test definition
Create a filter in the Test Plan unit. Refer to the Creating Filters procedure for details. If the filter already exists, skip this step. Click Execution on the workflow bar. Select the execution definition to which you are assigning test definitions. Select the Assigned Test Definitions tab. All test definitions of the test container which is associated with the selected execution are displayed in the Available Test Definitions window. Select By Filter from the test definition assignment types. Choose the filter from the list box. Click Apply to save the assigned test-definition list. Note: If you do not click Apply, changes you make to the Assigned Test Definitions will be lost.
If you assign test definitions to an execution definition in Test Plan Grid View, the test definition assignment type is automatically set to Manual, but the previously filtered test definitions remain in the Assigned Test Definitions tab.
Related Concepts Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Executing Test Definitions Creating Filters Assign Test Definitions from Grid View to Execution Definitions Manually Assigning Test Definitions to Execution Definitions Related Reference Execution Assigned Test Definitions Tab
Click Execution on the workflow bar. Select the execution definition for which you want to assign the SilkTest AUT host. Select the Deployment tab. Click Edit in the SilkTest AUT Hostname area of the GUI. The Edit SilkTest AUT Hostname dialog box displays. In the Hostname field, type the name of the computer where the SilkTest agent runs. Proper configuration of option files is required. See SilkTest documentation regarding the command-line option -m for details. Click OK to add the SilkTest AUT host to the selected execution definition.
Related Concepts Specifying Agent Under Test (AUT) SilkTest Tests Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Executing Test Definitions Related Reference Execution Deployment tab
Click Execution on the workflow bar. Select the execution definition for which you are removing a tester assignment. Select the Deployment tab. Click Edit in the Manual Testers area. The Manual Testers dialog box displays. All testers that have been assigned to the selected execution definition are listed in the Selected column. Select the name of the assigned user that you want to remove and click Remove to remove the user from the Selected list; or click Remove All to remove all tester assignments for the execution definition.
Related Concepts Test Definition Execution Manual Test Definitions Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions Related Reference Execution Deployment tab Current Run Page
Click Execution on the workflow bar. Select the execution definition for which you are assigning a tester. Select the Deployment tab. Click Edit in the Manual Testers area. The Manual Testers dialog box displays. In the Available column, select the User Group Name of which the tester is a member. The available list is populated with all members of the user group. Select the name of the user you want to assign as a manual tester and click Add to add the user to the Selected list; or click Add All to add all of the group's members as testers.
Related Concepts Manual Test Definitions Test Definition Execution Tour of the Manual Testing Client UI Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions Related Reference Execution Deployment tab Current Run Page
Click Execution on the workflow bar. Select an execution definition in the Execution tree. Select the Deployment tab. Click Edit in the Execution Environment portion of the page. The Assign Keywords dialog box displays. All keywords that have been defined for your execution environment are listed here. Note: For automated execution definitions, the default reserved keywords for each execution server (#<execution name>@<location name>) are included in the list.
Select keywords in the Select keywords list that reflect your execution environment requirements. You can use your keyboard's CTRL and SHIFT keys to select multiple keywords using standard browser multi-select functions. Tip: The Select keywords field is auto-complete enabled. When you enter alphanumeric characters into this field, the field is dynamically updated with an existing keyword that matches the entered characters. Note that this field is disabled when multiple keywords are selected in the Select keywords or Assigned Keywords lists. For automated execution definitions, if you only have a few execution servers and do not require hardware provisioning, you can likely get by using only the default, reserved keywords that are created for each execution server. In such cases, it is not necessary that you select additional keywords.
Click Add (>) to move the keyword into the Assigned Keywords list. Note: For automated execution definitions, the execution servers that match the assigned keywords are listed below in the dynamically updated Matching execution servers list. This list updates each time you add or remove a keyword. Click the name of an execution server in the list to access that execution server in Administration → Locations.
Click OK to save the keywords and close the Assign Keywords dialog box.
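Conceptually, an execution server "matches" when its keyword set covers every keyword assigned to the execution definition, and keywords compare case-insensitively. The following Python sketch models that rule (this is not Test Manager's implementation; the server names and keyword sets are made up, and each server's reserved #<execution name>@<location name> keyword is shown as an ordinary set member):

```python
def matching_execution_servers(servers, assigned_keywords):
    """A server matches when its keyword set contains every keyword
    assigned to the execution definition (case-insensitive)."""
    required = {kw.lower() for kw in assigned_keywords}
    return [name for name, keywords in servers.items()
            if required <= {kw.lower() for kw in keywords}]

# Hypothetical servers; the first keyword of each is its reserved
# default keyword in the documented #<execution name>@<location name> form.
servers = {
    "exec1@Local":  {"#exec1@local", "windows", "vista"},
    "exec2@Remote": {"#exec2@remote", "linux"},
}
matching_execution_servers(servers, ["Windows", "Vista"])  # → ["exec1@Local"]
```

Adding or removing a keyword simply re-runs this subset test, which is why the Matching execution servers list updates on every change.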
Related Concepts VMware Lab Manager Virtual Configurations Execution Definitions Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Configuring Deployment Environments Executing Test Definitions Creating New Keywords Removing Keywords from Execution Definitions Related Reference Execution Deployment tab
Click Execution on the workflow bar. Select an execution definition in the Execution tree. Select the Deployment tab. Click Edit in the Execution Environment portion of the page. The Assign Keywords dialog box displays. All keywords that have been defined for your execution environment are listed here. Note: For automated execution definitions, the default reserved keywords for each execution server (#<execution name>@<location name>) are included in the list.
On the Assign Keywords dialog box, enter an alphanumeric keyword into the Keyword field that describes the required environment for the execution definition (for example, platform, operating system, and pre-installed applications). The following characters cannot be used in keywords: #$?*\,;'" Note: Keywords are case insensitive (for example, Vista and vista are handled as the same keyword).
Press the ENTER key. The new keyword is now available for assignment.
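The naming rules above (forbidden characters, case insensitivity) can be modeled as a small validation and normalization step. An illustrative Python sketch, not product code; it only covers user-entered keywords, since the reserved server keywords that begin with # are system-generated:

```python
FORBIDDEN_CHARS = set('#$?*\\,;\'"')   # characters the GUI rejects in new keywords

def normalize_keyword(raw):
    """Validate a user-entered keyword and return its canonical form.
    Keywords are case insensitive, so lowercase is used as the canonical
    spelling ('Vista' and 'vista' become the same keyword)."""
    if not raw or any(ch in FORBIDDEN_CHARS for ch in raw):
        raise ValueError(f"invalid keyword: {raw!r}")
    return raw.lower()

normalize_keyword("Vista")   # → 'vista'
normalize_keyword("vista")   # → 'vista' (treated as the same keyword)
```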
Related Procedures Assigning Keywords to Execution Definitions Removing Keywords from Execution Definitions
Click Execution on the workflow bar. Select an execution definition in the Execution tree. Select the Deployment tab. Click Edit in the Execution Environment portion of the page. The Assign Keywords dialog box displays. All keywords that have been defined for your execution environment are listed here. Note: For automated execution definitions, the default reserved keywords for each execution server (#<execution name>@<location name>) are included in the list.
On the Assign Keywords dialog box, select unneeded keywords in the Assigned keywords field. You can use your keyboard's CTRL and SHIFT keys to select multiple keywords using standard browser multi-select functions. Click Remove (<) to remove the keyword assignments. Click OK to close the Assign Keywords dialog box.
Note: Keywords that are no longer in use are automatically deleted from the system. Related Procedures Assigning Keywords to Execution Definitions Creating New Keywords
Click Execution on the workflow bar. Select the execution definition that will act as the master execution definition. Select the Dependencies tab. Click Add dependent Execution Definition to display the Add dependent Execution Definition dialog box. From the Condition selection list, select the condition that is to trigger the dependent execution definition (Passed, Failed, Not Executed, or Any). The Any status means that the dependent test execution will trigger no matter what the status of the previous test execution. From the tree menu in the dialog box, select the execution definition that is to be dependent. Select one of the following options to specify where the dependent execution definition is to be deployed:
- As specified in the dependent Execution Definition: Automated test definitions assigned to the dependent execution definition will be executed on the execution server specified for the dependent execution definition on the Deployment tab. Manual test definitions assigned to the dependent execution definition will be assigned to the users specified for the dependent execution definition on the Deployment tab.
- Same as <selected execution definition's execution server>: Automated test definitions assigned to the dependent execution definition will be executed on the execution server specified for the <selected execution definition's execution server> on the Deployment tab. Manual test definitions assigned to the dependent execution definition will be assigned to the users specified for the <selected execution definition's execution server> on the Deployment tab.
- Specific: Execution Server/Manual Tester: Select a pre-configured execution server and/or a manual tester from the list boxes. Automated test definitions assigned to the dependent execution definition will be executed on the specified execution server. Manual test definitions assigned to the dependent execution definition will be assigned to the specified manual tester. If only a specific manual tester is defined and no server, only manual test definitions will be executed. If only a specific execution server is defined and no manual tester, only automated test definitions will be executed.
Click OK to create the dependency. Note: Test Manager will not allow you to create cyclical execution dependencies. You can select conditions to fulfill for manual test definitions. (For example, if the selected condition is Failed and all manual tests passed but some automated tests failed, only automated test definitions assigned to the dependent execution definition will be executed.)
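Two rules in this procedure lend themselves to a sketch: the trigger condition ("Any" always fires) and the cyclical-dependency check. A hedged Python illustration; the dependency graph shape and the depth-first reachability test are assumptions about how such a check could work, not Test Manager internals, and the execution definition names are invented:

```python
def should_trigger(condition, master_status):
    """The dependent execution definition fires when the master run's
    status matches the configured condition; 'Any' always fires."""
    return condition == "Any" or condition == master_status

def creates_cycle(dependencies, master, dependent):
    """Reject a new master -> dependent link when the dependent already
    reaches the master, which would close a cycle."""
    stack, seen = [dependent], set()
    while stack:
        node = stack.pop()
        if node == master:
            return True
        if node not in seen:
            seen.add(node)
            stack.extend(dependencies.get(node, ()))
    return False

deps = {"Smoke": ["Regression"], "Regression": ["Cleanup"]}
should_trigger("Any", "Failed")           # → True
should_trigger("Passed", "Failed")        # → False
creates_cycle(deps, "Cleanup", "Smoke")   # → True: Smoke already reaches Cleanup
creates_cycle(deps, "Smoke", "Cleanup")   # → False: Cleanup reaches nothing
```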
Related Concepts Execution Dependency Configuration Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Executing Test Definitions Related Reference Execution Dependencies tab
Deleting a Dependency
To delete a dependency:

1. Click Execution on the workflow bar.
2. Select the master execution definition from which you want to delete a dependency.
3. Select the Dependencies tab.
4. In the Dependent Execution Definitions area, click the Delete icon in the Actions column.
5. Click Yes on the Delete Dependency dialog box to delete the dependency.
Related Concepts Execution Dependency Configuration Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Executing Test Definitions Related Reference Execution Dependencies tab
Editing a Dependency
Note: To edit an existing dependency, you must select the master execution definition (the definition for which a specific condition will trigger the execution of another execution definition). You cannot edit dependency settings from an execution definition that is dependent on another execution definition.
Click Execution on the workflow bar. Select the master execution definition that you are editing. Select the Dependencies tab. In the Dependent Execution Definitions area, click Edit settings in the Actions column to open the Edit Dependency dialog box. Edit the condition that is to trigger the dependent execution and execution server settings.
Related Concepts Execution Dependency Configuration Test Definition Execution Execution Definition Schedules Related Procedures Managing Test Executions - Quick Start Task Adding Dependent Execution Definitions Executing Test Definitions Related Reference Execution Dependencies tab
Select Execution on the workflow bar. Select an execution definition for which you want to add a definite run. Select the Schedules tab. Click the Custom option button. Click Add Definite Run. On the Configure Definite Run page, select the date and time when the execution definition should definitely be run. Click OK. Your definite run settings are listed on the Configure Schedule page. Click Save to add the definite run to the current schedule, or continue adding definite runs.
Related Concepts Execution Definition Schedules Test Definition Execution Execution Definitions Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Executing Test Definitions Related Reference Execution Schedule tab
Adding Exclusions
Note: You must have administrator rights to edit global schedules. To define a scheduling exclusion for a global schedule, navigate to Administration → Configuration.
Click Execution on the workflow bar. Select an execution definition for which you want to add a scheduling exclusion. Select the Schedules tab. Click the Custom option button. Click Add Exclusion. On the Configure Schedule Exclusion page, select the weekdays on which test definitions should be suppressed. Define the specific time intervals on those days during which execution should be suppressed. Click OK. Your exclusion settings are now listed on the Configure Schedule page. Click Save to add the exclusion to the current schedule, or continue adding additional exclusions.
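An exclusion suppresses runs that fall on a selected weekday inside a selected time interval. A small Python model of that check (illustrative only; the data shapes are invented for the example):

```python
from datetime import datetime, time

def is_excluded(run_at, weekdays, intervals):
    """True if run_at falls on an excluded weekday AND inside one of the
    suppressed time intervals on that day.

    weekdays:  set of weekday numbers (Monday=0 .. Sunday=6)
    intervals: list of (start, end) datetime.time pairs
    """
    if run_at.weekday() not in weekdays:
        return False
    return any(start <= run_at.time() <= end for start, end in intervals)

# Suppress executions on Saturday/Sunday between 00:00 and 06:00:
weekend_nights = ({5, 6}, [(time(0, 0), time(6, 0))])
is_excluded(datetime(2009, 6, 6, 3, 0), *weekend_nights)   # Saturday 03:00 → True
is_excluded(datetime(2009, 6, 8, 3, 0), *weekend_nights)   # Monday 03:00 → False
```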
Related Concepts Execution Definition Schedules Test Definition Execution Execution Definitions Related Procedures Managing Test Executions - Quick Start Task Editing Exclusions Adding Execution Definitions Executing Test Definitions Related Reference Execution Schedule tab
1. Click Execution on the workflow bar.
2. Select an execution definition for which you want to configure a custom schedule. Note: To schedule a folder for execution, select a folder node. To save an edited version of a global schedule as a custom schedule, click Edit while the global schedule is selected in the list box. This enables you to edit the global schedule and save the result as a custom schedule.
3. Select the Schedule tab.
4. Click the Custom option button to enable the scheduling controls.
5. Click the calendar tool next to the From field and specify when the execution schedule is to begin (Month, Day, Year, Hour, Minute).
6. Specify the interval at which the execution's tests are to be executed (Day, Hour, Minute).
7. In the Run portion of the GUI, specify when the execution is to end. Select Forever to define a schedule with no end, or click the calendar tool next to the to field and specify when the execution schedule is to end (Month, Day, Year, Hour, Minute).
8. (Optional) Click Add Exclusion to define times when scheduled execution definitions should not be executed. Or click Add Definite Run to define times when unscheduled executions should be executed.
9. Click Save to save your custom schedule.
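A custom schedule is essentially "start time + fixed interval, until an end time or forever." The sketch below enumerates the run times such a schedule would produce (an illustration of the documented behavior, not product code; exclusions and definite runs are left out for brevity):

```python
from datetime import datetime, timedelta

def scheduled_runs(start, interval, end=None, limit=10):
    """Enumerate the run times of a custom schedule: begin at 'start'
    and repeat every 'interval', either forever (end=None) or until
    'end'. 'limit' merely caps this illustration."""
    runs, current = [], start
    while len(runs) < limit and (end is None or current <= end):
        runs.append(current)
        current += interval
    return runs

scheduled_runs(datetime(2009, 6, 1, 8, 0), timedelta(hours=12),
               end=datetime(2009, 6, 2, 23, 59))
# → four runs: 06-01 08:00, 06-01 20:00, 06-02 08:00, 06-02 20:00
```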
Related Concepts Execution Definition Schedules Test Definition Execution Execution Definitions Related Procedures Managing Test Executions - Quick Start Task Adding Definite Runs Adding Exclusions Adding Execution Definitions Executing Test Definitions Related Reference Execution Schedule tab
Click Execution on the workflow bar. Select the execution definition for which you are deleting a previously configured definite run. Select the Schedule tab. In the Actions column, select the Delete icon of the definite run to be deleted.
Related Concepts Execution Definition Schedules Test Definition Execution Execution Definitions Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Executing Test Definitions Related Reference Execution Schedule tab
Click Execution on the workflow bar. Select the execution definition for which you are editing a previously configured definite run. Select the Schedule tab. In the Actions column, select the Edit Definite Run icon of the definite run you have selected to edit. Edit the definite run criteria as required and then click Save.
Related Concepts Execution Definition Schedules Test Definition Execution Execution Definitions Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Executing Test Definitions Related Reference Execution Schedule tab
1. Click Execution on the workflow bar.
2. Select the execution definition for which you are configuring a schedule. Note: To schedule a folder for execution, select a folder node.
3. Select the Schedule tab.
4. Click the Global option button.
5. Select the required pre-defined schedule from the Global list box. Details of the pre-defined schedule are displayed in a read-only calendar view.

Note: To save an edited version of a global schedule as a custom schedule, click Edit. Global schedules are configured through the Configurations link on the menu tree (Schedule tab).
Related Concepts Execution Definition Schedules Test Definition Execution Execution Definitions Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Executing Test Definitions Related Reference Execution Schedule tab
Click Execution on the workflow bar. Select the execution definition for which there is to be no schedule. Select the Schedule tab. Click the None option button.
Related Concepts Execution Definition Schedules Test Definition Execution Execution Definitions Related Procedures Managing Test Executions - Quick Start Task Adding Execution Definitions Executing Test Definitions Related Reference Execution Schedule tab
Deleting Exclusions
To delete an exclusion:

1. Click Execution on the workflow bar.
2. Select the execution definition for which you want to delete a previously configured exclusion.
3. Select the Schedule tab.
4. In the Actions column, select the Delete button of the exclusion you want to delete.
Related Concepts Execution Definition Schedules Test Definition Execution Execution Definitions Related Procedures Managing Test Executions - Quick Start Task Adding Exclusions Adding Execution Definitions Executing Test Definitions Related Reference Execution Schedule tab
Editing Exclusions
To edit an exclusion:

1. Click Execution on the workflow bar.
2. Select the execution definition for which you want to edit a previously configured exclusion.
3. Select the Schedule tab.
4. In the Actions column, select the Edit Exclusion button of the exclusion you want to edit.
5. Edit the exclusion as required and click Save.
Related Concepts Execution Definition Schedules Test Definition Execution Execution Definitions Related Procedures Managing Test Executions - Quick Start Task Adding Exclusions Adding Execution Definitions Executing Test Definitions Related Reference Execution Schedule tab
Navigate to Window → Preferences.
Enter the URL of your Test Manager installation in the Test Manager Server URL field. Select Remember Credentials if you want the Manual Testing Client to insert your login credentials automatically the next time you start the application. Enter your Test Manager Username and Password. Click Validate Connection to test your login settings. Click OK on the confirmation dialog box. Click OK to save your settings.
Related Concepts Manual Test Definitions Test Definition Execution Tour of the Manual Testing Client UI Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions
Navigate to Window → Preferences.
- Download attachments automatically: Download attached files automatically when execution definition packages are downloaded from Test Manager. This option must be enabled if you intend to work offline after you download your assigned execution packages.
- Ask for build number when completing packages: Before uploading packages to Test Manager, display a prompt requesting the build number on which the test was performed.
- Show execution dialog always on top: Have the Execute Test dialog box display on top of other open windows on your computer desktop to facilitate manual testing. When enabled, the Execute Test dialog box stays on top even when another window has the focus. When executing manual tests, you may want to keep the Execute Test dialog box on top so that you can easily enter your test results. If your computer monitor is too small to contain both the Execute Test dialog box and the application under test, you should leave this setting disabled.
- Ask for uploading workspace to SilkCentral Test Manager before closing main window: Display a confirmation prompt before closing the Manual Testing Client.
Related Concepts Manual Test Definitions Test Definition Execution Tour of the Manual Testing Client UI Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions
Navigate to Window → Preferences.
In the Packages area of the dialog box, check the Remove uploaded packages check box to define an option for automatic deletion of execution definition packages from the Manual Testing Client after packages are uploaded to Test Manager. Select one of the following deletion options:
- Immediately
- After <x> days (enter a value in the days field if you select this option)
Check the Upload completed packages immediately check box if you want to have completed test run packages uploaded to Test Manager automatically after test runs are completed.
Related Concepts Manual Test Definitions Test Definition Execution Tour of the Manual Testing Client UI Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions
Copy a screen capture to your computer's clipboard (the Paste Image button on the Result Files tab of the Manual Testing Client becomes enabled). Click Paste Image. Specify a File Name for the image on the Paste From Clipboard dialog box. Click OK to save the copied screen capture as an image file attachment.
Related Concepts Attachments Manual Test Definitions Test Definition Execution Manual Testing Client Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions
From within the Manual Testing Client, select a test definition in the Test Definitions tab. Click the Result Files tab. Click Add File to browse to and select the result file you want to upload. Click Open to attach the file.
Related Concepts Attachments Manual Test Definitions Test Definition Execution Tour of the Manual Testing Client UI Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions
From within the Manual Testing Client, select an image file in the Attachments tab. The image displays in the Image Preview field. Use the following viewing tools next to the Image Preview field to manipulate the image:
• Show Actual Size
• Scale to Fit
• Scale to Fit - Keep Aspect Ratio
• Open as Detached Window
Related Concepts Attachments Manual Test Definitions Test Definition Execution Tour of the Manual Testing Client UI Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions
From within the Manual Testing Client, select a test definition in the Test Definitions tab. The Attachments tab lists all of the result files that are associated with the selected test definition. Using the Test Container/Folders and Test Steps check boxes, you can filter the list of attachments to include only those attachments that are related to the selected test container/folder or test step.
Related Concepts Attachments Manual Test Definitions Test Definition Execution Tour of the Manual Testing Client UI Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions
In the Manual Testing Client, select the Inbox tab. Select an execution definition package. The test definitions included in the selected package are listed in the Test Definitions tab. In the Test Definitions tab, double-click a test definition. Click New Internal Issue to open the New Issue dialog box. Fill out the text boxes as described in New Issue Dialog. Click OK.
Note: You must be online during this procedure. Related Concepts Issue Management Manual Test Definitions Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions Related Reference Manual Testing Client HTML Support for Description Text Boxes
From within the Manual Testing Client, right-click a test definition in the Test Definitions tab. Select an alternative status:
• Set as Not Executed
• Set as Passed
• Set as Failed
• Set as Unresolved
• Set as Unsupported
Note: You cannot change the status of test runs that have already been completed. The statuses of execution packages in the Completed Runs tab cannot be edited.
Related Concepts Manual Test Definitions Test Definition Execution Tour of the Manual Testing Client UI Calculating the Test Definition Status Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions
From within the Manual Testing Client, click Download on the toolbar. If your connection settings have been correctly configured (and you have execution packages waiting for you), your assigned execution packages will appear in the Inbox.
Related Concepts Manual Test Definitions Test Definition Execution Manual Testing Client Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions
From within the Manual Testing Client, right-click an execution package in the Inbox. Select Edit Build Number. On the Select Build Number dialog box, select a build number from the Build list box. Note: You can refresh the build list by clicking Refresh build list. If you want to be prompted to specify a build number each time a test run is completed, check the Ask for build number when completing packages check box.
Click OK.
Related Concepts Manual Test Definitions Test Definition Execution Tour of the Manual Testing Client UI Related Procedures Managing Test Executions - Quick Start Task Configuring Package Upload Preferences Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions
Double-click an execution package on the Inbox tab to open the Execute Test dialog box. Click Edit. You can now edit the following fields on the Details tab: Planned Time, Step Names, any custom step properties that have been created for your project, Step Description, and Expected Result. Manual test steps can also be added, reordered, and removed on the Details tab. On the Description tab, the following fields can be edited: Test Definition Name and Test Definition Description.
While in Edit mode, navigate to Execute Test ➤ Details. Click Add test step on the toolbar to add a new test step to the end of the test step list. Tip: To insert a new test step into the test step list, select the test step above which the new test step is to appear, then click Insert test step on the toolbar.
While in Edit mode, navigate to Execute Test ➤ Details. Select a test step that you want to move. Click Move Up on the toolbar to move the step up one position in the test step list, or click Move Down to move the step down one position in the list.
While in Edit mode, navigate to Execute Test ➤ Description or Execute Test ➤ Details. You can insert any preconfigured Test Manager project parameter into the Test Definition Description, Step Description, and Expected Result fields. Place your cursor in one of the text fields. Click Parameters at the far right end of the toolbar. Select a preconfigured Test Manager project parameter from the list box.
After completing your manual test definition edits, click Upload to upload your results to the server. Click Yes to confirm that you want to have your changes committed to the Test Plan tree on the server.
If your changes conflict with recent changes made by another user, the Test Definition Conflicts dialog box will display, listing the test definitions that are in conflict. Tip: You can directly access any conflicting test definition in Test Manager to view what was changed by right-clicking the test definition and selecting Go to Test Definition in Test Manager.
Click Upload Changes to ignore changes made by other users and commit your changes to Test Manager, thereby overwriting any recent conflicting changes. Or click Revert Changes to discard your changes to the test definition. If you opt for Revert Changes, your changes will not be committed to the Test Plan tree; they will, however, still be visible in the execution results you are uploading. Your changes will not be included in future runs of the test definition.
Related Concepts Manual Test Definitions Test Definition Execution Test Definition Parameters Test Definitions in the Manual Testing Client Manual Testing Client Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Uploading Test Results to Test Manager Executing Test Definitions
On the Edit Code Analysis Settings dialog box, proceed with enabling code analysis for the execution definition. Note: After code analysis is enabled, you can execute your test definitions in the Manual Testing Client. However, you need to click Code Analysis: Start on the Execute Test dialog box before you actually start testing. This way Test Manager will collect code analysis information while you execute the manual test. When you are done testing, click Stop to halt the collection of code analysis information.
Related Concepts Test Manager Code Analysis Manual Test Definitions Code Coverage Analysis Related Procedures Analyzing Test Results - Quick Start Task Enabling Code Analysis for Execution Definitions Executing Manual Tests Analyzing Code Coverage Related Reference Execution Deployment tab Code Analysis Unit Interface
From within the Manual Testing Client, select the Inbox tab. Select an execution definition package. The test definitions included in the selected package appear in the Test Definitions tab. Click Execute. The Execute Test dialog box of the first test of the selected package opens, at the Details tab. The Details tab enables you to edit the results of each test step as you progress through a test. The following properties are available for the selected test definition:
• Test Definition Status shows the test status (Passed, Failed, Not Executed, Unsupported, or Unresolved). This field can be edited.
• Planned Time shows the estimated time for completion of the test.
• Used Time tracks the elapsed time since the beginning of the test execution. Note: The Used Time field can be edited. Use Suspend/Resume to stop and restart the timer if you need to edit the timer setting (or pause the timer) during a test execution.
The Test Steps portion of the dialog box lists all of the steps that comprise the selected test definition. The following properties are included for each test step:
• Step Description includes the description that has been defined for the test step. This field can be edited. Note: Test Manager supports HTML formatting and cutting/pasting of HTML content for description fields.
• Expected Result is the expected result of each test step (the success condition).
• Result includes the result of each test step as observed by the tester. Edit this field after you complete each step.
• Status includes the status of each step. Edit this field after you complete each step.
Once you have completed the first test step and edited the fields as required, select and complete any remaining test steps listed in the Test Steps field. Note: Click Next Test to open the next test in the selected execution definition (this button is displayed only if multiple test definitions exist), or click Previous Test to open the previous test in the execution definition.
Click Go to Issues to enter an issue (bug) for the selected manual test definition in Test Manager. When you have completed all steps in the test, click Finish Run to close the Execute Test dialog box.
To finish a test package before all test definitions have been executed:
If you attempt to complete testing of a test package by clicking Finish while any of the package's test definitions have a status of Not Executed, the Finish Run dialog box will display. Select a value from the list box to specify how the unexecuted test definitions should be handled:
• Remove test definitions from this run (results of unexecuted tests will be removed from the package's results)
Click OK.
Related Concepts Issue Management Manual Test Definitions Test Definition Execution Manual Testing Client Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions Related Reference Execute Test Dialog Box HTML Support for Description Text Boxes
Right-click an execution package in the Manual Testing Client's Inbox. Select Export Package. On the Export to dialog box, browse to the location where the package (.zpkg file) is to be saved and click Save. Click OK on the confirmation dialog box notifying you that the export was successful.
Navigate to File ➤ Import Package.
On the Import from dialog box, browse to the package (.zpkg file) that is to be imported and click Open. Click OK on the confirmation dialog box notifying you that the import was successful.
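Since exported packages are ordinary files (.zpkg), they can also be inspected programmatically before re-importing. The sketch below assumes a .zpkg file is a standard ZIP container, which the documentation does not confirm; treat it as a hypothetical illustration:

```python
import zipfile

# Hypothetical: assumes the exported .zpkg execution package is a plain
# ZIP archive. The documentation does not specify the package format.
def list_package_contents(zpkg_path):
    """Return the names of the files stored in an exported package."""
    with zipfile.ZipFile(zpkg_path) as pkg:
        return pkg.namelist()
```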
Related Concepts Manual Test Definitions Test Definition Execution Test Definitions in the Manual Testing Client Manual Testing Client Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions
Tip: To start SilkCentral Manual Testing Client if it is already installed, navigate to Start ➤ Programs ➤ Borland SilkCentral Test Manager ➤ Manual Testing Client.
When clicking a link to the Manual Testing Client Web Start URL (http://<Test Manager host>/ webstart/mtc/), for example in a manual testing notification email, a File Download dialog box opens. Click Open. If Manual Testing Client is not installed on your computer, the Java Web Start dialog box opens and immediately starts downloading Manual Testing Client. The download can take up to several minutes. If Manual Testing Client is already installed on your computer, Manual Testing Client opens and the following steps are not applicable.
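The Web Start URL quoted above follows a fixed pattern, so it can be built from the Test Manager host name. A minimal sketch (build_mtc_url is an invented helper name):

```python
# Builds the Manual Testing Client Web Start URL from a Test Manager
# host name, following the http://<Test Manager host>/webstart/mtc/
# pattern quoted in the documentation above.
def build_mtc_url(host):
    return f"http://{host}/webstart/mtc/"
```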
When the download has completed, a Warning Security dialog box opens, asking you if you want to run the digitally signed application. Check the Always trust content from this publisher check box, then click Run. Manual Testing Client opens.
In the Windows command line window, type javaws -viewer and press ENTER. The Java Cache Viewer opens. In the Show list box, select Applications, if not already selected. The table should now list Manual Testing Client Web Start. Select this application and click the delete button (X) on the toolbar. Manual Testing Client is now removed from your computer, and you can close the Java Cache Viewer and Java Control Panel dialog boxes.
Related Concepts Manual Test Definitions Test Definition Execution Manual Testing Client Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions
From within the Manual Testing Client, complete a manual test by clicking Finish Run on the Execute Test dialog box or Finish on the toolbar. Note that if you attempt to complete testing of a test package while any of the package's test definitions have a status of Not Executed, the Finish Run dialog box displays, on which you can define how unexecuted test definitions should be handled.
2 3
Select the Completed Runs tab. Right-click a completed test run and select Upload to Test Manager, or select Upload from the File menu, to upload your test results to Test Manager. Note: Execution definition statuses are updated automatically in the SilkCentral database when you are working online.
Select Store to SilkCentral from the File menu to store your test results to Test Manager. Alternatively, when closing Manual Testing Client, you can upload your entire workspace to Test Manager.
Related Concepts Manual Test Definitions Test Definition Execution Manual Testing Client Related Procedures Managing Test Executions - Quick Start Task Configuring Package Upload Preferences Executing Manual Tests Using the Manual Testing Client Executing Manual Tests with the Manual Testing Client Working with Manual Tests Executing Test Definitions
From within the Manual Testing Client, right-click a test definition on the Test Definitions tab. Select Go to Test Definition in Test Manager. If prompted, enter your Test Manager login credentials. You will be directed to Test Manager's Test Plan unit, where the corresponding test definition will be selected in the Test Plan tree.
Related Concepts Manual Test Definitions Test Definition Execution Manual Testing Client Related Procedures Managing Test Executions - Quick Start Task Configuring Package Upload Preferences Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions
Once you have completed your tests and have access to an Internet connection, proceed with uploading your test results.
Related Concepts Manual Test Definitions Test Definition Execution Tour of the Manual Testing Client UI Related Procedures Managing Test Executions - Quick Start Task Uploading Test Results to Test Manager Configuring Package Upload Preferences Executing Manual Tests Using the Manual Testing Client Working with Manual Tests Executing Test Definitions
Click Execution on the workflow bar. Click Continue Manual Test on the toolbar. The Manual Tests in Progress dialog box displays a list of all pending manual tests. Click Finish as not executed in the Action column of the manual test definition you want to remove. You can also abort all pending manual test definitions by clicking Remove All Tests.
Related Concepts Manual Test Definitions Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Working with Manual Tests Executing Test Definitions Related Reference Current Run Page
Click Execution on the workflow bar. Select the manual execution definition that you intend to execute. The Run dialog box displays. Define which test definitions you want to execute and click OK. Tip: To go directly to the Current Run page, uncheck Go to Activities page. Note: When you choose the Run command on a manual test execution node, you must perform the manual test yourself. When you choose the Run command on a folder, however, all included manual tests within the folder must be executed by the testers who have been assigned to the folder on the Deployment page, not by the testers who have been assigned to the individual execution definitions. Unless you have an automated test definition incorporated into the selected execution definition, you will be presented with a dialog informing you that no execution server has been specified for this execution definition. Manual tests do not use execution servers, so you can ignore this message and close the dialog.
Note: If there are already pending manual tests in the selected execution definition, the Manual Tests In Progress dialog box displays.
• Click Start New to create a new execution of the manual tests.
• Click Remove All Tests to finish all pending manual tests and set their status to Not Executed.
• Select a pending execution and click Continue to continue the selected execution.
On the Current Run page, proceed with the manual test execution.
Related Concepts Manual Test Definitions Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Aborting Manual Test Executions Working with Manual Tests Executing Test Definitions Related Reference Current Run Page Execution Deployment tab Run Dialog
Click Execution on the workflow bar. Select the execution definition with the assigned manual test that you want to execute. The Run dialog box displays. Define which test definitions you want to execute and click OK. If the selected test is already in progress, a new test run starts. Click Cancel to close the Manual Tests In Progress dialog box. The Current Run page opens. You are provided with detailed information on every test step. Click the Status of a test step and change it to the appropriate status. Repeat the previous step for all test steps. Optional: Use your keyboard's CTRL and SHIFT keys to select multiple test steps using standard browser multiselect functions. Right-click your selection and set the status of the selected test steps. Optional: Click Finish Run to finish a run without finishing every test definition. The Finish Run dialog box opens. Choose the appropriate build and the action to perform on the unfinished test definitions. If one test step fails, the whole test is marked as failed.
The Status of the manual test is changed to the cumulative status of the test steps when the test run finishes.
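The status rollup described above can be sketched as follows. Only the failed-step rule is stated in the documentation; the precedence of the other statuses in this sketch is an assumption:

```python
# Sketch of the step-to-test status rollup. The documentation states only
# that one failed step fails the whole test; the ordering of the other
# statuses below is an assumption for illustration.
def rollup_status(step_statuses):
    """Derive a manual test's overall status from its step statuses."""
    if "Failed" in step_statuses:
        return "Failed"            # documented: one failed step fails the test
    if "Not Executed" in step_statuses:
        return "Not Executed"      # assumption
    return "Passed"
```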
Related Concepts Manual Test Definitions Test Definition Execution Calculating the Test Definition Status Related Procedures Managing Test Executions - Quick Start Task Executing Manual Tests Aborting Manual Test Executions Working with Manual Tests Executing Test Definitions Related Reference Run Dialog Current Run Page
Click Execution on the workflow bar. Select the execution definition that is to be run. Click Run on the toolbar. The Run dialog box displays. Define which test definitions you want to execute. The execution definition is then queued on the specified execution server. Test definitions are executed in the order in which they are listed on the Assigned Test Definitions tab (Execution View). Details of executions can be viewed in the Projects unit, Activities tab. Note: If the execution definition contains manual tests that are still in progress, you will be presented with a list of these tests.
If the execution definition does not contain pending manual tests, the Go To Activities dialog box displays. Click Yes to view the Activities page, or click No if you want to remain on the current Web page. Note: Check the Don't show this dialog again (during this login session) check box if you do not want to be asked about switching to the Activities page again in the future. This setting will be discarded when you log out of Test Manager.
Related Concepts Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Updating Execution Definitions Assigning Keywords to Execution Definitions SilkTest Tests Working with Manual Tests Executing Test Definitions Related Reference Execution Assigned Test Definitions Tab Activities Page Run Dialog
Click Execution on the workflow bar. Select an existing folder in the Execution tree, or select the project node. Click New Execution Definition on the toolbar (or right-click within the Execution tree and choose New Child Execution Definition). The New Execution Definition dialog box displays. Enter a name and meaningful description for the execution definition. Note: Test Manager supports HTML formatting and cutting and pasting of HTML content for Description fields.
Select a test container from the Test Container list box. The Version and Build associated with the container's product are then populated automatically in the Version and Build fields. You may only associate one test container with an execution definition. Select a product Version and Build from the list boxes. If a build information file is available on the execution server, you can check the Read from Build Information file check box, in which case the version and build will be read from the build information file for the test run, overriding any manual settings selected on the New Execution Definition dialog box. Specify a Priority for the execution definition from the list box (Low, Normal, or High). In the Source Control Label field, you can optionally specify that the execution definition use an earlier version of the sources than the latest version. Click OK to update the Execution tree with the newly created execution definition.
Related Concepts Test Definition Execution Execution Definition Schedules Build Information Related Procedures Managing Test Executions - Quick Start Task Execution Definitions Working with Execution Definitions Executing Test Definitions Creating an Execution Definition in Grid View Related Reference Execution Unit Interface HTML Support for Description Text Boxes
Click Execution on the workflow bar. Select an execution definition in the Execution tree. Click Copy on the toolbar (or right-click the execution-definition node and select Copy). Select the target folder where the execution definition is to be pasted. Click Paste on the toolbar (or right-click the execution-definition node and select Paste). The Execution tree is updated with a copy of the pasted execution definition. All assigned test definitions, filters, and scheduling parameters are copied along with the execution definition.
Related Concepts Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Execution Definitions Working with Execution Definitions Executing Test Definitions Related Reference Execution Unit Interface
Click Execution on the workflow bar. Select an execution definition in the Execution tree. Click Delete on the toolbar (or right-click the execution-definition node and select Delete). Click Yes on the deletion confirmation dialog to remove the execution definition from the Execution tree.
When deleting an execution definition, the run results of assigned test definitions are also deleted. The test definition run results may still appear in reports because they are stored in the database, which is not updated immediately after the deletion of the execution definition. Related Concepts Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Execution Definitions Working with Execution Definitions Executing Test Definitions Related Reference Execution Unit Interface
Click Execution on the workflow bar. Select the node of the execution definition you are editing. Click Edit on the toolbar (or right-click the execution-definition node and select Edit). The Edit Execution Definition dialog box displays. Edit the execution definition by modifying the criteria, such as the description and values, defined in the Edit Execution Definition dialog box. If there are no runs and no test definitions assigned to the execution definition, you can choose an alternative test container for the execution definition from the Test Container list box. Click OK to save the edited execution definition.
Related Concepts Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Execution Definitions Working with Execution Definitions Executing Test Definitions Related Reference Execution Unit Interface
Select Test Plan on the workflow bar. Select the test definition you are interested in viewing. Select the Runs tab. Click the Analyze Results icon of the test execution for which you want to download results. A File Download dialog box displays, showing you the name of the Performance Manager command (.sppecmd) file that you are about to download. Click Open to open the results in Performance Manager (alternatively, you can click Save to save the results locally). If not already open in the background, Performance Manager now opens, connected directly to your Test Manager installation, and fetches the results of the selected execution run. Note: To prepare for a cross load-test report that compares the results of multiple executions in a single report, you may download the results of additional executions from the Runs tab. Additional execution results are displayed in the existing instance of Performance Manager on the Performance Manager Test Manager tab. See Performance Manager documentation for more details regarding cross load-test reports.
Related Concepts Working With SilkPerformer Projects Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Working with SilkPerformer Projects Executing Test Definitions Related Reference Execution Unit Interface
Click Test Plan on the workflow bar. Select a SilkPerformer test definition. Select the Runs tab. Click the Download Results icon of the test execution for which you want to download results. A File Download dialog box displays, showing you the name of the compressed results package (.ltz) file that you are about to download. Click Open to open the results in Performance Manager (alternatively, you can click Save to save the results locally). If not already open in the background, Performance Manager now opens. You are presented with an Import Project dialog box that indicates the target directory to which the results will be saved. Click OK to accept the default path, or click Browse to select an alternate path. The downloaded results are then displayed in Performance Manager. Note: If you accept the default Projects directory where result packages are typically stored (generally recommended), then the results will be stored with all other SilkPerformer results and will be readily accessible through the Performance Manager Add Loadtest Results command.
Related Concepts Working With SilkPerformer Projects Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Working with SilkPerformer Projects Executing Test Definitions Related Reference Execution Unit Interface
Click Test Plan on the workflow bar. Select a SilkPerformer test definition in the Test Plan tree view. On the Properties tab, scroll down to the SilkPerformer Test Properties section. Click the Download SilkPerformer Project icon. A file download dialog box displays, asking you to confirm that you wish to download the specified SilkPerformer project to your local system. Click Save to download the project. If not already open in the background, SilkPerformer will be invoked. The Select Target Directory dialog box displays, loaded with the default directory path to which the specified SilkPerformer project will be saved. If you approve of the specified pathname, click OK; otherwise click Browse to specify an alternate path. Note: Even if you have configured source-control integration, you will not be prompted to check out the SilkPerformer project from your source-control system, because you are working with this file independently of Test Manager. Note: SilkPerformer projects utilized by Test Manager can also be downloaded directly from the SilkPerformer user interface. See SilkPerformer documentation for details.
Related Concepts Working With SilkPerformer Projects Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Working with SilkPerformer Projects Executing Test Definitions Related Reference Execution Unit Interface
Click Test Plan on the workflow bar. Select a SilkPerformer test definition. Select the Properties tab. Scroll down to the SilkPerformer Test Properties section. Click Edit SilkPerformer Test Properties. Proceed with the configuration of your SilkPerformer Test.
Related Concepts Working With SilkPerformer Projects Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Working with SilkPerformer Projects Executing Test Definitions Related Reference Execution Unit Interface
Click Test Plan on the workflow bar. Select a test definition. Select the Properties tab. Scroll down to the SilkPerformer Test Properties section. Click Run Attended Test. A File Download dialog box displays, asking you to confirm that you wish to run the specified SilkPerformer command file (.spwbcmd). Click Open to open the project in SilkPerformer. If not already open in the background, SilkPerformer will be invoked. The Select Target Directory dialog box opens, loaded with the default directory path to which the specified SilkPerformer project will be saved. If you approve of the specified pathname, click OK, otherwise click Browse to specify an alternate path. The SilkPerformer Workload Configuration dialog box opens with all of the workload settings that are associated with the SilkPerformer project. Edit the workload settings as required and click Run to begin the test and monitor the test results with SilkPerformer. Note: Clicking Run without editing any workload settings executes the SilkPerformer test in exactly the same way as if the test had been executed directly from Test Manager as an unattended test.
Related Concepts Working With SilkPerformer Projects Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Working with SilkPerformer Projects Executing Test Definitions Related Reference Execution Unit Interface
Select Test Plan on the workflow bar. Select a SilkPerformer test definition. Select the Properties tab. Scroll down to the SilkPerformer Test Properties section. Click the Open SilkPerformer Project icon. A file download dialog box displays, asking you to confirm that you wish to open the specified SilkPerformer command file (.spwbcmd) in SilkPerformer. Click Open to open the file in SilkPerformer. If not already open in the background, SilkPerformer will be invoked. The Select Target Directory dialog opens, loaded with the default directory path to which the specified SilkPerformer project will be saved. If you approve of the specified pathname, click OK; otherwise click Browse to specify an alternate path. If you have configured source-control integration for Test Manager (for example, Visual SourceSafe), you will now be presented with a login screen for your source-control client. Enter valid user connection settings and click OK to continue. Note: SilkPerformer projects utilized by Test Manager can also be opened directly from SilkPerformer. See SilkPerformer documentation for details.
Related Concepts Working With SilkPerformer Projects Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Working with SilkPerformer Projects Executing Test Definitions Related Reference Execution Unit Interface
Run an attended SilkPerformer test. When the test is complete, select the Upload Results to Test Manager command from the Results menu. The Login screen of the Upload Results to Test Manager wizard displays. Enter your password and click Next. Note: Because this is an attended test, the wizard already knows the appropriate hostname and username of the test definition to which these results are to be uploaded.
If not already selected by default in the project list, select the SilkCentral Test Manager project to which you want to upload the SilkPerformer results. If not already selected by default in the tree list, select the test definition to which you want to upload the results. Click Next. Note: You can right-click in the tree and use the commands on the context menu to create a new test definition, child test definition, test folder, and/or child test folder to which the results can be saved.
On the subsequent screen you can specify Version and Build numbers for the assigned Product to which the uploaded results belong. Also specify the SilkPerformer Test result status (for example, Passed or Failed). Note: If any errors occurred during the test run, the Test Result status will be set to Failed by default.
Click Finish to upload the results. Uploaded results appear in Test Manager on the Runs tab (Test Plan unit) in the Test Definition Runs column.
Related Concepts Working With SilkPerformer Projects Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Executing Attended SilkPerformer Tests Working with SilkPerformer Projects Executing Test Definitions Related Reference Execution Unit Interface
Click Execution on the workflow bar. Right-click a node within the Execution tree and select a collapse or expand option.
Related Concepts Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Executing Test Definitions Related Reference Execution Unit Interface
Click Execution on the workflow bar. Click the execution definition for which you are assigning a setup or cleanup test definition. Click the Setup/Cleanup tab.
To define a setup test definition, begin with the setup steps that follow. To define a cleanup test definition, skip ahead to the cleanup steps.
Click Edit in the Setup Test Definition portion of the tab. The Edit Setup Test Definition dialog box displays. Browse through your project's test-planning tree and select the test definition that is to serve as this execution definition's setup test definition. Click OK. The assigned test definition then displays in the Setup Test Definition list. Click Edit in the Cleanup Test Definition portion of the tab. The Edit Cleanup Test Definition dialog box displays. Browse through your project's test-planning tree and select the test definition that is to serve as this execution definition's cleanup test definition. Click OK. The assigned test definition now displays in the Cleanup Test Definition list.
Related Concepts Setup and Cleanup Test Definitions Execution Definitions Test Definition Execution Related Procedures Managing Test Executions - Quick Start Task Executing Test Definitions Related Reference Execution Setup/Cleanup tab
Click Execution on the workflow bar. Create an execution definition using a data-driven test definition. Note: When a test definition is configured so that each data row is run as a single test definition, the execution definition includes a separate test definition for each data row. To create an execution definition with only a selection of data-driven test definitions, you need to assign test definitions with the filter option. See the related concept for details.
In the Execution unit, select an execution definition that is based on a data-driven test definition. Click Activities in the workflow bar. Click the Run ID of the relevant execution definition. In the Assigned Test Definitions table, click the name of a data-driven test definition. Note: If you are running a multiple data-driven test, you will see one test definition for each data row in your data source.
The results page for that particular test definition opens. Select the Data Driven tab. Here you can view all instances of the test definition that were executed. Note: The test definition's data-driven properties are listed on the Details tab in the Data Driven Properties table.
Click an instance name to view test-definition run details for that specific instance. Note: If you are working with multiple data-driven test definition instances, a separate instance will be created for each data row in your data source.
Click the Parameters tab to view the data source values that were used during this specific test run.
Related Concepts Data-Driven Tests Execution Definitions Test Definition Execution SilkTest Test Definitions Automated Execution of SilkTest Test Definitions Related Procedures Managing Test Executions - Quick Start Task Creating Test Definitions Working with Data-Driven Tests Executing Test Definitions Related Reference Execution Unit Interface Execution Assigned Test Definitions Tab
Managing Issues
This section explains how to manage issues with SilkCentral Issue Manager. In This Section Tracking Issues This section explains how to track issues with SilkCentral Issue Manager. Working with Issues This section explains how to work with issues in Test Manager.
Tracking Issues
This section explains how to track issues with SilkCentral Issue Manager. In This Section Viewing Issue Statistics in Details View Describes how to view issue statistics in the Details View. Viewing Issue Statistics in Document View Describes how to view issue statistics in Document View.
Click Issues on the navigation tree. Click Details View on the toolbar. Select the tree node (project, issue-tracking system, or product) for which you want to view statistics. The calendar feature enables you to specify the time period over which you want to view issue statistics. Click the time-frame dates link to expand the calendar. Using the calendar's From and To list boxes, specify start and end times for issue statistics. Click Update to update the chart view based on the specified time range.
Related Concepts Issue Management SilkCentral Issue Manager Test Definition Execution Upload Manager Related Procedures Managing Test Executions - Quick Start Task Tracking Issues Working with Issues Executing Test Definitions Related Reference Issues Unit Interface Execution Unit Interface Calendar Tool
Click the Issues link on the menu tree. Click Document on the toolbar. Select the tree node (project, issue-tracking system, or product) for which you want to view statistics.
Related Concepts Issue Management SilkCentral Issue Manager Test Definition Execution Upload Manager Related Procedures Managing Test Executions - Quick Start Task Tracking Issues Working with Issues Executing Test Definitions Related Reference Issues Unit Interface Execution Unit Interface
Click Test Plan on the workflow bar. Click Test Plan View on the toolbar. Select the node of the test definition for which you want to assign an external issue. Select the Issues tab. Click Assign External Issue to open the Assign External Issue dialog box. Select the profile of the pre-configured, external issue-tracking system where the issue is tracked. In the External ID field, manually enter the unique alpha-numeric ID of an existing issue in the external issuetracking system. Click OK.
Related Concepts Issue Management Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Working with Issues Managing Test Plans Related Reference Test Plan Issues Page Issues Unit Interface
Click Test Plan on the workflow bar. Click Test Plan View on the toolbar. Select the node of the test definition for which you want to create a new issue. Select the Issues tab. Click New Issue to open the New Issue dialog box. Select the profile of the issue-tracking system you are submitting the issue to. The Profile list box shows the internal (always available) and any external issue-tracking profiles that you may have created. Select Internal to save the issue only to the Test Manager database. Select an external profile to have the new issue saved to both the external tool and Test Manager. Note: The profile you select here becomes the default selection when you enter new issues in the future. When adding a new issue to an external tracking system, you will be prompted to provide login credentials for the external system. The credentials that you provide will be automatically preselected for you in the future. If you do not provide credentials, the default credentials stored in the profile will be used.
Enter a brief Synopsis of the issue. Enter a meaningful Description of the issue. Specify the Status of the issue (Open, Fixed, Verified, Deferred, Closed). When using an external profile, status is set by the external tool. Specify the ExternID of the corresponding issue in the external issue-tracking profile. Note: The ExternID is the corresponding issue ID in the external tool. This option is disabled when you have specified an external issue-tracking profile because the external tool sets this value. When the Internal profile is selected, this value can be set manually.
Specify the ExternLink of the issue-tracking profile. Note: The ExternLink is the HTTP link to the issue in the external tool. This option is disabled when you have specified an external issue-tracking profile because the external tool sets this value (when the tool offers direct HTTP links to issues, as is the case with Issue Manager). When a link is specified, the ExternID is shown as a link in the issue list. Depending on the issue-tracking profile you are working with, the New Issue dialog box may include other tracking fields that are specific to the external issue-tracking tool.
Click Save to save the new issue. Note: Issue Manager determines the ID numbers of newly created issues.
Related Concepts Issue Management Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Working with Issues Managing Test Plans Related Reference Test Plan Issues Page Issues Unit Interface
Deleting Issues
To delete an issue from the issue-tracking system:
Click Test Plan on the workflow bar. Click Test Plan View on the toolbar. Select the node of the test definition for which you want to delete an issue. Select the Issues tab. Click the delete icon of the issue you want to delete. Click Yes on the Delete Issue dialog box to confirm the deletion. External issues are not affected when internal issues are deleted.
Related Concepts Issue Management Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Working with Issues Managing Test Plans Related Reference Test Plan Issues Page
Click Issues on the navigation tree. Select an issue in the menu tree. Select the Details view. Click the time-frame dates link in the top-left corner of the tab view to open the calendar. Using the calendar's From and To list boxes, specify the date/time range for which you want to view issues. Click Update to refresh the tab view with the issue listings that fall within the time frame you have specified.
Click Test Plan on the workflow bar. Click Test Plan View on the toolbar. Select the node of the test definition for which you are updating a corresponding external issue. Select the Issues tab. Click Update states of external Issues to synchronize the state of the issues listed in Test Manager with the corresponding issues in the external tool.
Related Concepts Issue Management Test Plan Management Upload Manager Related Procedures Managing Test Plans - Quick Start Task Working with Issues Managing Test Plans Related Reference Test Plan Issues Page Issues Unit Interface
Managing Projects
This section explains how to manage projects in Test Manager. In This Section Managing Folders This section explains how to manage folders in Test Manager. Creating Build Information Files How to create a dedicated file that contains appropriate build information for a Test Manager project. Selecting Projects How to select projects in Test Manager.
Managing Folders
This section explains how to manage folders in Test Manager. In This Section Copying Folders How to copy a folder. Cutting Folders How to cut a folder. Deleting Folders How to delete a folder. Editing Folders How to edit a folder. Pasting Folders How to paste folders. Pasting Folders as Child Folders How to paste folders as child folders. Sorting Folders How to sort folders. Adding Folders How to add a new folder.
Copying Folders
To copy a folder:
To copy a test executions folder, click Execution on the workflow bar. To copy a reports folder, click Reports on the workflow bar. Select a folder in the Reports/Execution tree. Click Copy on the toolbar to add a copy of the folder and its contents to the clipboard.
Related Concepts Successful Test Management Related Procedures Managing Folders Managing Projects Related Reference Projects Unit Interface
Cutting Folders
Cutting a folder differs from deleting a folder in that the folder and its contents are saved to the clipboard for subsequent pasting.
To cut a folder:
To cut a test executions folder, click Execution on the workflow bar. To cut a reports folder, click Reports on the workflow bar. Select a folder in the Reports/Execution tree. Click Cut on the toolbar to move the folder and its contents to the clipboard. Note: When you cut a folder, all sub-folders and reports contained within the folder are displayed in blue italics. Elements remain in this state until you select a new location and click Paste on the toolbar (or until you right-click an element within the cut group and select Undo Cut).
Related Concepts Successful Test Management Related Procedures Managing Folders Managing Projects Related Reference Projects Unit Interface
Deleting Folders
To delete a folder:
To delete a test executions folder, click Execution on the workflow bar. To delete a reports folder, click Reports on the workflow bar. Select a folder in the Reports/Execution tree. Click Delete on the toolbar. On the confirmation dialog box, click OK to permanently delete the folder.
Related Concepts Successful Test Management Related Procedures Managing Folders Managing Projects Related Reference Projects Unit Interface
Editing Folders
To edit an existing folder:
To edit a test executions folder, click Execution on the workflow bar. To edit a reports folder, click Reports on the workflow bar. Select a folder in the Reports/Execution tree. Click Edit Folder on the toolbar. On the Edit Folder dialog box, edit the Name and Description of the folder. Check the Share this folder with other users check box if you want to make this folder available to other users. Click OK to accept your changes.
Related Concepts Successful Test Management Related Procedures Managing Folders Managing Projects Related Reference Projects Unit Interface
Pasting Folders
To paste a folder:
To paste a test executions folder, click Execution on the workflow bar. To paste a reports folder, click Reports on the workflow bar. Select an existing node (report, execution definition, or other folder) in the Reports/Execution tree where you want the copied folder to appear. Click Paste on the toolbar. The folder will appear on the same node level as the destination node you select.
Related Concepts Successful Test Management Related Procedures Managing Folders Managing Projects Related Reference Projects Unit Interface
To paste a test execution folder as a child folder, click Execution on the workflow bar. To paste a reports folder as a child folder, click Reports on the workflow bar. Select an existing node (report, execution definition, or other folder) in the Reports/Execution tree where you want the copied folder to appear. Click Paste as child on the toolbar. The folder will appear as a sub-node of the selected node.
Related Concepts Successful Test Management Related Procedures Managing Folders Managing Projects Related Reference Projects Unit Interface
Sorting Folders
To move a folder up or down within the Reports and Execution trees:
To move a test executions folder, click Execution on the workflow bar. To move a reports folder, click Reports on the workflow bar. Select the folder you want to move. Click either Move Up or Move Down on the toolbar.
Related Concepts Successful Test Management Related Procedures Managing Folders Managing Projects Related Reference Projects Unit Interface
Adding Folders
To add a new folder:
To add a test executions folder, click Execution on the workflow bar. To add a reports folder, click Reports on the workflow bar. Select an existing node (report, execution definition, or other folder) in the tree where you want the new folder to display. The folder will display as a sub-node of the selected folder level. Click Add Folder on the toolbar. The New Folder dialog box displays. Specify a Name and Description for the folder. Check the Share this folder with other users check box if you want to make this folder available to other users. Click OK to create the folder.
Related Concepts Successful Test Management Related Procedures Managing Folders Managing Projects Related Reference Projects Unit Interface
On both the application and execution servers, navigate to C:\Documents and Settings\All Users\Application Data\Borland\SCC32\BuildInfos and create a build info file for your project based on the template file BuildInfoExample.xml (shown below and available at that path).
<?xml version="1.0" encoding="utf-8"?>
<ProjectBuildInfo>
  <BuildEntryList>
    <BuildEntry name="Demo Product">
      <Version>3.1</Version>
      <BuildNr>350</BuildNr>
    </BuildEntry>
    <BuildEntry name="Product2">
      <Version>4.2</Version>
      <BuildNr>613</BuildNr>
    </BuildEntry>
  </BuildEntryList>
</ProjectBuildInfo>
Note: To improve the structure of build information files, an element called BuildEntryList, which contains a list of BuildEntry elements, has been introduced. Each BuildEntry element refers to a specific product, identified by its name attribute.
Version (used on both application and execution servers): Version number currently available for testing (not necessarily the same for each execution server).
BuildNr (used on both application and execution servers): Build number currently available for testing (not necessarily the same for each execution server).
Distribute the build information file to the execution servers: C:\Documents and Settings\All Users\Application Data\Borland\SCC32\BuildInfos Note: When stored on both the application server and execution servers, build information files must have the exact same name.
Once you have created the build information files on the application server and each execution server, you must specify the file name in the settings of the corresponding project. Select the Projects unit to view the list of projects assigned to you. Select the project to which you want to link the build information. Note: This must be done before the scheduling of any test definitions for the project. Otherwise previously scheduled test definitions will not be updated.
Select the Project Settings tab. Click Edit to edit the project settings of the selected project on the Edit Project Settings dialog box. Specify the name of the previously created XML file in the Build information file name field. Click OK to update the information. With all future test executions, Test Manager will read build information from the corresponding file and match test results with that information.
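Before distributing a build information file to the execution servers, it can be useful to verify its structure programmatically. The following sketch parses a build information file using the element names shown in BuildInfoExample.xml above; the parse_build_info helper is purely illustrative and is not part of Test Manager.

```python
import xml.etree.ElementTree as ET

def parse_build_info(xml_text):
    """Return {product name: (version, build number)} from a build info file."""
    root = ET.fromstring(xml_text)  # <ProjectBuildInfo> root element
    info = {}
    # Each BuildEntry identifies a product via its name attribute.
    for entry in root.findall("BuildEntryList/BuildEntry"):
        name = entry.get("name")
        version = entry.findtext("Version")
        build_nr = entry.findtext("BuildNr")
        info[name] = (version, build_nr)
    return info

example = """<?xml version="1.0" encoding="utf-8"?>
<ProjectBuildInfo>
  <BuildEntryList>
    <BuildEntry name="Demo Product">
      <Version>3.1</Version>
      <BuildNr>350</BuildNr>
    </BuildEntry>
  </BuildEntryList>
</ProjectBuildInfo>"""

print(parse_build_info(example))  # {'Demo Product': ('3.1', '350')}
```

Running such a check on both the application-server and execution-server copies also helps confirm that the files carry the expected version and build numbers before any test definitions are scheduled.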
Related Concepts Build Information Build Information Updates Successful Test Management Related Procedures Managing Projects Managing a Successful Test Related Reference Projects Unit Interface
Selecting Projects
To select a project:
Navigate to Projects ➤ Projects and select a project.
Related Concepts Test Definition Execution Related Procedures Managing Projects Executing Test Definitions Related Reference Projects tab
Managing Activities
This section explains how to manage upcoming, current, and recently-executed test runs. In This Section Deleting Last Executions Runs How to delete a run from the Last Executions list. Displaying/Hiding Columns on the Activities Page Describes how to display/hide columns on the Activities page. Entering Issues From the Activities Tab Explains how to enter issues from the Activities tab. Filtering Test Runs on the Activities Page Describes how to filter test results and execution definitions on the Activities page. Grouping Test Runs on the Activities Page Describes how to group execution definitions or test results for easier viewing on the Activities page. Removing Activities Filters Explains how to remove filters that have been applied to columns on the Activities page. Reordering Columns on the Activities Page Describes how to reorder columns on the Activities page. Resizing Columns on the Activities Page Describes how to change the width of columns on the Activities page. Restoring Default Activities Page View Settings Explains how to restore the default view settings on the Activities page. Sorting Test Runs on the Activities Page Describes how to sort test runs on the Activities page.
Click Activities on the workflow bar. In the Last Executions area of the Activities tab, right-click the test run you want to delete and select Delete Run Results. A dialog box displays, asking you to confirm the deletion. Click OK.
Related Concepts Project Management Test Definition Execution Related Procedures Managing Projects Executing Test Definitions Related Reference Activities Page
Click Activities on the workflow bar. Right-click a column header. Expand the Columns submenus to view all the columns that are available in the project. Select the checkboxes of all the columns that you want to have displayed. Your column-display preferences will be saved and displayed each time you open the active project.
Related Concepts Project Management Test Definition Execution Related Procedures Managing Projects Executing Test Definitions Related Reference Activities Page
Click Activities on the workflow bar. In the Last Executions area, click the Run ID of the relevant execution definition to view test-execution results. Each test definition associated with the execution run is listed in the Assigned Test Definitions table at the bottom of the view. Click Create a new issue for this test definition (in the Actions column) of the test definition to which you want to associate the issue. Proceed with defining the issue.
Related Concepts Project Management Test Definition Execution Related Procedures Managing Projects Executing Test Definitions Related Reference Activities Page
Click Activities on the workflow bar. Right-click the header of the text-based column that the filter is to be based on. Expand the Filter submenu on the context menu to display the Filter text box. Enter a text string into the text box. Press ENTER. All entries that match the filter criteria (for example, in the case of execution-definition names, all execution-definition names that include the specified string) are then dynamically displayed in the filtered list.
Click Activities on the workflow bar. Right-click the header of the date-based column that the filter is to be based on. Hold your cursor over Filter on the context menu to display the Before, After, or On submenu. Hold your cursor over After to define a date before which (and including) all entries should be excluded. Hold your cursor over Before to define a date after which (and including) all entries should be excluded. Hold your cursor over On to exclude all entries except those that have the specified date. The calendar tool displays. Select a date using the calendar tool (or click Today to specify today's date). Tip: You must explicitly click a date on the calendar tool or press ENTER to activate date-based filtering changes.
All entries that match the filter criteria are then dynamically displayed in the filtered list.
Click Activities on the workflow bar. Right-click the header of the number-based column that the filter is to be based on. Expand the Filter submenu on the context menu to display the > (greater than), < (less than), and = (equals) operators. Enter a number in the > text box to define a number less than which (and including) all entries should be excluded. Enter a number in the < text box to define a number greater than which (and including) all entries should be excluded. Enter a number in the = text box to exclude all entries except those that have the specified number. Note: Number values are rounded to two decimal places.
Press ENTER.
All entries that match the filter criteria are then dynamically displayed in the filtered list.
Click Activities on the workflow bar. Right-click the header of the Boolean-based column that the filter is to be based on. Expand the Filter submenu on the context menu to display the available values. Click one of the Yes or No option buttons. All entries that match the filter criteria are then dynamically displayed in the filtered list.
Click Activities on the workflow bar. Right-click the header of the column that has a predefined filter value that the filter is to be based on. Expand the Filter submenu on the context menu to display the available values. Check the check boxes of the filter values that you are interested in. All entries that have one of the selected criteria will be displayed.
Related Concepts Project Management Test Definition Execution Related Procedures Managing Projects Executing Test Definitions Related Reference Activities Page
Click Activities on the workflow bar. Right-click the header of the column that the sort is to be based on. Select Group by This Field. Entries are then organized into groups based on commonly-shared values within the column you have selected.
To remove grouping:
Click Activities on the workflow bar. Right-click any column. Uncheck the Show in Groups check box.
Related Concepts Project Management Test Definition Execution Related Procedures Managing Projects Executing Test Definitions Related Reference Activities Page
Click Activities on the workflow bar. Note: You can identify filtered columns by their titles, which are displayed in bold, italic text.
Right-click the header of the column that has the filter you want to remove. Uncheck the Filter check box.
Click Activities on the workflow bar. Right-click any column header. Select Reset Filters.
Related Concepts Project Management Test Definition Execution Related Procedures Managing Projects Executing Test Definitions Working with Filters Related Reference Activities Page
Click Activities on the workflow bar. Select the column header of the column you want to move. Drag the column to the desired position and release it. Your column-order preferences will be saved and displayed each time you open the active project.
Related Concepts Project Management Test Definition Execution Related Procedures Managing Projects Executing Test Definitions Related Reference Activities Page
Click Activities on the workflow bar. Select the vertical column-header divider of the column you want to adjust. Drag the column boundary to the desired position and release it. Your column-width preferences will be saved and displayed each time you open the active project.
Related Concepts Project Management Test Definition Execution Related Procedures Managing Projects Executing Test Definitions Related Reference Activities Page
Click Activities on the workflow bar. Right-click any column header. Select Reset View.
Related Concepts Project Management Test Definition Execution Related Procedures Managing Projects Executing Test Definitions Related Reference Activities Page
Click Activities on the workflow bar. Right-click the header of the column you want the test runs to be sorted by. Select Sort Ascending to have the test runs sorted in ascending order (or select Sort Descending to have the test runs sorted in descending order). Your sort preferences will be saved and displayed each time you open the active project.
Related Concepts Project Management Test Definition Execution Related Procedures Managing Projects Executing Test Definitions Related Reference Activities Page
Managing Reports
This section explains how to work with reports in Test Manager. In This Section Creating Reports This section explains how to create reports in Test Manager. Customizing Reports with BIRT This section explains how to customize Test Manager reports using BIRT. Generating Reports This section explains how to generate reports in Test Manager. Adding Subreports Describes how to add subreports. Deleting Subreports Describes how to delete a subreport. Displaying Charts Describes how to display a chart. Accessing MRU (Most Recently Used) Reports Explains how to access a recently-viewed report. Editing Report Parameters Describes how to edit report parameters. Editing Report Properties Describes how to edit report properties. Printing Charts Describes how to print charts. Removing Charts Describes how to remove charts.
Creating Reports
This section explains how to create reports in Test Manager. In This Section Creating New Reports How to create a new report. Writing Advanced Queries with SQL How to write advanced SQL queries for reporting.
Click Reports on the workflow bar. In the Reports directory tree, select the folder in which you want the new report to appear. This determines where the report will be stored in the directory tree. Click New Child Report on the toolbar. On the Create New Report dialog box, enter the name of the new report. This is the name that will appear in the Reports tree. Check the Share this report with other users check box if you want to make this report available to other users.
In the Timeout [s] field, specify the maximum time period in seconds that Test Manager should wait for SQL queries to complete. From the Default tab list box, select the tab that you want to be directed to when you select this report from one of the context-sensitive report lists. Select the corresponding result type from the Result Category list box. This setting specifies the database table and view that is to be filtered for the report. The following result types are available:
Requirement - Returns requirements available in the requirements module that meet the query criteria.
Test Definition - Returns test definitions available in the Test Plan module that meet the query criteria.
Test Definition Execution - Returns executed test definition results from the Executions module that meet the query criteria.
Execution Definition - Returns execution definitions from the execution module.
Issue - Returns issues (including imported issues).
Requirement Progress Builds - Contains information on requirements progress per build so that you can see how requirements develop across builds.
Requirement Progress Days - The same as Requirement Progress Builds, but shows development on a daily basis.
Test Definition Progress Builds - Shows how test definitions develop across builds.
Test Definition Progress Days - The same as Test Definition Progress Builds, but shows development on a daily basis.
Each result type offers a set of selection criteria. Based on the Result Type you have selected, specify an appropriate Selection Criteria for your report. These criteria typically group properties based on a view or some other intuitive grouping (for example, custom properties).
From the Property list box, select the property that is to be filtered on. For some selection criteria, properties are dynamic. Select an Operator for the query. The available operators depend on the property. Example operators are =, not, like, and not like. Strings are always compared lowercase. Allowed wildcards for strings are * and ? (where * matches any characters and ? matches exactly one character). Select or specify the Value that the query is to be filtered on. For date-based properties, the Value field is replaced with a calendar tool that you can use to select a specific date. (Optional) To add an additional query string to this report, click More. An existing query string can be deleted by clicking the string's Delete button. When multiple query strings are defined, AND and OR radio buttons appear next to the More button. Use these option buttons to define whether all query criteria must be met cumulatively (AND), or whether only one query string's criteria needs to be met (OR).
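The wildcard behavior described above can be sketched in a few lines. The translate_wildcards helper below is purely illustrative (Test Manager performs any such mapping internally); it converts the documented * and ? wildcards into the standard SQL LIKE metacharacters % and _, lowercasing the pattern in keeping with the lowercase comparison rule.

```python
def translate_wildcards(pattern):
    """Map the documented report wildcards to SQL LIKE syntax:
    * matches any characters -> %, ? matches exactly one character -> _.
    Strings are always compared lowercase."""
    return pattern.lower().replace("*", "%").replace("?", "_")

# A 'like' filter on "Login*Test?" would correspond to LIKE 'login%test_'.
print(translate_wildcards("Login*Test?"))  # login%test_
```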
Click Next to configure report columns on the New Report dialog box.
To create columns:
1 Click Add Columns to display the Add Columns dialog box. All available report columns are listed. Select those that you want to include in the report and click OK (multiple columns can be selected by holding down the CTRL key). Note: For test-planning reports, the list of available column names is supplemented with the column names from the LQM_v_tests table. See the SilkCentral database documentation for full details. The selected columns appear in tabular format on the New Report dialog box, where you can configure how each report column is to be displayed.
2 For each column, specify a sort direction (ascending, descending, or unsorted) using the up/down arrows in the Sorting column. When a column is selected for sorting, a list box is displayed in the Sort Order column that allows you to more easily edit the column-sort order. Set these numbers as required.
3 Give each column an Alias. This is the name by which each column will be labeled in the generated report.
4 With grouping, you can take advantage of SQL aggregation features (for example, selecting a number of elements or querying a total sum of values). Check the Group by check box on the column selection dialog to specify that SQL group-by functions are to be applied. Columns that are not selected for SQL group-by functions are set to aggregation by default (meaning a single aggregate value will be calculated). From the Aggregation list box, select the appropriate aggregation type (Count, Sum, Average, Minimum, or Maximum).
5 The Actions column enables you to move column listings up and down in the view. The Move Up and Move Down functions do not affect the outcome of the report. Note: Any report column can be deleted by clicking the column's Delete button.
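The Group by and Aggregation options described above map onto standard SQL GROUP BY and aggregate functions. A minimal sketch using an in-memory SQLite table (hypothetical table and column names, not the Test Manager schema) shows what Count and Average aggregation over a grouped column produce:

```python
import sqlite3

# Hypothetical table and columns for illustration; this is NOT the
# Test Manager schema, only a demonstration of SQL group-by aggregation.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE results (test_name TEXT, duration INTEGER)")
con.executemany("INSERT INTO results VALUES (?, ?)",
                [("login", 10), ("login", 14), ("search", 7)])

# Group by test_name; Count and Average correspond to COUNT() and AVG().
rows = con.execute(
    "SELECT test_name, COUNT(*), AVG(duration) "
    "FROM results GROUP BY test_name ORDER BY test_name"
).fetchall()
print(rows)  # [('login', 2, 12.0), ('search', 1, 7.0)]
```

Columns not selected for group-by are collapsed to a single aggregate value per group, which is why the report sets them to aggregation by default.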
Related Concepts New Reports Report Generation Related Procedures Analyzing Test Results - Quick Start Task Creating Reports Customizing Reports with BIRT Generating Reports Using Context-Sensitive Reports Managing Reports Related Reference Reports Unit Interface
1 Click Reports on the workflow bar. In the Reports directory tree, select the folder in which you want the new report to appear (Requirements, Test Plan, Issues, and so on). This determines where the report will be stored in the directory tree.
2 Click Create New Report on the toolbar.
3 On the Create New Report dialog box, enter the name of the new report. This is the name that will appear in the Reports tree.
4 Check the Share this report with other users check box if you want to make this report available to other users.
5 Enter a description of the report in the Description field.
6 Click Advanced to open the Report data query field.
7 Insert previously written code as necessary, or write new code directly in the field. To assist you in writing SQL queries, a list box of Test Manager function placeholders is available; see the following section for details regarding the available placeholders. To insert one of the available pre-defined functions, select the corresponding placeholder from the Insert placeholder list box. Note: If you manually edit SQL code for the query, click Check SQL to confirm your work.
8 Once you have completed editing the report's properties, click Finish to save your settings.
Related Concepts New Reports Report Generation Related Procedures Analyzing Test Results - Quick Start Task Creating Reports Customizing Reports with BIRT Generating Reports Using Context-Sensitive Reports Managing Reports Related Reference Reports Unit Interface
1 In Test Manager, select a report that utilizes the BIRT Report Template.
2 Select the Properties tab.
3 Click Download BIRT Report Template.
You receive the report data as a generic, empty BIRT report template; the data source is already configured. Save the template to your local system and modify it as required. When you are done, upload it using the Upload link on the Report tab. For detailed information on configuring BIRT report templates, refer to the SilkCentral Administration Module Help.
Related Concepts New Reports Report Generation Related Procedures Analyzing Test Results - Quick Start Task Creating Reports Customizing Reports with BIRT Generating Reports Managing Reports Related Reference Report Properties tab
1 In Test Manager, select a report that utilizes the template you want to modify.
2 Select the Properties tab.
3 Click the download link of the template you want to download. The available download links are:
Download Excel Report Template
Download BIRT Report Template
Download as CSV
Download as XML
Here are details about the available template formats:
MS Excel - You receive an MS Excel file with a sheet named DATA that contains the data (for example, in CSV format). This is the only affected sheet in the template, so you can specify information in adjoining sheets (for example, diagrams).
BIRT report template - You receive the report data as a generic BIRT report template (empty). The data source is already configured.
CSV (Comma Separated Values) - You receive the report data as a CSV file. Depending on your local settings, you will receive , or ; as the delimiter character. The date is also formatted based on user settings.
XML - You receive the report data as XML. The advantage of this approach over CSV is that you retain all subreport data.
Accessing data outside of Test Manager - You can call a specific URL that offers the report data, using the following format: http://server/servicesExchange?hid=reportData&userName=<username>&passWord=<password>&reportFilterID=<ID of the report>&type=<csv|xml>
4 The File Download dialog box displays. Click Save and download the report file to your local system as a .rptdesign or .xls file, depending on the report type that you are downloading.
5 Now edit the report based on your needs using either BIRT RCP Designer (for .rptdesign files) or Excel (for .xls files).
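The report-data URL described under "Accessing data outside of Test Manager" can be assembled programmatically. This sketch only constructs the URL string; the server name, credentials, and report ID are placeholder values that you must replace with your own:

```python
from urllib.parse import urlencode

# Sketch of assembling the report-data URL from the documentation.
# Server, credentials, and report ID below are placeholders, not real values.
def report_data_url(server, username, password, report_id, fmt="csv"):
    assert fmt in ("csv", "xml")  # the documented type values
    query = urlencode({
        "hid": "reportData",          # fixed handler ID from the URL format
        "userName": username,
        "passWord": password,         # capitalization as documented
        "reportFilterID": report_id,  # the ID of the report
        "type": fmt,
    })
    return f"http://{server}/servicesExchange?{query}"

url = report_data_url("tmserver", "admin", "secret", 25, "xml")
print(url)
```

Note that the credentials travel in the query string over plain HTTP, so treat such URLs as sensitive.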
Related Concepts New Reports Report Generation Related Procedures Analyzing Test Results - Quick Start Task Uploading Report Templates Creating Reports Customizing Reports with BIRT Generating Reports Managing Reports Related Reference Report tab
Generating Reports
This section explains how to generate reports in Test Manager.
In This Section
Using Context-Sensitive Reports - Explains how to enable and access context-sensitive reports.
Removing Report Templates - Describes how to remove the current report's template.
Saving Reports - Describes how to save a report.
Uploading Report Templates - Describes how to upload a template from your local system.
Viewing a Report as a PDF - Describes how to view a report in PDF format.
Viewing Reports - Describes how to generate a report.
1 Click Executions on the workflow bar to go to the Executions unit.
2 Right-click an execution definition in the Executions menu tree and choose Reports.
3 Select a report from the Reports sub-menu. You are then taken to the report's Parameters tab in the Reports unit, where the execution definition's ID is pre-populated as a value. Note: You can configure this destination-tab linking behavior using each report's Edit Report dialog box.
4 Edit the report's parameters as required.
5 Advance to the report's Data, Report, or Chart tab to complete configuration of the report.
1 Click Executions on the workflow bar to go to the Executions unit.
2 Click the Runs tab.
3 Right-click a run and choose Reports.
4 Select a report from the Reports sub-menu. You are then taken to the report's Parameters tab in the Reports unit, where the run's ID is pre-populated as a value. Note: You can configure this destination-tab linking behavior using each report's Edit Report dialog box.
5 Edit the report's parameters as required.
6 Advance to the report's Data, Report, or Chart tab to complete configuration of the report.
Related Concepts Context-Sensitive Reports Related Procedures Accessing Context-Sensitive Reports Writing Advanced Queries with SQL Enabling Context-Sensitive Reports Creating New Reports
1 Click Requirements on the workflow bar to go to the Requirements unit.
2 Right-click a requirement in the Requirements menu tree and choose Reports.
3 Select a report from the Reports sub-menu. You are then taken to the report's Parameters tab in the Reports unit, where the requirement's ID is pre-populated as a value. Note: You can configure this destination-tab linking behavior using each report's Edit Report dialog box.
4 Edit the report's parameters as required.
5 Advance to the report's Data, Report, or Chart tab to complete configuration of the report.
Related Concepts Context-Sensitive Reports Related Procedures Accessing Context-Sensitive Reports Enabling Context-Sensitive Reports Writing Advanced Queries with SQL Creating New Reports
1 Click Test Plan on the workflow bar to go to the Test Plan unit.
2 Right-click a test definition in the Test Plan menu tree or the Test Plan Grid View and choose Reports. Note: When multiple test definitions are selected in the Test Plan Grid View, context-sensitive reporting is disabled.
3 Select a report from the Reports sub-menu. You are then taken to the report's Parameters tab in the Reports unit, where the test definition's ID is pre-populated as a value. Note: You can configure this destination-tab linking behavior using each report's Edit Report dialog box.
4 Edit the report's parameters as required.
5 Advance to the report's Data, Report, or Chart tab to complete configuration of the report.
Related Concepts Context-Sensitive Reports Related Procedures Accessing Context-Sensitive Reports Writing Advanced Queries with SQL Enabling Context-Sensitive Reports Creating New Reports
To enable a simple report to appear in context-sensitive report lists in the execution tree or on the Runs tab:
1 Complete the steps involved in creating a simple report (detailed in the procedure Creating New Reports, linked below), altering the procedure with the following steps.
2 Select Execution Definition from the Result category list box.
3 Select the selection criteria for the context-sensitive report: select Execution Definition Property from the Selection criteria list box, or select Execution Definition Run from the Selection criteria list box.
4 Select ID from the Property list box.
5 Enter a value in the Value field (for example, the ID number of an existing execution definition or an existing execution definition run).
6 Click Finish.
To enable an advanced report to appear in context-sensitive report lists in the execution tree or on the Runs tab:
1 Create a report that includes one of the following:
An execution-definition ID as an input parameter, for the report to appear in the execution tree.
An execution-definition-run ID as an input parameter, for the report to appear on the Runs tab.
To do this, complete the steps involved in creating an advanced query (detailed in the procedure Writing Advanced Queries with SQL, linked below), altering the procedure with the following step.
2 To make an advanced query available in the Executions context menu, insert the parameter name execProp_Id_0 as input for ExecDef_ID_pk_fk. For example, your report's SQL statement may have defined a hard-coded database-column value, such as ExecDef_ID_pk_fk = 68. To edit this report so that it receives column-name values dynamically, replace the static value of 68 with the following notation: ${execProp_Id_0 | 68}
Note: Consult the SilkCentral Test Manager database documentation for additional information about tables and column-name definitions.
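The ${execProp_Id_0 | 68} notation reads naturally as "use the context-supplied parameter when present, otherwise fall back to the default (68)". Test Manager performs this substitution internally; the following sketch (with a placeholder table name, since the real schema is in the database documentation) merely illustrates that apparent semantics:

```python
import re

# Illustration only of the apparent ${name | default} semantics; the actual
# substitution engine is internal to Test Manager. Table name is a placeholder.
PLACEHOLDER = re.compile(r"\$\{\s*(\w+)\s*\|\s*([^}]*?)\s*\}")

def expand(sql: str, params: dict) -> str:
    """Replace each ${name | default} with params[name], or default if absent."""
    return PLACEHOLDER.sub(lambda m: str(params.get(m.group(1), m.group(2))), sql)

sql = "SELECT * FROM some_table WHERE ExecDef_ID_pk_fk = ${execProp_Id_0 | 68}"
print(expand(sql, {}))                    # no context: falls back to 68
print(expand(sql, {"execProp_Id_0": 5}))  # context-sensitive call supplies 5
```

This is why a context-sensitive report still works when run directly from the Reports unit: the default value after the | stands in when no ID is supplied.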
Related Concepts Context-Sensitive Reports Related Procedures Accessing Context-Sensitive Reports Enabling Context-Sensitive Reports
1 Click Reports on the workflow bar. In the Reports directory tree, select the folder in which you want the new report to appear. This determines where the report will be stored in the directory tree.
2 Click New Child Report on the toolbar. On the Create New Report dialog box, enter the name of the new report. This is the name that will appear in the Reports tree. Check the Share this report with other users check box if you want to make this report available to other users.
3 In the Timeout [s] field, specify the maximum time period in seconds that Test Manager should wait for SQL queries to complete.
4 From the Default tab list box, select the tab that you want to be directed to when you select this report from one of the context-sensitive report lists.
5 Select the corresponding result type from the Result Category list box. This setting specifies the database table and view that is to be filtered for the report. The following result types are available:
Requirement - Returns requirements available in the Requirements module that meet the query criteria.
Test Definition - Returns test definitions available in the Test Plan module that meet the query criteria.
Test Definition Execution - Returns executed test definition results from the Executions module that meet the query criteria.
Execution Definition - Returns execution definitions from the Executions module.
Issue - Returns issues (including imported issues).
Requirement Progress Builds - Contains information on requirements progress per build so that you can see how requirements develop across builds.
Requirement Progress Days - The same as Requirement Progress Builds, but shows development on a daily basis.
Test Definition Progress Builds - Shows how test definitions develop across builds.
Test Definition Progress Days - Same as above, but shows development on a daily basis.
Each result type offers a set of selection criteria. Based on the Result Type you have selected, specify an appropriate Selection Criteria for your report. These criteria typically group properties based on a view or some other intuitive grouping (for example, custom properties).
6 From the Property list box, select the property to filter on. For some selection criteria, properties are dynamic.
7 Select an Operator for the query. The available operators depend on the property. Example operators are =, not, like, and not like. Strings are always compared in lowercase. The allowed wildcards for strings are * and ? (where * matches any sequence of characters and ? matches exactly one character).
8 Select or specify the Value that the query is to be filtered on. For date-based properties, the Value field is replaced with a calendar tool that you can use to select a specific date.
9 (Optional) To add an additional query string to this report, click More. An existing query string can be deleted by clicking the string's Delete button. When multiple query strings are defined, AND and OR radio buttons appear next to the More button. Use these option buttons to define whether the queries should be considered cumulatively (AND) or whether only one query string's criteria needs to be met (OR).
Click Next to configure report columns on the New Report dialog box.
To create columns:
1 Click Add Columns to display the Add Columns dialog box. All available report columns are listed. Select those that you want to include in the report and click OK (multiple columns can be selected by holding down the CTRL key). Note: For test-planning reports, the list of available column names is supplemented with the column names from the LQM_v_tests table. See the SilkCentral database documentation for full details. The selected columns appear in tabular format on the New Report dialog box, where you can configure how each report column is to be displayed.
2 For each column, specify a sort direction (ascending, descending, or unsorted) using the up/down arrows in the Sorting column. When a column is selected for sorting, a list box is displayed in the Sort Order column that allows you to more easily edit the column-sort order. Set these numbers as required.
3 Give each column an Alias. This is the name by which each column will be labeled in the generated report.
4 With grouping, you can take advantage of SQL aggregation features (for example, selecting a number of elements or querying a total sum of values). Check the Group by check box on the column selection dialog to specify that SQL group-by functions are to be applied. Columns that are not selected for SQL group-by functions are set to aggregation by default (meaning a single aggregate value will be calculated). From the Aggregation list box, select the appropriate aggregation type (Count, Sum, Average, Minimum, or Maximum).
5 The Actions column enables you to move column listings up and down in the view. The Move Up and Move Down functions do not affect the outcome of the report. Note: Any report column can be deleted by clicking the column's Delete button.
Related Concepts New Reports Report Generation Related Procedures Analyzing Test Results - Quick Start Task Creating Reports Customizing Reports with BIRT Generating Reports Using Context-Sensitive Reports Managing Reports Related Reference Reports Unit Interface
1 Click Reports on the workflow bar. In the Reports directory tree, select the folder in which you want the new report to appear (Requirements, Test Plan, Issues, and so on). This determines where the report will be stored in the directory tree.
2 Click Create New Report on the toolbar.
3 On the Create New Report dialog box, enter the name of the new report. This is the name that will appear in the Reports tree.
4 Check the Share this report with other users check box if you want to make this report available to other users.
5 Enter a description of the report in the Description field.
6 Click Advanced to open the Report data query field.
7 Insert previously written code as necessary, or write new code directly in the field. To assist you in writing SQL queries, a list box of Test Manager function placeholders is available; see the following section for details regarding the available placeholders. To insert one of the available pre-defined functions, select the corresponding placeholder from the Insert placeholder list box. Note: If you manually edit SQL code for the query, click Check SQL to confirm your work.
8 Once you have completed editing the report's properties, click Finish to save your settings.
Related Concepts New Reports Report Generation Related Procedures Analyzing Test Results - Quick Start Task Creating Reports Customizing Reports with BIRT Generating Reports Using Context-Sensitive Reports Managing Reports Related Reference Reports Unit Interface
To enable a simple report to appear in the Requirements context-sensitive report list:
1 Complete the steps involved in creating a simple report (detailed in the procedure Creating New Reports, linked below), altering the procedure with the following steps.
2 Select Requirement from the Selection criteria list box.
3 Enter a value in the Value field (for example, the ID number of an existing requirement).
To enable an advanced report to appear in the Requirements context-sensitive report list:
1 Create a report that includes a requirement ID as an input parameter. To do this, complete the steps involved in creating an advanced query (detailed in the procedure Writing Advanced Queries with SQL, linked below), altering the procedure with the following step.
2 To make an advanced query available in the Requirements context menu, insert the parameter name reqProp_Id_0 as input for Req_ID_pk_fk. For example, your report's SQL statement may have defined a hard-coded database-column value, such as Req_ID_pk_fk = 68. To edit this report so that it receives column-name values dynamically, replace the static value of 68 with the following notation: ${reqProp_Id_0 | 68}
Note: Consult the SilkCentral Test Manager database documentation for additional information about tables and column-name definitions.
Related Concepts Context-Sensitive Reports Related Procedures Accessing Context-Sensitive Reports Enabling Context-Sensitive Reports Writing Advanced Queries with SQL Creating New Reports
1 Click Reports on the workflow bar. In the Reports directory tree, select the folder in which you want the new report to appear. This determines where the report will be stored in the directory tree.
2 Click New Child Report on the toolbar. On the Create New Report dialog box, enter the name of the new report. This is the name that will appear in the Reports tree. Check the Share this report with other users check box if you want to make this report available to other users.
3 In the Timeout [s] field, specify the maximum time period in seconds that Test Manager should wait for SQL queries to complete.
4 From the Default tab list box, select the tab that you want to be directed to when you select this report from one of the context-sensitive report lists.
5 Select the corresponding result type from the Result Category list box. This setting specifies the database table and view that is to be filtered for the report. The following result types are available:
Requirement - Returns requirements available in the Requirements module that meet the query criteria.
Test Definition - Returns test definitions available in the Test Plan module that meet the query criteria.
Test Definition Execution - Returns executed test definition results from the Executions module that meet the query criteria.
Execution Definition - Returns execution definitions from the Executions module.
Issue - Returns issues (including imported issues).
Requirement Progress Builds - Contains information on requirements progress per build so that you can see how requirements develop across builds.
Requirement Progress Days - The same as Requirement Progress Builds, but shows development on a daily basis.
Test Definition Progress Builds - Shows how test definitions develop across builds.
Test Definition Progress Days - Same as above, but shows development on a daily basis.
Each result type offers a set of selection criteria. Based on the Result Type you have selected, specify an appropriate Selection Criteria for your report. These criteria typically group properties based on a view or some other intuitive grouping (for example, custom properties).
6 From the Property list box, select the property to filter on. For some selection criteria, properties are dynamic.
7 Select an Operator for the query. The available operators depend on the property. Example operators are =, not, like, and not like. Strings are always compared in lowercase. The allowed wildcards for strings are * and ? (where * matches any sequence of characters and ? matches exactly one character).
8 Select or specify the Value that the query is to be filtered on. For date-based properties, the Value field is replaced with a calendar tool that you can use to select a specific date.
9 (Optional) To add an additional query string to this report, click More. An existing query string can be deleted by clicking the string's Delete button. When multiple query strings are defined, AND and OR radio buttons appear next to the More button. Use these option buttons to define whether the queries should be considered cumulatively (AND) or whether only one query string's criteria needs to be met (OR).
Click Next to configure report columns on the New Report dialog box.
To create columns:
1 Click Add Columns to display the Add Columns dialog box. All available report columns are listed. Select those that you want to include in the report and click OK (multiple columns can be selected by holding down the CTRL key). Note: For test-planning reports, the list of available column names is supplemented with the column names from the LQM_v_tests table. See the SilkCentral database documentation for full details. The selected columns appear in tabular format on the New Report dialog box, where you can configure how each report column is to be displayed.
2 For each column, specify a sort direction (ascending, descending, or unsorted) using the up/down arrows in the Sorting column. When a column is selected for sorting, a list box is displayed in the Sort Order column that allows you to more easily edit the column-sort order. Set these numbers as required.
3 Give each column an Alias. This is the name by which each column will be labeled in the generated report.
4 With grouping, you can take advantage of SQL aggregation features (for example, selecting a number of elements or querying a total sum of values). Check the Group by check box on the column selection dialog to specify that SQL group-by functions are to be applied. Columns that are not selected for SQL group-by functions are set to aggregation by default (meaning a single aggregate value will be calculated). From the Aggregation list box, select the appropriate aggregation type (Count, Sum, Average, Minimum, or Maximum).
5 The Actions column enables you to move column listings up and down in the view. The Move Up and Move Down functions do not affect the outcome of the report. Note: Any report column can be deleted by clicking the column's Delete button.
Related Concepts New Reports Report Generation Related Procedures Analyzing Test Results - Quick Start Task Creating Reports Customizing Reports with BIRT Generating Reports Using Context-Sensitive Reports Managing Reports Related Reference Reports Unit Interface
1 Click Reports on the workflow bar. In the Reports directory tree, select the folder in which you want the new report to appear (Requirements, Test Plan, Issues, and so on). This determines where the report will be stored in the directory tree.
2 Click Create New Report on the toolbar.
3 On the Create New Report dialog box, enter the name of the new report. This is the name that will appear in the Reports tree.
4 Check the Share this report with other users check box if you want to make this report available to other users.
5 Enter a description of the report in the Description field.
6 Click Advanced to open the Report data query field.
7 Insert previously written code as necessary, or write new code directly in the field. To assist you in writing SQL queries, a list box of Test Manager function placeholders is available; see the following section for details regarding the available placeholders. To insert one of the available pre-defined functions, select the corresponding placeholder from the Insert placeholder list box. Note: If you manually edit SQL code for the query, click Check SQL to confirm your work.
8 Once you have completed editing the report's properties, click Finish to save your settings.
Related Concepts New Reports Report Generation Related Procedures Analyzing Test Results - Quick Start Task Creating Reports Customizing Reports with BIRT Generating Reports Using Context-Sensitive Reports Managing Reports Related Reference Reports Unit Interface
To have a simple report appear in the Test Plan context-sensitive report list:
1 Complete the steps involved in creating a simple report (detailed in the procedure Creating New Reports, linked below), altering the procedure with the following steps.
2 Select Test Definition from the Selection criteria list box.
3 Enter a value in the Value field (for example, the ID number of an existing test definition).
To make an advanced report appear in the Test Plan context-sensitive report list:
1 Create a report that includes a test definition ID as an input parameter. To do this, complete the steps involved in creating an advanced query (detailed in the procedure Writing Advanced Queries with SQL, linked below), altering the procedure with the following step.
2 To make an advanced query available in the Test Plan context menu, insert the parameter name tdProp_Id_0 as input for TestDef_ID_pk_fk. For example, your report's SQL statement may have defined a hard-coded database-column value, such as TestDef_ID_pk_fk = 68. To edit this report so that it receives column-name values dynamically, replace the static value of 68 with the following notation: ${tdProp_Id_0 | 68}
Note: Consult the SilkCentral Test Manager database documentation for additional information about tables and column-name definitions.
Related Concepts Context-Sensitive Reports Related Procedures Accessing Context-Sensitive Reports Enabling Context-Sensitive Reports Writing Advanced Queries with SQL Creating New Reports
1 Click Reports on the workflow bar. In the Reports directory tree, select the folder in which you want the new report to appear. This determines where the report will be stored in the directory tree.
2 Click New Child Report on the toolbar. On the Create New Report dialog box, enter the name of the new report. This is the name that will appear in the Reports tree. Check the Share this report with other users check box if you want to make this report available to other users.
3 In the Timeout [s] field, specify the maximum time period in seconds that Test Manager should wait for SQL queries to complete.
4 From the Default tab list box, select the tab that you want to be directed to when you select this report from one of the context-sensitive report lists.
5 Select the corresponding result type from the Result Category list box. This setting specifies the database table and view that is to be filtered for the report. The following result types are available:
Requirement - Returns requirements available in the Requirements module that meet the query criteria.
Test Definition - Returns test definitions available in the Test Plan module that meet the query criteria.
Test Definition Execution - Returns executed test definition results from the Executions module that meet the query criteria.
Execution Definition - Returns execution definitions from the Executions module.
Issue - Returns issues (including imported issues).
Requirement Progress Builds - Contains information on requirements progress per build so that you can see how requirements develop across builds.
Requirement Progress Days - The same as Requirement Progress Builds, but shows development on a daily basis.
Test Definition Progress Builds - Shows how test definitions develop across builds.
Test Definition Progress Days - Same as above, but shows development on a daily basis.
Each result type offers a set of selection criteria. Based on the Result Type you have selected, specify an appropriate Selection Criteria for your report. These criteria typically group properties based on a view or some other intuitive grouping (for example, custom properties).
6 From the Property list box, select the property to filter on. For some selection criteria, properties are dynamic.
7 Select an Operator for the query. The available operators depend on the property. Example operators are =, not, like, and not like. Strings are always compared in lowercase. The allowed wildcards for strings are * and ? (where * matches any sequence of characters and ? matches exactly one character).
8 Select or specify the Value that the query is to be filtered on. For date-based properties, the Value field is replaced with a calendar tool that you can use to select a specific date.
9 (Optional) To add an additional query string to this report, click More. An existing query string can be deleted by clicking the string's Delete button. When multiple query strings are defined, AND and OR radio buttons appear next to the More button. Use these option buttons to define whether the queries should be considered cumulatively (AND) or whether only one query string's criteria needs to be met (OR).
Click Next to configure report columns on the New Report dialog box.
To create columns:
1
Click Add Columns to display the Add Columns dialog box. All available report columns are listed. Select those that you want to have included in the report and click OK (multiple columns can be selected by holding down the CTRL key). Note: For test-planning reports, the list of available column names is enhanced with the column names from the LQM_v_tests table. See SilkCentral database documentation for full details.
The selected columns appear in tabular format on the New Report dialog box. From here you can configure how each report column is to be displayed. For each column, specify a sort direction (ascending, descending, or unsorted) using the up/down arrows in the Sorting column. When a column is selected for sorting, a list box is displayed in the Sort Order column that allows you to more easily edit the column-sort order. Set these numbers as required. Give each column an Alias. This is the name by which each column will be labeled in the generated report. With grouping, you can take advantage of SQL aggregation features (for example, selecting a number of elements or querying a total sum of values). Check the Group by check box on the column selection dialog to specify that SQL group by functions are to be applied. Columns that are not selected for SQL group by functions are set to aggregation by default (meaning, a single aggregate value will be calculated). From the Aggregation list box, select the appropriate aggregation type (Count, Sum, Average, Minimum, or Maximum). The Actions column enables you to move column listings up and down in the view. The Move Up and Move Down functions do not affect the outcome of the report. Note: Any report column can be deleted by clicking the columns Delete button.
Related Concepts New Reports Report Generation Related Procedures Analyzing Test Results - Quick Start Task Creating Reports Customizing Reports with BIRT Generating Reports Using Context-Sensitive Reports Managing Reports Related Reference Reports Unit Interface
Click Reports on the workflow bar. In the Reports directory tree, select the folder in which you want the new report to appear (Requirements, Test Plan, Issues, etc.). This determines where the report will be stored in the directory tree. Click Create New Report on the toolbar. On the Create New Report dialog box, enter the name of the new report. This is the name that will appear in the Reports tree. Check the Share this report with other users check box if you want to make this report available to other users. Enter a description of the report in the Description field. Click Advanced to open the Report data query field. Insert previously written code as necessary, or write new code directly in the field. To assist you in writing SQL queries, a list box of Test Manager function placeholders is available. See the following section for details regarding available placeholders. To insert one of the available pre-defined functions, select the corresponding placeholder from the Insert placeholder list box. Note: If you manually edit SQL code for the query, click Check SQL to confirm your work.
Once you have completed editing the report's properties, click Finish to save your settings.
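The Insert placeholder mechanism substitutes pre-defined values into the SQL query before it runs. A rough sketch of the idea follows; the placeholder names and syntax here are invented for illustration, so consult the product documentation for the actual placeholders:

```python
# Hypothetical placeholders; the real Test Manager placeholder names may differ.
placeholders = {"$PROJECT_ID": "42", "$USER_ID": "7"}

def expand(sql: str) -> str:
    """Replace each known placeholder token with its value."""
    for token, value in placeholders.items():
        sql = sql.replace(token, value)
    return sql

query = "SELECT * FROM reports WHERE project = $PROJECT_ID"
print(expand(query))  # SELECT * FROM reports WHERE project = 42
```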
Related Concepts New Reports Report Generation Related Procedures Analyzing Test Results - Quick Start Task Creating Reports Customizing Reports with BIRT Generating Reports Using Context-Sensitive Reports Managing Reports Related Reference Reports Unit Interface
Click Reports on the workflow bar. In the Reports tree, select the report from which you want to delete a template. Click the Report tab. Click the report's Delete icon. Select Yes on the subsequent confirmation dialog box.
Related Concepts New Reports Report Generation Related Procedures Analyzing Test Results - Quick Start Task Creating Reports Customizing Reports with BIRT Generating Reports Managing Reports Related Reference Report Properties tab
Saving Reports
How you save a report locally depends on whether you have selected a BIRT report template or an Excel template. If you have selected an Excel template, simply click the Download link on the Report tab. This will invoke Microsoft Excel on your local computer and the report will be loaded automatically. If you have selected a BIRT report template, use the following procedure to save the report.
Click Reports on the workflow bar. In the Reports tree, select the report that you want to save. Click the Report tab. Click PDF on the Report view toolbar. On the File Download dialog box, click Save to save the PDF document to a location of your choice.
Related Concepts Report Generation Related Procedures Analyzing Test Results - Quick Start Task Customizing Reports with BIRT Generating Reports Managing Reports Related Reference Report tab
Click Reports on the workflow bar. Select the report with which you want to associate the template. Select the Report tab. Click the Click here to upload a new report template link to open the Upload Template dialog box. Give the template a meaningful Name and Description. In the Projects field, select the project to which you would like to make the template available; or, select All Projects to have the template associated with all projects. Click Browse. Then browse to and select the template on your local system. Click OK to upload the template.
Related Concepts New Reports Report Generation Related Procedures Analyzing Test Results - Quick Start Task Downloading Report Templates Creating Reports Customizing Reports with BIRT Generating Reports Managing Reports Related Reference Report tab
Click Reports on the workflow bar. In the Reports tree, select the report that you want to view. Click the Report tab. Click PDF on the report view toolbar. The report then displays in PDF format.
Related Concepts Report Generation Related Procedures Analyzing Test Results - Quick Start Task Customizing Reports with BIRT Generating Reports Managing Reports Related Reference Report tab
Viewing Reports
Because each template expects a certain data format to produce a useful graph, not all templates can be applied to all report queries. You will receive an error message if you attempt to generate a report through an incompatible report template. For example, selecting the Four Values Per Row As Horizontal Bar template to display the Requirements Status Overview report works because this particular Microsoft Excel template requires exactly the four values (failed, passed, not executed, and not covered) that the report query delivers.
To generate a report
Click Reports on the workflow bar. In the Reports tree, select the report you want to generate. Select the Report tab. Click the Select Report Template icon. From the Select Report Template dialog box, select the template you wish to use. Click OK to display the report. (optional) If necessary, select an alternate view magnification for the report from the list box. 100% is the default magnification. Other options are 50%, 75%, 150%, and 200%.
Related Concepts Report Generation Related Procedures Analyzing Test Results - Quick Start Task Context-Sensitive Reports Generating Reports Managing Reports Related Reference Report tab
Adding Subreports
To aggregate the results from multiple reports into the currently selected report, you can add subreports. When adding a report as a subreport, the result columns and rows of the subreport are concatenated to the results of the selected report.
Click Reports on the workflow bar. Select a report in the Reports tree. On the Properties tab, click Add Subreport. The Add Subreport dialog box displays. Select the subreport you want to have appended to the current report by selecting it from the Reports tree-list. Click OK to complete the addition of the subreport. Subreports appear on the associated report's Properties tab in a section called Subreports.
Related Concepts Report Generation Related Procedures Creating New Reports Customizing BIRT Report Templates Creating Reports Generating Reports Managing Reports Related Reference Report tab
Deleting Subreports
To delete a subreport
Click Reports on the workflow bar. Select the report in the Reports tree that has the associated subreport that you want to delete. On the Properties tab, click the Delete icon (in the Action column of the Subreports table) of the subreport you want to delete. Click Yes on the confirmation dialog box to confirm the deletion.
Related Concepts Report Generation Related Procedures Creating New Reports Customizing BIRT Report Templates Creating Reports Generating Reports Managing Reports Related Reference Report tab
Displaying Charts
To display a chart
Click Reports on the workflow bar. Select a report in the Reports tree for which you want to view a chart. Select the Chart tab to display the default chart. To select a required chart type, click the Select Chart Type icon. On the Select Chart Type dialog box, select a chart type. Select the view properties that you want to apply to the chart (3D view, Show horizontal grid lines, Show vertical grid lines, and Show legend). Specify how these chart options are to be saved:
Select For current user only to have these chart settings override the report's standard settings whenever the current user views this chart.
Select As report standard to have these chart settings presented to all users who don't have overriding user settings defined. This setting does not affect individual user settings.
Click OK to display the new chart type. Note: The chart configurations you define here become the defaults for this report. When standard charts and graphs are not able to deliver the specific data that you require, or when they cannot display data in a required format, you can customize the appearance of queried data using the Test Manager reporting functionality. To open the current chart in a separate browser window, click the Open in new window icon at the top of the Chart tab.
Related Concepts Report Generation Related Procedures Customizing BIRT Report Templates Displaying Charts Creating Reports Generating Reports Managing Reports Related Reference Report Chart tab
Click Reports on the workflow bar. Expand the Last Used Reports list box on the Reports toolbar. Select the report that you want to view.
Related Concepts Report Generation Related Procedures Managing Reports Related Reference Report tab Reports Toolbar Functions
Click Reports on the workflow bar. Select a report in the Reports tree. Click the Parameters tab. If the report has parameters defined for it, the parameters will be listed there. Click Edit Parameters. The Edit Parameters dialog box displays. Edit the Label or Value of the listed parameters as required. From the Usage field, select the usage type of the parameter (constant value, start time, end time). Click OK to save your changes.
Related Concepts Report Generation Related Procedures Creating New Reports Creating Reports Generating Reports Managing Reports Related Reference Report Parameters tab
Click Reports on the workflow bar. Select the report in the Reports tree. On the Properties tab, click Edit. The Edit Report dialog box displays. Modify the Name and Description of the report as required. Ensure that the Share this report with other users check box is checked if you intend to have this report shared with other users. From the Default tab list box, select the tab that you want to be directed to when you select this report from one of the context-sensitive report lists. Specify one of the following options to indicate how the report can be edited:
Simple report: You can modify the Selection criteria, thus changing the results of the selected report, or you can click Advanced Query to modify the SQL query code.
Advanced report: If you are familiar with SQL, you can edit the query code in the Report data query field. To assist you in editing SQL queries, a list box of function placeholders (for example, variables) is available. To insert one of the available pre-defined functions, select the corresponding placeholder from the Insert placeholder list box. Note: If you manually edit the SQL code for the query, click Check SQL when you are finished to confirm your work.
Related Concepts Report Generation Related Procedures Creating New Reports Customizing BIRT Report Templates Creating Reports Generating Reports Managing Reports Related Reference Report Properties tab
Printing Charts
To print the current chart
Click Reports on the workflow bar. Select a report in the Reports tree. Click the Chart tab. Click Print at the top of the Chart tab. The chart data then displays in a new window in printable format. Your system's print dialog box is also displayed. Configure print settings as necessary and click OK to print the chart.
Related Concepts Report Generation Related Procedures Displaying Charts Creating Reports Generating Reports Managing Reports Related Reference Report Chart tab
Removing Charts
Removing a chart only removes the currently selected chart template from the selected report; it does not remove the chart template entirely.
Click Reports on the workflow bar. Select a report in the Reports tree. Click the Chart tab. Click the chart's Remove chart type button. On the Remove Chart dialog box, do the following:
Select Remove user settings (and revert to report standard) to have the current user's chart settings deleted along with the chart. The chart will subsequently be displayed according to the report's standard settings. If no standard settings have been defined, the chart cannot be displayed. Note that this option is only available when the current user has defined specific chart settings.
Select Remove standard chart settings of report to have any standard settings deleted along with the chart. User-specific settings are not affected by this option. Note that this option is only available when standard chart settings have been defined for a report.
Click OK to delete the chart template. If required, you can click the <Click here to choose a chart type> link to assign a new chart template to the selected report.
Related Concepts Report Generation Related Procedures Displaying Charts Creating Reports Generating Reports Managing Reports Related Reference Report Chart tab
Applying Filters
After you have created and stored a custom filter, you can apply that filter to the selected tree. Custom filters can be applied for requirements, test definitions, and execution definitions. Only elements that meet applied filter criteria are displayed in the tree. Note: Filtered requirements are returned in read-only form and cannot be edited. The Edit Properties button is disabled for filtered requirements.
Click the appropriate button (Execution, Requirements, or Test Plan) on the workflow bar. Select the desired filter from the Filter list box on the toolbar. All elements that meet the filter's criteria will then be displayed. Note: To remove filtering and display all elements, select <No Filter> from the Filter list box on the toolbar.
Click the appropriate button (Execution, Requirements, or Test Plan) on the workflow bar. Create a new custom filter. After you have defined your first filtering rule, click Advanced to open the Edit Filter dialog box.
Enter a name for the filter in the Name field. Give the filter a meaningful Description.
Click More to display a second set of filter-parameter fields with which you can define a second set of filter parameters. Select a logical operator for the application of the filtering queries. For example, filtered elements must meet both sets of criteria (and), or filtered elements must meet at least one of the criteria sets (or). To delete a filter-parameter string, click the corresponding Delete button. To display additional filter-parameter fields and create additional filter queries, click More. To remove excess filter-parameter sets, click Fewer.
Related Concepts Filtering Related Procedures Creating Filters Creating Global Filters Working with Filters
Creating Filters
To create a new custom filter:
Click the appropriate button (Execution, Requirements, or Test Plan) on the workflow bar. Click New Filter on the toolbar to display the New Filter dialog box. From the Property list box, select the property on which you wish to base the new filter (for example, Name, Description, Priority, Version, and Build). From the Operator list box, select a logical operator to be applied to the specified property (for example, =, not, >, >=, <, <=, contains, and does not contain). Note: The contents of the Operator and Value list boxes vary based on the attribute selected in the Property field.
In the Value field, enter the value that the specified property is to be compared against. Note: For date-based properties, the Value field is replaced with a calendar tool that you can use to select a specific date.
Click Save and apply to open the Edit Filter dialog box. To apply the filter to the current view without saving the filter settings, click Apply. On the Edit Filter dialog box, enter a name for the filter in the Name field. Enter a meaningful description for the filter in the Filter field. Click OK to save the filter with your project.
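Conceptually, each filter row (Property, Operator, Value) evaluates to a predicate against an element's properties. A simplified sketch of how such predicates might be evaluated — illustrative only, not how Test Manager implements filtering:

```python
# Map the operator names listed above to predicate functions (illustrative).
OPERATORS = {
    "=": lambda prop, val: prop == val,
    "not": lambda prop, val: prop != val,
    ">": lambda prop, val: prop > val,
    ">=": lambda prop, val: prop >= val,
    "<": lambda prop, val: prop < val,
    "<=": lambda prop, val: prop <= val,
    "contains": lambda prop, val: val in prop,
    "does not contain": lambda prop, val: val not in prop,
}

def matches(item: dict, prop: str, op: str, value) -> bool:
    """Evaluate one filter row against one element."""
    return OPERATORS[op](item[prop], value)

test = {"Name": "Login smoke test", "Priority": 3}
print(matches(test, "Name", "contains", "smoke"))  # True
print(matches(test, "Priority", ">=", 2))          # True
```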
Related Concepts Filtering Related Procedures Creating Advanced Filters Creating Global Filters Working with Filters
Deleting Filters
To delete an existing custom filter
Click the appropriate button (Execution, Requirements, or Test Plan) on the workflow bar. Select the filter from the list box on the toolbar. Click Delete Filter.
Related Concepts Filtering Related Procedures Working with Filters Deleting Global Filters
Editing Filters
Existing filters are edited using the Edit Filter dialog box. The Edit Filter dialog box can be accessed both directly from the toolbar (by clicking Edit Filter) and by clicking the Settings link on the menu tree.
Click the name of the filter you want to edit. Edit the filter by changing the selections defined for the filter.
Note: To remove custom filtering, click the appropriate button (Execution, Requirements, or Test Plan) on the workflow bar and select <No Filter> from the Filter list box on the toolbar. Related Concepts Filtering Related Procedures Working with Filters Editing Global Filters
Click Execution on the workflow bar. Select an execution definition from the navigation tree. Select the Deployment tab. Click Edit in the Code Analysis Settings section of the tab. The Edit Code Analysis Settings dialog box displays. Check the Enable code analysis check box. In the Hostnames text box, enter a comma-separated list of hostnames (with port, if default port 19129 is not used) where code analysis information is to be gathered (for example, labmachine1, labmachine2:8000, 198.68.0.1). For each execution definition, you need to define the host names of the machine resources where the AUT is running. For example, with a client/server system, you must not only gather code coverage information on the client (which probably runs directly on an execution server), but also from the server (which likely runs on a different machine). This applies to all multi-tiered applications. Note: For JUnit code analysis runs, you do not need to specify a hostname.
Click OK to save your settings. Note: Once code analysis has been defined for an execution definition, each future run of that execution definition will gather code coverage information from the defined hostnames. While monitoring an execution from Test Manager's Activities page, you will see that after gathering the sources for test definitions, Test Manager gathers full code coverage information before beginning test runs. The Code Coverage Controller, which is integrated into each Test Manager execution server, controls all defined hosts during execution runs. For each test definition of an execution definition, the controller starts and stops all associated instances, collects XML-based code coverage files for the test definition, and merges the results into a single file. The test definition then saves the merged code coverage file to its execution results.
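The Hostnames value is a comma-separated list in which each entry may carry an optional port, with 19129 as the default per the text above. A small sketch of parsing such a list:

```python
DEFAULT_PORT = 19129  # default code-analysis port mentioned above

def parse_hostnames(value: str) -> list:
    """Split 'host[:port]' entries, applying the default port where omitted."""
    result = []
    for entry in value.split(","):
        entry = entry.strip()
        if not entry:
            continue
        host, sep, port = entry.partition(":")
        result.append((host, int(port) if sep else DEFAULT_PORT))
    return result

print(parse_hostnames("labmachine1, labmachine2:8000, 198.68.0.1"))
# [('labmachine1', 19129), ('labmachine2', 8000), ('198.68.0.1', 19129)]
```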
Related Concepts Test Manager Code Analysis Code Coverage Analysis Related Procedures Analyzing Test Results - Quick Start Task Analyzing Code Coverage Related Reference Execution Deployment tab Code Analysis Unit Interface
Click Projects on the workflow bar. Select the project for which you want to analyze code-coverage data. Click Code Analysis on the workflow bar. Click Create Code Change Impact Report on the main toolbar. The Select Classes for Report dialog box displays. Select a Product and Version if you want to change the pre-selected values. In the Filter field, enter criteria to filter the packages. For example, entering the string published will only list packages that contain the string published in their names. Select a package from the Packages pick list. You can select multiple packages by holding down the CTRL key while clicking listed packages. The classes that are available in the selected package appear in the Classes pick list. Select a class file that you want to have included as a source in your report. You can select multiple classes by holding down the CTRL key while clicking listed classes. Click Add to add the class file(s) to the Selected classes pick list. You can remove classes in the Selected classes pick list by selecting entries and clicking Remove. Click Remove All to remove all selected classes from the Selected classes pick list.
Repeat the preceding steps (from selecting a package from the Packages pick list through clicking Add) until you have added all required classes to the Selected classes list. Select a report from the Select report list box.
Related Concepts Code-Change Impact Reports Report Generation Test Manager Code Analysis Code Coverage Analysis Related Procedures Analyzing Test Results - Quick Start Task Enabling Code Analysis for Execution Definitions Analyzing Code Coverage Related Reference Code Analysis Unit Interface
Click Projects on the workflow bar. Select the project for which you want to view code-coverage information. Click Code Analysis to go to the Code Analysis unit. Expand the project node in the navigation tree to display the products that are available for the selected project. Expand a product node to display the versions that are available for that product. Expand a version node to display the builds that are available for that version. Select a specific build. Code coverage information for that build then displays on the Details tab. Note: To view code-analysis information for all products, including the products that have been created for the selected product, click Show all products on the main toolbar. Products of other projects are then listed under the Other Projects node.
Related Concepts Latest Builds and Build Versions Test Manager Code Analysis Code Coverage Analysis Related Procedures Analyzing Test Results - Quick Start Task Enabling Code Analysis for Execution Definitions Analyzing Code Coverage Related Reference Code Analysis Details tab
On the Edit Code Analysis Settings dialog box, proceed with enabling code analysis for the execution definition. Note: After code analysis is enabled, you can execute your test definitions in the Manual Testing Client. However, you need to click Code Analysis: Start on the Execute Test dialog box before you actually start testing. This way Test Manager will collect code analysis information while you execute the manual test. When you are done testing, click Stop to halt the collection of code analysis information.
Related Concepts Test Manager Code Analysis Manual Test Definitions Code Coverage Analysis Related Procedures Analyzing Test Results - Quick Start Task Enabling Code Analysis for Execution Definitions Executing Manual Tests Analyzing Code Coverage Related Reference Execution Deployment tab Code Analysis Unit Interface
Reference
This section contains all of the reference topics provided with SilkCentral Test Manager. In This Section User Interface Reference This section contains information about Test Manager's user interface elements. General Reference This section contains general reference topics provided with SilkCentral Test Manager. APIs Refer to the Test Manager API Help for full details regarding Test Manager's APIs. Database Schemas Refer to the Test Manager Database Model for full details regarding Test Manager's database schema.
Projects tab
The Projects tab lists all the projects associated with your Test Manager installation and vital statistics for each project, including Description and Created By/Created On details. The Projects tab enables you to select the projects that you want to use. Note: Only your system administrator has rights to set up new projects.
Project: Name of the project.
Description: Project description.
Created On: Date the project was created.
Created By: User who created the project.
Overview tab
The Overview tab displays the Project Overview Report, which offers a high-level overview of the status of the selected project. The Project Overview Report shows the following information:
General Report Information: General information such as the name of the current project, the report description, and the planned release date.
Requirements: Test coverage status for the Requirements unit. Shows the percentage of test coverage in tabular and graph format for the following:
Total: All requirements.
High priority: Requirements with high priority.
Test Plan: Test-type distribution and test-execution results in chart and tabular form for the Test Plan unit.
Issues: The following numbers are shown in tabular and graph format for all issues that are linked to test definitions in the test plan tree:
Find: Number of found issues. The number of found issues in a period is the number of all fixed, all opened, and all deferred issues.
Fixed: Number of issues that are fixed. An issue is counted as fixed in a period each time its status changes from "open" or "deferred" to "fixed", "verified", or "closed". Issues that are defined as "no longer an issue" are not counted as fixed. If an issue is fixed, opened again, and fixed again in the same period, it is counted twice as fixed.
Deferred: Number of issues that are defined not to be fixed in the current release, but in a future release.
Open Backlog: All issues that are currently open. If an issue is opened, fixed, and opened again in the same period, it is counted twice as opened.
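The counting rules above can be made concrete with a small sketch: an issue counts as fixed each time its status moves from open or deferred to fixed, verified, or closed, and "no longer an issue" transitions are excluded. This is illustrative only; the status names follow the text above:

```python
def count_fixed(transitions) -> int:
    """Count 'fixed' events in a period per the rules described above.

    transitions: iterable of (old_status, new_status) pairs for one period.
    An issue re-opened and fixed again in the same period counts twice.
    """
    return sum(
        1
        for old, new in transitions
        if old in ("open", "deferred") and new in ("fixed", "verified", "closed")
    )

period = [
    ("open", "fixed"),               # counted
    ("fixed", "open"),               # re-opened, not a fix
    ("open", "fixed"),               # counted again: fixed twice in one period
    ("open", "no longer an issue"),  # explicitly not counted as fixed
]
print(count_fixed(period))  # 2
```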
Related Concepts Project Overview Report Code Coverage Analysis Report Generation Related Procedures Analyzing Code Coverage Managing Reports
Activities Page
The Activities page offers a centralized location from which you can manage upcoming, current, and recently executed test runs on a per-project basis. The grid views on the Activities page offer view settings (resizing and reordering of columns), filtering, sorting, and grouping options that are configurable on a per-user basis. You can display/hide columns, adjust the width of columns, and move columns around using drag-and-drop. The Activities page is split into three sections: Next Executions, Current Executions, and Last Executions. The grid views can be resized by dragging and dropping the separators between the views. Context-sensitive menu commands are available for each test run. These commands enable you to link directly to listed execution definitions, continue manual tests, manage test-run results, and more. The Activities page makes it easier to identify match points between execution definitions and to find specific execution-definition information. Standard Windows keyboard shortcuts can be used to select test run entries, making it easy to select and manipulate specific sets of execution definitions and test results. Sorting, grouping, and filtering functions are available through context-menu commands to help you better organize and group test runs. All of your view-customization preferences are saved along with your project and will be available to you each time you visit the Activities page. Note: Data on the Activities page is not automatically refreshed. Click Reload near the paging buttons at the bottom of each view to refresh the entire page's contents. Note: You can use your keyboard's CTRL and SHIFT keys to select multiple queued executions and abort them all with one click.
Next Executions
To enhance performance when you have numerous execution definitions, only the upcoming 50 execution definitions that are scheduled to run are displayed in the Next Executions view (future execution definitions can, however, be accessed using the available filtering features). To edit an execution definition listed in the Next Executions section, right-click the execution definition and choose Go to Execution Definition, or click the name of the execution definition; this takes you to the Executions unit where you can view and edit the details of the execution definition. By default, all execution definitions are sorted by Start Time. Columns in the Next Executions view cannot be sorted or grouped. The Next Executions view can be collapsed/expanded by clicking the double-arrow button on the right-hand side of the view's title bar.
ID: ID of the scheduled execution definition. This column is hidden by default.
Execution Definition/Folder: Name of the scheduled execution definition or folder.
Keywords: Keywords that are assigned to the scheduled execution definition.
Manual Testers (manual tests only): The user names of the people who are assigned to perform the manual test. This field is blank when no manual testers are assigned to the test.
Priority: The priority that has been assigned to the execution definition.
Start Time: Scheduled start time of the test run.

Current Executions

The Current Executions view lists the execution definitions that are currently running (both automated and manual test runs).
To abort an execution definition that is currently in progress, click Abort in the Actions column of the execution definition. Right-click an execution definition and choose Go to Execution Definition, or click the name of the execution definition, to view or edit the execution definition. Right-click an automated execution definition and choose View Details (or click the execution definition's Run ID/Task ID link) to view the execution's progress. As long as a manual test remains open, the corresponding execution definition remains in the list of Current Executions with a status of Pending. Right-click a manual test execution definition and choose Continue Manual Test (or click the execution definition's Continue Manual Test button in the Actions column) to continue a manual test in Manual Test Execution view. Right-click a manual execution definition and choose View Details (or click the execution definition's Run ID/Task ID link) to go to the Results for Execution Definition page for that execution definition. From there, you can click the name of a manual test definition in the Assigned Test Definitions portion of the dialog box to open the Results dialog box for the manual test definition; detailed results of the manual test definition are displayed there. Back on the Results for Execution Definition page, click Manual Test Results to go to Manual Test Execution view, where read-only information about the status of the assigned manual test definition is available. Page views of current executions are broken into views of 20 execution definitions each. You can advance through pages using First, Last, Next, and Previous located in the lower part of the Current Executions view. Or you can enter a page number into the Page text box and press the ENTER key.

ID: ID of the current execution definition. This column is hidden by default.
Execution Definition: Name of the current execution definition.
Run ID/Task ID: Manual-test executions receive a run ID when they are executed. Upon test completion, this run ID carries over to Last Executions view. Automated-test executions receive a task ID when they are executed. Task IDs are, however, not carried over to Last Executions view. Completed automated tests receive a run ID in Last Executions view.
Status: Status of the active execution definition or manual test. For automated tests, status is indicated with a text-based value (Pending or Active). For manual tests, status is indicated with a colored histogram. Automated-test statuses are described textually and can be filtered. Manual tests can be filtered by checking relevant properties on the Filter submenu (Pending manual execution, Pending manual setup execution, and Pending manual cleanup execution).
Keywords: Keywords that are assigned to the current execution definition.
Executed By (manual tests only): The users who are assigned to perform the manual test. This field is blank when no manual testers are assigned to the test.
Priority: Priority that has been assigned to the current execution definition or manual test (Low, Normal, or High).
Time when the execution definition or manual test was executed. Amount of time remaining until the test is complete. For manual tests that do not have an estimated time, this column has a value of unknown. Shows how the test run was started: manually, through a Web Service, or from a schedule. Name of the schedule, tester, or Web Service user. The scope specified in the Run dialog box. Actions that you can perform on the execution definition:

Abort: Click to cancel the current execution. Alternatively, use the DELETE key on your keyboard to abort test runs. When you abort executions, these executions are grayed out until the background process completes the deletion.
Continue Manual Test: Click to go to Manual Test view and execute the test.
Manual Testing Client: Click to open the Manual Testing Client. This button is available to users that are assigned as testers to the selected test.

Last Executions

The Last Executions view lists all past execution definition runs, except deleted runs, for which results were collected from the execution server. You can filter the listed execution definition runs, for example by the start time. To view or edit an execution definition, right-click the execution definition and choose Go to Execution Definition, or click the name of the execution definition. Right-click an execution definition run and choose View Details, or click the execution definition's Run ID link, to display the run's Results for Execution Definition page. This page shows details for the selected execution definition run and includes any files and messages, for example LiveLink VMware configuration captures, that were generated during the execution. Click the Run ID of a test definition in the Assigned Test Definitions portion of the Results for Execution Definition page to access the test definition's Results dialog box. To compare two execution definition runs, use your keyboard's CTRL and SHIFT keys to select the two runs. Right-click your selection and click Reports > Execution Definition Run Comparison.... For execution definitions that are deployed to virtual servers: to open VMware Lab Manager and restore a captured LiveLink configuration, expand the Messages link on an execution definition run's Results for Execution Definition page and select LiveLink. To delete a test run, right-click a run entry and choose Delete Results (or click the run's Delete button in the Actions column).
Test-result page views are broken into views of 20 test results each. You can advance through pages using First, Last, Next, and Previous at the bottom of the Last Executions view, or enter a page number into the Page text box and press ENTER. The Last Executions view can be collapsed or expanded by clicking the double-arrow button on the right-hand side of the view's title bar.
ID: ID number assigned to the executed execution definition. Unassigned test definitions have an ID value of N/A. This column is hidden by default.
Execution Definition: Name of the executed execution definition. Click on the name to view or edit the execution definition.
Run ID: ID assigned to the test run. Click the link to view details of the test run.
Status: Result status of the test run (Passed, Failed, or Not Executed). Filtering, sorting, and grouping are not available for the Status column in Last Executions view.
Keywords: Keywords that were assigned to the execution definition at execution time.
Executed By: The information given in this column depends on the type of the test.
Manual Test: The names of the testers who executed the test.
Automated Test: The name of the execution server that ran the test.
Sorting and grouping are not available on the Executed By column in the Last Executions view; this view is sorted by Start Time.
When the test run began.
The duration of the test run.
The product under test. This column is hidden by default.
The version of the product under test. This column is hidden by default.
The build number of the product under test.
Actions that you can perform on the execution definition:
Delete: Click to delete the test run results. When you delete executions, these executions are grayed out until the background process completes the deletion. Alternatively, press the DELETE key on your keyboard to delete test runs.
View Manual Test Results: Click to view the Current Run page in read-only mode; or right-click a manual execution definition run and click View Manual Test Results.
Shows how the test run was started: manually, through a Web Service, or from a schedule.
Name of the schedule, tester, or Web Service user.
The scope specified in the Run dialog box.
Related Concepts Project Management Test Definition Execution VMware Lab Manager Virtual Configurations Execution Definition Run Comparison Reports Related Procedures Managing Activities Managing Projects Executing Test Definitions Assigning Keywords to Execution Definitions Related Reference Cross-Project Activities Page Test Definition Run Results Dialog Execution Definition Run Results Dialog
Project ID: The ID of the project that the execution definition belongs to.
Related Concepts Project Management Test Definition Execution VMware Lab Manager Virtual Configurations Related Procedures Managing Activities Managing Projects Executing Test Definitions Assigning Keywords to Execution Definitions Related Reference Activities Page Test Definition Run Results Dialog
Run ID: <assigned test definition>, <execution definition in Last Executions>
Details: Shows the details of the test definition run, including its Duration, Execution Path, the Execution Definition Run ID of the execution definition run that included the test definition run, and any Warnings/Errors. This tab also allows you to change the status of the test definition run, which is useful if you need to manually overrule the status of a test run. Clear the Hide Passed check box below the Assigned Test Definitions list in the Execution Definition Run Results dialog to show all test definitions. By default, only test definitions that have not passed are displayed; this improves performance and keeps the displayed information focused. All parent nodes are displayed with the full status information. When a manual status change is performed, the details of the change are reflected in this tab's Status, Status Changed On, Status Changed By, Previous Status, and Status Change Comment fields.
Specific: Only displayed for SilkTest, SilkPerformer, and manual test definitions. This tab includes details that are specific to the selected test definition type. For example, when a SilkTest test definition is selected, this view includes the selected test case, test data, and any warnings that were displayed during the test run.
Files: Lists all files that were generated by this test run, along with file sizes. The names of SilkTest .rex files act as download links; once downloaded, these files can be viewed directly in a text editor. The upper table lists files that are associated with the test definition, such as result files or manually uploaded files for manual test definitions. The lower table lists files that are associated with the execution definition, for example execution log files or code analysis results. This tab also contains a Download All Files button, which downloads all result files generated by the test definition run as a zipped package.
Messages: Lists all messages that were generated by this test run, along with the severity of the messages. Messages that are associated with an execution definition as a whole, and not with one of the individual test definitions, can be viewed in the Projects unit (Activities tab/Messages tab).
Success Conditions: Only displayed for automated test definitions. This tab shows all the success conditions that were defined for the test during the test planning process (Test Plan unit, Properties tab) and the result values from the execution run. Success conditions are used to determine whether a test succeeded or failed.
Data Driven: Only displayed for data-driven test definitions that use the option of having a single test definition for all data rows of the data set. This tab lists the status of each instance (data row) run of the test definition. Clicking an instance brings up another instance of the Test Definition Run Results dialog with run details of the selected instance.
Attributes: Any attributes that have been configured for the test definition.
Parameters: Any parameters that have been configured for the test definition.
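The per-row runs that the Data Driven tab reports can be illustrated with a short sketch. This is not Test Manager's implementation, and all names here are hypothetical; it only mirrors the described behavior: one test definition is executed once per data row, each instance gets its own status, and the parent node aggregates them.

```python
# Illustrative sketch (not product code): one test definition executed once
# per data row, collecting a per-instance status for each row.

def run_data_driven(test, rows):
    """Run `test` once per data row; return the per-instance statuses."""
    statuses = []
    for row in rows:
        try:
            statuses.append("Passed" if test(row) else "Failed")
        except Exception:
            statuses.append("Failed")
    return statuses

def overall_status(statuses):
    """Aggregate instance statuses the way a parent node might."""
    if not statuses:
        return "Not Executed"
    return "Passed" if all(s == "Passed" for s in statuses) else "Failed"

# Example: a trivial check over three data rows.
rows = [{"a": 1, "b": 1}, {"a": 2, "b": 2}, {"a": 2, "b": 3}]
statuses = run_data_driven(lambda r: r["a"] == r["b"], rows)
print(statuses)                  # ['Passed', 'Passed', 'Failed']
print(overall_status(statuses))  # Failed
```

Clicking an instance in the tab then corresponds to drilling into one entry of this per-row status list.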
The following table lists the UI elements that are used to step through the test definition results of an execution run. These elements are only visible when accessing the Test Definition Run Results dialog from an execution definition.
Skip Passed: Used to determine which test definition run results are displayed when browsing with the Previous Result and Next Result buttons. Checking this option displays only test definitions with a status other than Passed.
< Previous Result: Jumps to the result details of the previous test definition in the selected execution definition run.
Next Result >: Jumps to the result details of the next test definition in the selected execution definition run.
Related Concepts Execution Dependency Configuration Related Procedures Viewing Test Execution Details Configuring Execution Dependencies Related Reference Activities Page Execution Runs Tab Test Plan Runs tab Execution Definition Run Results Dialog
Build information files contain project information, including build number, build log location, error log location, and build location. Enter the name of the active project's build information file in this field. All test executions read the build information from the specified file.
Project release date: Scheduled release date of the active project (MM/DD/YYYY).
File extensions to ignore in results: Result file types or other file types that should not be saved as results for test executions.
Related Concepts Settings Configuration Related Procedures Configuring Project Settings
Filters tab
Test Manager Settings Filters The Filters tab lists the filters that are available to the active project.
Item Description
Name of the filter.
Filter category (requirement, test plan, or execution).
When the filter was created.
User who created the filter.
When the filter was most recently modified.
User who most recently modified the filter.
Actions that can be performed on the filter (Delete).
Related Concepts Global Filters Related Procedures Configuring Global Filters Settings Configuration
Attributes tab
Test Manager Settings Attributes The Attributes tab lists the attributes that have been created for the current project.
Item Description
Name: Name of the attribute. This name is displayed in the following list boxes:
Filters: Attributes can be used in global filters for filtering by test definition attributes (see Global Filters).
Test Plan unit: Attributes can be applied to test definitions (see Understanding Test Definition Attributes).
Type: Attribute type. The following attribute types are available:
Edit: An attribute of type edit is an alphanumeric field in which you can enter any string (e.g., Comment).
Normal: Attributes of type normal require you to define a list of values. When applying an attribute of type normal to a test definition, you can select only one value from the list (e.g., Priority, with defined values high, medium, and low).
Set: Attributes of type set require you to define a list of values. When applying an attribute of type set to a test definition, you can select zero, one, or multiple values from the list (e.g., Test Scenario, with defined values load test, regression test, smoke test).
Status: Status of the attribute (Active or Inactive).
Column: The column name of the attribute in the LQM Reporting table. Use this column name to query the selected attribute within the LQM Reporting table. See the database model documentation for detailed information.
Created On: When the attribute was created.
Created By: User who created the attribute.
Changed On: When the attribute was last modified.
Changed By: User who most recently modified the attribute.
Actions: Available actions that can be performed on the attribute (Delete).
Related Concepts Attributes Related Procedures Configuring Custom Attributes
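The three attribute types differ in which values they accept. A short sketch can make the distinction concrete; this is illustrative only (the function and its names are hypothetical, not part of Test Manager), using the examples given above:

```python
# Illustrative sketch (hypothetical function): validating a value against
# the three attribute types described above.

def validate_attribute(attr_type, value, allowed=None):
    """Return True if `value` is legal for the given attribute type.

    edit   -- any string (e.g., a free-form comment)
    normal -- exactly one value from the defined list
    set    -- zero, one, or multiple values from the defined list
    """
    if attr_type == "edit":
        return isinstance(value, str)
    if attr_type == "normal":
        return value in (allowed or [])
    if attr_type == "set":
        return all(v in (allowed or []) for v in value)
    raise ValueError("unknown attribute type: %s" % attr_type)

# Examples mirroring the descriptions above.
print(validate_attribute("edit", "needs review"))                        # True
print(validate_attribute("normal", "high", ["high", "medium", "low"]))   # True
print(validate_attribute("set", ["load test", "smoke test"],
                         ["load test", "regression test", "smoke test"]))  # True
print(validate_attribute("set", ["unit test"], ["load test"]))           # False
```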
Name of the custom requirement property.
Property type. The following types are available: String, Integer, Boolean, and Date.
Status of the property (Active or Inactive).
When the property was created.
User who created the property.
When the property was last modified.
User who last modified the property.
Available actions that can be performed on the property (Delete).
Related Concepts Custom Requirement Properties External Requirements Management Tools Related Procedures Customizing Requirement Properties Integrating External RM Tools Requirements Management
Name: Name of the custom step property.
Actions: The actions that can be performed on the property are Delete, Move Up, and Move Down.
Related Concepts Custom Step Properties Manual Test Definitions Related Procedures Configuring Custom Step Properties Executing Manual Tests
Notifications Page
Test Manager Settings Notification The Notifications page lists the notification event types that have been configured for the active project.
Item Description
Notification Events: Name of a notification event that has been set up for the active project.
Status: Status of the notification event (Active or Inactive). When a notification event is activated, a notification email is sent to the user who activated the event the first time one of the specified settings is changed. See Change-Notification Emails for a list of the changes that trigger a notification email.
Note: The user must have specified an email address to receive email notifications. To specify an email address for a user, refer to the SilkCentral Administration Module documentation.
Related Concepts Change Notification Change-Notification Emails Related Procedures Configuring Change Notification
This section lists details related to the integration of the Borland CaliberRM requirements management system. If integration has not been enabled, you will only see the Status property.
Status: Status of integration (Enabled or Disabled).
Hostname: Machine where the external server is installed.
Username: Credential for the requirements management server.
Password: Credential for the requirements management server.
Project: External project with which the Test Manager project is integrated.
Requirement Types: Requirement types within the project that are integrated.
Create Requirements: Indicates whether or not the Enable creation of unassigned requirements option is active. Enables creation and editing of unmapped requirements in Test Manager projects that are configured for integration with CaliberRM.
Upload Requirements: Indicates whether or not the Enable upload of requirements to Borland CaliberRM option is active. Enables the upload of unmapped/unassigned requirements from Test Manager to CaliberRM, allowing you to upload additional previously unmapped requirement trees to CaliberRM and then have those requirements mapped within Test Manager. When this option is enabled, the Map Requirement button becomes enabled (Requirements Properties), enabling configuration of top-level requirements for external requirement types, which is required when uploading unmapped requirements.
Property Mappings: Lists any external/internal property mappings that have been defined between the internal and external requirements management systems.

IBM Rational RequisitePro Integration

This section lists details related to the integration of the IBM Rational RequisitePro requirements management system. If integration has not been enabled, you will only see the Status property.
Status: Status of integration (Enabled or Disabled).
UNC Project Path: Machine where the external server is installed.
UNC Username: Credential for the requirements management server.
UNC Password: Credential for the requirements management server.
User name: Credential for the requirements management server.
Password: Credential for the requirements management server.
Packages: The requirement packages from the external project that are integrated with the Test Manager project.
Requirement Types: Requirement types within the packages that are integrated.
Create Requirements: Indicates whether or not the Enable creation of unassigned requirements option is active. Enables creation and editing of unmapped requirements in Test Manager projects that are configured for integration with Rational RequisitePro.
Upload Requirements: Indicates whether or not the Enable upload of requirements to RequisitePro option is active. Enables the upload of unmapped/unassigned requirements from Test Manager to RequisitePro, allowing you to upload additional previously unmapped requirement trees to RequisitePro and then have those requirements mapped within Test Manager. When this option is enabled, the Map Requirement button becomes enabled (Requirements Properties), enabling configuration of top-level requirements for external requirement types, which is required when uploading unmapped requirements.
Property Mappings: Lists any external/internal property mappings that have been defined between the internal and external requirements management systems.

Telelogic DOORS Integration

This section lists details related to the integration of the Telelogic DOORS requirements management system. If integration has not been enabled, you will only see the Status property.
Status: Status of integration (Enabled or Disabled).
The URL of Test Manager's Telelogic DOORS requirement Web Service. The default value should point to the correct location already.
Username: Credential for the requirements management server.
Password: Credential for the requirements management server.
DOORS Installation Path: Client installation path within the front-end server directory structure.
Project Name: External project with which the Test Manager project is synchronized.
Requirement Types: Requirement types within the project that are synchronized.
Schedule: Any defined synchronization schedule.
Create Requirements: Indicates whether or not the Enable creation of unassigned requirements option is active. Enables creation and editing of unmapped requirements in Test Manager projects that are configured for integration with DOORS.
Upload Requirements: Indicates whether or not the Enable upload of requirements to Telelogic DOORS option is active. Enables the upload of unmapped/unassigned requirements from Test Manager to DOORS, allowing you to upload additional previously unmapped requirement trees to DOORS and then have those requirements mapped within Test Manager. When this option is enabled, the Map Requirement button becomes enabled (Requirements Properties), enabling configuration of top-level requirements for external requirement types, which is required when uploading unmapped requirements.
Property Mappings: Lists any external/internal property mappings that have been defined between the internal and external requirements management systems.
Related Concepts Requirements Integration Configuration Related Procedures Integrating External RM Tools
Name: The name of the data source as it displays in the SilkCentral GUI and in reports. Click the name of a data source to modify the data source settings.
Type: Data source type (CSV, JDBC, MS Excel).
Created On: Date when the data source was created.
Created By: User who created the data source.
Changed On: Date when the data source was last modified.
Changed By: User who last modified the data source.
Actions: This column contains action icons that allow the user to perform the following actions on a data source:
Delete: Deletes the data source permanently. Deletion is not allowed if a data source is already associated with test definitions.
Download: Downloads the data source to your local computer.
Upload: Replaces the currently uploaded data source with the newly uploaded data source.
Synchronize: Updates all test definitions that are associated with the data source with the latest data.
New Data Source: Click this button to create a new data source.
Related Concepts Data Sources for Data-Driven Tests Related Procedures Configuring Data Sources for Data-Driven Tests
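Of the data source types listed above, CSV is the simplest to picture: each row of the file supplies the values for one instance of a data-driven test. The sketch below is illustrative only (the column names are hypothetical, not a Test Manager format):

```python
# Illustrative sketch: reading rows from a CSV data source. Each row can
# then drive one instance of a data-driven test definition.
import csv
import io

# Hypothetical example data; a real data source would be an uploaded file.
csv_text = """user,password,expected
alice,secret1,ok
bob,secret2,ok
mallory,wrong,denied
"""

rows = list(csv.DictReader(io.StringIO(csv_text)))
print(len(rows))        # 3
print(rows[0]["user"])  # alice
```

Synchronizing a data source, as described above, then corresponds to re-reading these rows so that associated test definitions pick up the latest data.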
Name: The name of the profile as it displays in the SilkCentral GUI and in reports. Click the name to edit a profile.
Type: The external issue tracking system. See the related Issue Tracking Profiles topic for detailed information on available issue tracking system integrations.
Login: The login name with which SilkCentral connects to the issue tracking system.
Repository Info: Physical location of the issue tracking system (hostname or URL).
Created On: Date when the issue tracking profile was created.
Created By: The user who created the issue tracking profile.
Actions: This column contains action icons that allow the user to perform the following actions on an issue tracking profile:
Edit Mapping: Edit the mapping of issue states between Test Manager and the external issue tracking system.
Delete: Deletes the issue tracking profile permanently. Profiles cannot be deleted if external issues have been entered for test definitions.
New Profile: Click this button to create a new issue tracking profile.
Related Concepts Issue Tracking Profiles Related Procedures Configuring Issue Tracking Profiles Managing SilkCentral Issue Manager Issue Tracking Profiles Managing Borland StarTeam Issue Tracking Profiles Managing IBM Rational ClearQuest Issue Tracking Profiles Managing Bugzilla Issue Tracking Profiles Mapping Issue States
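The Edit Mapping action pairs each external issue state with an internal one. Conceptually the mapping behaves like a lookup table; the sketch below is illustrative only, with hypothetical state names rather than actual Test Manager or tracker states:

```python
# Illustrative sketch (hypothetical state names): translating issue states
# from an external issue tracking system into internal states, as configured
# via the Edit Mapping action.

STATE_MAP = {
    # external state -> internal state (all values hypothetical)
    "NEW": "Open",
    "ASSIGNED": "Open",
    "RESOLVED": "Fixed",
    "CLOSED": "Closed",
}

def to_internal(external_state):
    """Translate an external issue state; unmapped states pass through."""
    return STATE_MAP.get(external_state, external_state)

print(to_internal("RESOLVED"))  # Fixed
print(to_internal("REOPENED"))  # REOPENED (no mapping defined)
```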
Name: The name of the profile as it displays in the SilkCentral GUI and in reports. Click the name to edit a profile.
Type: The external source control system. See the related Source Control Profiles topic for detailed information on available source control system integrations.
Working Folder: Local or mapped working folder to which temporary sources are checked out.
Created On: Date when the source control profile was created.
Created By: The user who created the source control profile.
Changed On: Date when the source control profile was last modified.
Changed By: The user who last modified the source control profile.
Actions: This column contains action icons that allow the user to perform the following actions on a source control profile:
Delete: Deletes a source control profile permanently.
New Profile: Click this button to create a new source control profile.
Related Concepts Source Control Profiles Related Procedures Configuring Source Control Profiles
Document View, Requirements View: Toggles between Document View, which lets you view select properties of all requirements in a single view, and Requirements View, which enables you to drill deeply into the properties of a single requirement.
New Requirement: Enables you to add a new requirement to the active project.
New Child Requirement: Enables you to add a new requirement to the active project as a child of the selected requirement.
Edit: Enables you to open the selected requirement for editing.
Delete: Marks the selected requirement in the tree as obsolete. The requirement is subsequently displayed in italics if you right-click the project node and select the Show Obsolete Requirements command. Check the Destroy permanently check box on the Delete Requirement dialog box to permanently delete a requirement from the system.
Cut, Copy, Paste: Cut, copy, and paste requirement elements between the Requirements tree and the clipboard.
Paste as Child: Pastes a copy of the requirement held on the clipboard to the child level beneath the currently selected requirement.
Move Up, Move Down: Move requirements up or down within the Requirements tree view.
Find/Replace: Find enables you to search through all requirements in the active project based on configurable parameters. Replace enables you to optionally replace instances of found values with a new value.
Filtering commands: Requirements view filtering options.
Show changes/Acknowledge changes: Show recent changes and acknowledge changes to the Requirements tree.
Show Direct/Full Coverage: Toggles between full and direct coverage modes. Full coverage mode offers a cumulative view of test-definition-to-requirement coverage that considers the status of all child requirements of parent requirements. In direct coverage mode, requirement status is calculated by considering only the test definitions that are assigned directly to requirements.
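The difference between the two coverage modes can be sketched as a tree walk. This is illustrative only, not Test Manager's actual calculation: direct mode looks only at a requirement's own assigned test definitions, while full mode also accumulates the statuses of all child requirements.

```python
# Illustrative sketch (not product code): direct vs. full coverage status
# for a requirement tree. Statuses are simplified to three values.

def aggregate(statuses):
    """Collapse a list of statuses into one, failures dominating."""
    if not statuses:
        return "Not Executed"
    if "Failed" in statuses:
        return "Failed"
    if all(s == "Passed" for s in statuses):
        return "Passed"
    return "Not Executed"

def direct_status(req):
    """Status from the test definitions assigned directly to `req`."""
    return aggregate(req["tests"])

def full_status(req):
    """Cumulative status over `req` and all of its child requirements."""
    statuses = list(req["tests"])
    for child in req["children"]:
        statuses.append(full_status(child))
    return aggregate(statuses)

# A parent whose own test passed, but whose child requirement failed.
parent = {
    "tests": ["Passed"],
    "children": [{"tests": ["Failed"], "children": []}],
}
print(direct_status(parent))  # Passed  (child failures ignored)
print(full_status(parent))    # Failed  (child failures accumulated)
```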
Related Concepts Requirements Unit Interface Requirements Management Related Procedures Managing Requirements
Requirement Name: Name of the requirement.
Requirement ID: Identifier of the requirement.
Description: Meaningful description of the requirement.
Priority: Priority that has been configured for this requirement.
Risk: Risk that has been configured for this requirement.
Reviewed: Review status that has been configured for this requirement.
Custom Properties: Custom properties that have been configured for this requirement.
Document: Source document (if any) from which this requirement was derived.
Created On: Date on which this requirement was created.
Created By: Name of the user who created this requirement.
Changed On: Date on which this requirement was last updated.
Changed By: Name of the user who last updated this requirement.
Related Concepts Requirements Management Requirements Reports Related Procedures Creating Requirements Managing Requirements
Name of the attachment.
Size of the attachment.
Description of the attachment.
When the attachment was created.
User who created the attachment.
Actions that can be performed on the attachment (Edit and Delete).
Assign Saved Selection: Click to assign a selection of test definitions from the Grid View.
Test Definition: Name of the assigned test definition. Click to view and edit the test definition.
Status: Status of the assigned test definition (Passed, Failed, Not Executed).
Last Execution: When the test definition was last executed.
Issues: Issues (if any) that are associated with this test definition.
Actions: Actions that can be taken against this test definition (Delete and Locate).
Related Concepts
Full Coverage and Direct Coverage Modes Test Plan Generation Related Procedures Assigning Test Definitions to Requirements Manually Assigning Test Definitions from Grid View to Requirements Removing Test Definition Assignments Locating Assigned Test Definitions in the Test Plan Tree Sorting the Assigned Test Definitions Tab Requirements Management
Related Concepts Requirements Management Requirements Document View Full Coverage and Direct Coverage Modes Requirements Reports Related Procedures Assigning Test Definitions to Requirements Manually Switching Between Full and Direct Coverage Modes
Revision number.
Date the revision occurred.
User who performed the revision.
Auto-generated description of the nature of the revision (for example, deleted or created).
Note: When the History tab includes more elements than can be displayed at once without impacting response time, elements are displayed in increments. Page number links at the bottom of the tab allow you to browse through the elements included on the tab one page at a time. To display all elements as a single list, select the [All] link. Related Concepts Requirement History Related Procedures Tracking the History of a Requirement Viewing Recent Changes
Status: Displays the status of the last test execution (Passed, Failed, Not Executed, Not Scheduled). For test containers and folders, a number in brackets shows the total number of test definitions within the respective container/folder.
Last Execution: Last execution related to the selected test definition or project.
Last Build: Build associated with the last execution.
Changed On: Last time the selected test definition or test plan was changed.
Changed By: User who last changed the selected test definition or test plan.
Related Concepts Test Plan Management Related Procedures Managing Test Plans
Name: Name that has been configured for the test definition.
ID: Database identifier of this test definition.
Description: Any description that has been configured for the test definition. Test Manager supports HTML formatting and cutting/pasting of HTML content for Description fields.
Status: Status that has been configured for the test definition. For test definitions that are part of a running execution definition, the status is updated in response to the current status of the test run. If the current run is aborted, the status is reset to the status before the run.
Last Execution: Last time this test definition was executed. For test definitions that are part of a running execution definition, the last execution is updated based on the current test run.
Created On: Date this test definition was created.
Created By: Name of the user who created this test definition.
Changed On: Date this test definition was last changed.
Changed By: Name of the user who last changed this test definition.
Planned [hh:mm]: Planned execution time of the test definition. This property is only displayed if a manual test definition is selected.
Test Properties: Test properties are specific to the test type.
Success Conditions: Shows the names of all success conditions that have been configured for the test definition; whether or not each condition is active; the maximal value of each condition; and whether or not each condition is inherited. For test package nodes, all success conditions except the execution timeout are disabled and hidden.
Integration Default Folder: Shows the name of the default container or folder where tests from external RMSs are created.
Related Concepts Test Plan Management Test Definitions Success Conditions Related Procedures Editing Test Plan Elements
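Success conditions, as described above, pair each active condition with a maximal value. The evaluation can be sketched as follows; this is illustrative only, with hypothetical condition names, and is not Test Manager's actual evaluation logic:

```python
# Illustrative sketch (hypothetical condition names): a test run fails if
# any measured value exceeds the maximal value of an active success condition.

def evaluate(conditions, measured):
    """conditions: name -> {"active": bool, "max": number}
    measured:   name -> observed value
    Returns "Passed" or "Failed"."""
    for name, cond in conditions.items():
        if not cond["active"]:
            continue  # inactive conditions are ignored
        if measured.get(name, 0) > cond["max"]:
            return "Failed"
    return "Passed"

conditions = {
    "execution timeout [s]": {"active": True, "max": 600},
    "errors": {"active": True, "max": 0},
}
print(evaluate(conditions, {"execution timeout [s]": 120, "errors": 0}))  # Passed
print(evaluate(conditions, {"execution timeout [s]": 120, "errors": 2}))  # Failed
```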
New Step: Add a new test step to the end of this test definition.
Insert Step: Insert a new test step into the sequence of this test definition.
Edit: Edit the selected test step.
Delete: Delete the selected test step from the Test Steps list.
Cut: Cut the selected test step from the list and move it to the clipboard.
Copy: Copy the selected test step to the clipboard.
Paste: Paste a copy of the test step held on the clipboard to the row above the selected step in the list.
Move Up: Move the selected test step one position up in the Test Steps list.
Move Down: Move the selected test step one position down in the Test Steps list.
Manage Attachments: Opens the Attachments dialog box, where you can perform the following actions:
Upload File: Upload a file to the selected test step.
Attach Link: Attach a link attachment to the selected test step.
Edit: Edit the file or link attachment.
Delete: Delete the file or link attachment.
The Step page shows all steps of the selected test in a table. The table has the following columns:
Column Description
Number of the step in the execution sequence.
Name of the test step.
Action you must perform to execute the test step.
Expected result of the test step.
Number of files that are attached to the test step.
Note: When there are more than 200 steps in a test, the steps are displayed in multiple pages. Click on the page numbers to access the pages. To display all steps as a single list, click [All]. Related Concepts Test Plan Management Related Procedures Editing Manual Test Steps From Within Test Manager Managing Test Plans Related Reference Multi-Select Functionality for Test Plan Elements
Name: Name of the child test plan element.
Changed On: Date the child test plan element was last edited.
Changed By: User who last edited the child test plan element.
Tip: As with test plan elements listed in the Test Plan tree, elements listed on the Contents tab can be right-clicked to access context-relevant commands through a context menu. Commands that are not available are grayed out. Before you can paste a test plan element into the Contents tab, you must explicitly select an element within the tab to gain the application's focus.
Note: When the Contents tab includes more elements than can be displayed at once without impacting response time, elements are displayed in increments. Page number links at the bottom of the tab allow you to browse through the elements included on the tab one page at a time. To display all elements as a single list, select the [All] link.
Related Concepts Test Plan Tree Test Plan Management Related Procedures Copying, Pasting, and Deleting Test Plan Elements Managing Test Plans Related Reference Multi-Select Functionality for Test Plan Elements
Name of the attribute.
The attribute value that has been assigned.
The attribute type.
Whether the attribute is inherited from a parent.
Note: Inheritance of attributes is similar to inheritance of properties and success conditions. Attributes that are assigned to a parent node are inherited throughout all sub-folders and child test definitions. Related Concepts Test Plan Management Related Procedures Configuring Test Definition Attributes Creating Test Definitions
Name of the assigned parameter.
The selected parameter value for this test definition.
The parameter type (String, Number, Float, Boolean, Password, or Character).
Whether the parameter has been inherited from a parent.
Note: Test definition parameters that are contained within a property of a test definition (for example, testdata for SilkTest test definitions) are listed at the top of the Parameters tab. Unused parameters are appended to the bottom of the list and grayed out (analogous to a disabled state). Note: When the Parameters tab includes more elements than can be displayed at once without impacting response time, elements are displayed in increments. Page number links at the bottom of the tab allow you to browse through the elements included on the tab one page at a time. To display all elements as a single list, select the [All] link. Related Concepts Test Definition Parameters Test Plan Management Related Procedures Configuring Test Definition Parameters Managing Test Plans
Name of the assigned requirement. Click to open the Requirements Properties page.
Priority of the requirement.
Potential risk associated with the requirement.
Review status of the requirement.
Actions that can be performed on the selected requirement (Remove Requirement, Locate Requirement, and View Description).
Available Requirements: The Available Requirements tree lists all requirements that can be assigned to the selected test definition.
Related Concepts Test Plan Management Related Procedures Assigning Requirements to Test Definitions Managing Test Plans
Name of the attachment.
Size of the attached file.
Description that has been defined for the attachment.
When the attachment or link was uploaded.
User who uploaded the attachment.
Actions that can be taken on the attachment (Edit or Delete).
Check this check box to additionally display all attachments of child test definitions, folders, and test containers of the selected node.
Related Concepts Attachments Test Plan Management Related Procedures Attaching Files to Test Plan Elements Working with Attachments
Execution Definition: Name of the assigned execution definition. Click to view or edit the execution definition.
Assignment Type: Execution definition type (manual or automated).
Last Execution: Last time the test definition was executed as part of the assigned execution definition.
Next Execution: Next scheduled execution of the test definition as part of the assigned execution definition.
Related Concepts
Test Plan Management Test Definition Execution Related Procedures Manually Assigning Test Definitions to Execution Definitions Managing Test Plans Executing Test Definitions
Actions: Create a new issue for this test definition. Click to open the New Issue dialog box and create a new issue for the test definition.
Run Type: The test definition type during each run. The test type might change between two runs, for example when you convert the test from manual to automated.
Run ID: The ID of the test definition run. Click to open the Test Definition Run Results dialog box. If the test definition is running, click to view details of the execution.
Start Time: Time the run started. If the test is a manual test and currently running, Test Manager adds (Running) to the date and time.
Execution Definition Name: The name of the assigned execution definition, or unassigned tests if the execution was a try-run or results were uploaded. Click to open the execution definition.
Status: Status of the execution. For test definitions that are part of a running execution definition, the status is updated in response to the current status of the test run. If the current run is aborted, the status is reset to the status before the run.
Issues Found: Number of issues that are assigned to the test definition run. When no issues are assigned to the test definition run, the column is empty. Click the link to access the issue in the Issues page of the Test Plan unit.
Executed By: The execution server from which the test was run.
Errors: Number of errors that were generated during the run.
Warnings: Number of warnings that were generated during the run.
Version: Version that the test was run against.
Build: Build number that the test was run against.
Related Concepts Test Definition Run Comparison Report Related Procedures Test Definition Execution Related Reference Test Definition Run Results Dialog
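The abort behavior described above, where an aborted run leaves the previous status in place, can be illustrated with a minimal sketch. The function name and run dictionary are hypothetical illustrations, not part of Test Manager.

```python
def status_after_run(previous_status, run):
    """Sketch of the Status column rule: an aborted run keeps the
    status the test definition had before the run; otherwise the
    new run's status is shown."""
    return previous_status if run["aborted"] else run["status"]

# An aborted run does not overwrite the earlier "Passed" status.
print(status_after_run("Passed", {"aborted": True, "status": "Failed"}))   # → Passed
print(status_after_run("Passed", {"aborted": False, "status": "Failed"}))  # → Failed
```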
Actions: Actions that can be performed on the issue.
Issue ID: ID that has been automatically assigned to the issue.
Assigned Test Definition: Test definition that has been assigned to the issue. This column is only displayed if the currently selected object is a container or a folder.
Synopsis: Synopsis of the issue.
Status: Status of the issue.
External ID: Indicates if the issue is tracked by an external issue tracking system. Click an external issue number to link directly to the external issue tracking system.
Test Definition Run: The ID of the test definition run that the issue is assigned to. Click the ID to access the Details page of the Test Definition Run Results dialog box in the Execution unit.
Created On: When the issue was created.
Created By: User who created the issue.
New Issue: Assign a new issue to the selected test definition. This button is only displayed if the currently selected object is a test definition.
Assign External Issue: Assign an issue from an external issue tracking system to the selected test definition. This button is only displayed if the currently selected object is a test definition.
Related Concepts SilkCentral Issue Manager Related Procedures Creating New Issues Working with Issues
Revision number.
Date the revision occurred.
User who performed the revision.
Auto-generated description of the nature of the revision (for example, deleted or created).
Note: When the History tab includes more elements than can be displayed at once without impacting response time, elements are displayed in increments. Page number links at the bottom of the tab allow you to browse through the elements included on the tab one page at a time. To display all elements as a single list, select the [All] link. Related Procedures Tracking Test Plan History Viewing Recent Changes
Filter element. This field typically has a value of Filter query.
Filter value used to filter the contents of the configured data set.
Indicates whether or not the filter was inherited from a parent test container or test definition.
Actions that can be performed on the filter (Edit or Delete).
Note: When the Data Set tab includes more elements than can be displayed at once without impacting response time, elements are displayed in increments. Page number links at the bottom of the tab allow you to browse through the elements included on the tab one page at a time. To display all elements as a single list, select the [All] link. Related Concepts Data-Driven Tests Related Procedures Creating Test Definitions Working with Data-Driven Tests Managing Test Plans
Test Plan View, Document View: Toggles between Test Plan View and Document View.
Up: Navigates one level up in the hierarchy of the navigation tree, regardless of the current cursor focus.
New Test Container, New Test Folder, New Test Definition: Enables creation of new test containers, test folders, and test definitions.
Edit, Delete: Edit and delete test plan elements.
Cut, Copy, Paste: Cut, copy, and paste test plan elements.
Move Up, Move Down: Move test plan elements up or down within the Test Plan tree view.
Find/Replace: Find enables you to search through all test plan elements in the active project based on configurable parameters. Replace enables you to optionally replace instances of found values with a new value.
Additional toolbar items provide Test Plan View filtering options and let you show recent changes and acknowledge changes to the test plan.
Related Concepts Test Plan Management Test Plan Document View Related Procedures Managing Test Plans Test Definition Execution
Run ID <assigned test definition> <execution definition in Last Executions> <assigned test
Description
Details: Shows the details of the test definition run, including its Duration, Execution Path, the Execution Definition Run ID of the execution definition run that included the test definition run, and any Warnings/Errors. This tab also allows you to change the status of the test definition run, which is useful if you need to manually overrule the status of a test run. Use the Hide Passed check box below the Assigned Test Definitions in the Execution Definition Run Results dialog box to control whether all test definitions are shown. By default, only the test definitions that did not pass are displayed, which enhances performance and focuses the view on the most relevant information. All parent nodes are displayed with full status information. When a manual status change is performed, the details of the change are reflected in this tab's Status, Status Changed On, Status Changed By, Previous Status, and Status Change Comment fields.
Specific: Only displayed for SilkTest, SilkPerformer, and manual test definitions. This tab includes details that are specific to the selected test definition type. For example, when a SilkTest test definition is selected, this view includes the selected test case, test data, and any warnings that were displayed during the test run.
Files: Lists all files that were generated by this test run, along with file sizes. The names of SilkTest .rex files act as download links. Once downloaded, these files can be viewed directly in a text editor. The upper table lists files that are associated with the test definition, such as result files or manually uploaded files for manual test definitions. The lower table lists files that are associated with the execution definition, for example execution log files or code analysis results. This tab also contains a Download All Files button, which downloads all result files generated by the test definition run as a zipped package.
Messages: Lists all messages that were generated by this test run, along with the severity of the messages.
Messages that are associated with an execution definition as a whole, and not with one of the individual test definitions, can be viewed in the Projects unit (Activities tab/Messages tab).
Success Conditions: Only displayed for automated test definitions. This tab shows all the success conditions that were defined for the test during the test planning process (Test Plan unit, Properties tab) and the result values from the execution run. Success conditions are used to determine if a test is successful or if it has failed.
Data Driven: Only displayed for data-driven test definitions using the option of having a single test definition for all data rows of the data set. This tab lists the status of each instance (data row) run of the test definition. Clicking an instance brings up another instance of the Test Definition Run Results dialog box with run details of the selected instance.
Attributes: Any attributes that have been configured for the test definition.
Parameters: Any parameters that have been configured for the test definition.
The following table lists the UI elements that are used to step through the test definition results of an execution run. These elements are only visible when accessing the Test Definition Run Results dialog from an execution definition.
Item Description
Skip Passed: Used to determine which test definition run results should be displayed when browsing with the Previous Result and Next Result buttons. Checking this option displays only test definitions with a status other than Passed.
< Previous Result: Jumps to the result details of the previous test definition in the selected execution definition run.
Next Result >: Jumps to the result details of the next test definition in the selected execution definition run.
Related Concepts Execution Dependency Configuration Related Procedures Viewing Test Execution Details Configuring Execution Dependencies Related Reference Activities Page Execution Runs Tab Test Plan Runs tab Execution Definition Run Results Dialog
Status: Status of the execution.
Build: Build that the execution was based on.
Version: Version that the execution was based on.
Product: Product that the execution was based on.
Priority: Priority of the execution.
Last Execution: Last time the execution occurred.
Duration: Duration of the execution.
Next Execution: Next scheduled execution.
Test Container: Test container containing the test definition that this execution is based on.
Related Concepts Test Definition Execution Related Procedures Analyzing Test Runs
Execution Definition Name: Name of the execution definition.
Execution Definition ID: Database identifier of the execution definition.
Description: Meaningful description of the execution definition.
Test Container: Test container the execution definition is associated with. Click to access the test container in the Test Plan unit.
Version: Product version the execution definition is associated with.
Build: Product build the execution definition is associated with.
Priority: Priority of the execution definition.
SilkTest AUT Host Name: Hostname of the application under test (for SilkTest tests only).
Test Definitions: Test definitions associated with this execution definition.
Source Control Label: In the Source Control Label field you can optionally specify that the execution definition be of an earlier version than the latest version. The label must refer to a version in the source control system that the test container is associated with. If this field is left blank, the latest version of the execution definition is fetched. The Source Control Label property is only enabled if the associated test container uses a source control profile that supports versioning. Make sure to have enough free disk space on the execution server or servers when working with multiple versions of source files. Each version is saved in its own folder on every execution server.
Additional properties show whether the test definitions of the latest run of the execution definition Passed, Failed, or were Not Executed; the last time the execution definition was executed; the duration of the last execution; the next scheduled execution; when and by whom the execution definition was created; and when and by whom it was last changed. In the simplest case (automated tests on a single execution server, or only manual tests), the duration is the time displayed for the latest run on the Runs tab. If the last execution involved automated tests that were executed on more than one execution server, the duration on the server on which execution lasted the longest is considered. If the last execution involved both automated and manual tests, only the automated or the manual tests are considered, depending on which tests last the longest, because automated and manual tests are executed in parallel. If the execution definition contains multiple test definitions, the duration is measured from the time when the first test definition begins executing until the last test definition completes execution. This includes the overhead time that is needed for stopping and starting test definitions between executions. If an execution definition contains only a single test definition, this overhead is not included in the duration.
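The duration rules described above can be sketched as follows: per server, the duration spans from the first test definition's start to the last one's end (so inter-test overhead is included), and the overall duration is taken from the server that lasted the longest. The function name and data layout are hypothetical illustrations, not Test Manager internals.

```python
def run_duration(per_server_runs):
    """per_server_runs maps a server name to a list of (start, end)
    times in seconds. Each server's duration spans from its first
    start to its last end; the overall duration is that of the
    server on which execution lasted the longest."""
    durations = []
    for runs in per_server_runs.values():
        starts = [start for start, _ in runs]
        ends = [end for _, end in runs]
        durations.append(max(ends) - min(starts))
    return max(durations)

# Two execution servers: exec01 spans 0-120 s (including a 10 s gap
# between its two tests), exec02 spans 0-90 s, so the duration is 120 s.
print(run_duration({"exec01": [(0, 50), (60, 120)], "exec02": [(0, 90)]}))  # → 120
```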
Related Concepts Test Definition Execution Related Procedures Adding Execution Definitions Working with Execution Definitions Executing Test Definitions
Manual assignment: Click the Manual assignment option button to manually assign test definitions to the execution definition.
Use test plan order: Check to set the execution order of the assigned test definitions to follow the execution order in Test Manager's Test Plan unit.
Assign Saved Selection: Click to assign a selection of test definitions from the Grid View.
Assignment by filter: Click the Assignment by filter option button to automatically assign test definitions to the execution definition based on a pre-defined filter. The available filters are listed in the list box.
Assigned Test Definitions List: The following properties are shown for each assigned test definition:
Order: The execution order of the test definition. The Use test plan order check box must be unchecked to change the execution order. Click in the text box, type the new order of the test definition, and then press Enter to confirm the change. Each change must be confirmed by pressing Enter; if you change the order of multiple test definitions without pressing Enter each time, only the last change before pressing Enter is applied.
Test Definition: Name of the test definition. Click the name to access the test definition in the Test Plan unit.
Status: Status of the last run of the test definition in the context of the execution definition. When the test definition is executed outside of the context of the execution definition, the displayed status remains unchanged. If the test definition has not yet been executed in the context of the execution definition, the status is Not Scheduled.
Last Execution: Date and time of the last run of the test definition in the context of the execution definition. When the test definition is executed outside of the context of the execution definition, the displayed time and date remain unchanged.
Actions: The following actions can be performed on the assigned test definitions when the Manual assignment option button is clicked:
Remove: Click to remove the selected test definition from the list.
Locate: Click to locate the selected test definition in the test plan tree.
This window shows all test definitions in the test plan tree that are available for assignment to the selected execution definition. Use the arrows to assign the test definitions to the execution definition. For information about inserting multiple test
definitions from Test Manager's Test Plan unit to the execution definition, see Assign Test Definitions from Grid View to Execution Definitions. Related Concepts Test Definition Execution Related Procedures Assigning Test Definitions to Execution Definitions Working with Execution Definitions Executing Test Definitions Assign Test Definitions from Grid View to Execution Definitions
Test Definition (Setup Test Definition): Name of the configured setup test definition.
Edit (Setup Test Definition): Opens the Edit Setup Test Definition dialog box where you can select a setup test definition. A test definition cannot be simultaneously assigned to the same execution definition as both a setup test definition and a regular or cleanup test definition. Assigned test definitions can come from any test container within your project. It is therefore possible to assign test definitions whose associated products and source control profiles vary from those of their host execution definitions.
Test Definition (Cleanup Test Definition): Name of the configured cleanup test definition.
Edit (Cleanup Test Definition): Opens the Edit Cleanup Test Definition dialog box where you can select a cleanup test definition. A test definition cannot be simultaneously assigned to the same execution definition as both a setup test definition and a regular or cleanup test definition. Assigned test definitions can come from any test container within your project. It is therefore possible to assign test definitions whose associated products and source control profiles vary from those of their host execution definitions.
Related Concepts Setup and Cleanup Test Definitions Test Definition Execution Related Procedures Configuring Setup and Cleanup Executions
None: Click this option button if no schedule is to be defined for the execution definition.
Global: Click this option button to select a pre-defined schedule from the list box for the execution definition. Selecting a global schedule includes the schedule exclusions and definite runs that are defined in the global schedule. See the SilkCentral Administration Module Help for information on defining global schedules. Selecting a global schedule displays the schedule details below the Custom option button.
Custom: Click this option button to define a custom schedule for the execution definition. Click Edit to edit the custom schedule in the fields below.
Schedule details area: The bottom part of this page displays the schedule details of the selected global schedule or the custom schedule. If a custom schedule is selected, the fields are editable.
From: Specify when the execution schedule is to begin (Month, Day, Year, Hour, Minute). Click next to the specified date to access the calendar tool.
Interval: Specify the interval at which the execution's tests are to be executed (Day, Hour, Minute).
Adjust schedule to daylight savings: Check this check box to have your schedule automatically adjust to daylight savings time. Daylight adjustment only works for intervals that are multiples of two hours, to avoid duplicate runs when time is set back one hour.
Run: In the Run portion of the GUI, specify when the execution is to end:
Forever: Click this option button to specify that the execution is to have no end.
Time(s): Click this option button and select a number from the list box to define a specific number of executions.
Until: Click this option button to pick a specific time at which test executions are to end. Click next to the specified date to access the calendar tool.
Exclusion: A schedule exclusion is a regularly occurring time period during which executions should be suspended (for example, weekly planned system downtime or weekends). You can add as many schedule exclusions as are required. To define an exclusion, click the Add Exclusion button. Place check marks next to the days for which the exclusion should be in effect. Using the From list boxes, select the hour and minute when the exclusion should begin. Using the To list boxes, select the hour and minute when the exclusion should end. Click OK to save your changes, or click Cancel to abort.
Definite Runs: A definite run is an execution that you schedule to run independently of the configured schedule. You can add as many definite runs as are required. To add a definite run, click Add Definite Run. Click next to the specified Run at date to access the calendar tool and specify when the definite run is to take place. Click OK to save the definite run, or click Cancel to abort.
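The interplay of a start time, an interval, and exclusions can be sketched as follows. This is a minimal illustration of the scheduling concept, not Test Manager's scheduler: the function name and the exclusion tuple format (weekday, from-hour, to-hour) are hypothetical.

```python
from datetime import datetime, timedelta

def next_runs(start, interval, count, exclusions=()):
    """Return the first `count` scheduled run times, skipping any slot
    that falls inside an exclusion. An exclusion is (weekday, from_hour,
    to_hour), with Monday = 0; e.g. (5, 0, 24) suspends all of Saturday."""
    runs, t = [], start
    while len(runs) < count:
        excluded = any(t.weekday() == day and frm <= t.hour < to
                       for day, frm, to in exclusions)
        if not excluded:
            runs.append(t)
        t += interval
    return runs

# Daily runs starting Friday, June 5, 2009, with weekends excluded:
# Saturday and Sunday slots are skipped, so runs land on Fri, Mon, Tue.
start = datetime(2009, 6, 5, 12, 0)
runs = next_runs(start, timedelta(days=1), 3,
                 exclusions=[(5, 0, 24), (6, 0, 24)])
print([r.strftime("%a") for r in runs])  # → ['Fri', 'Mon', 'Tue']
```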
Warning: If test definitions assigned to an execution definition schedule are not executed, this might be caused by too many running tests. In such a case, test definitions that are already included in a schedule are not executed when triggered manually or by a schedule. Click the Application Server Log tab in Administration Reports to view the application server log file. If there is a warning in the log file that states that the schedule interval might be too short, increase the schedule interval. Related Concepts Execution Definition Schedules Related Procedures Creating a Custom Schedule for an Execution Definition Specifying Global Schedules for Execution Definitions Defining Execution Definition Schedules
Keywords: Lists the keywords that have been assigned to this execution definition.
Automated execution definitions: Keywords are used to automatically identify an appropriate execution server for each test execution.
Manual execution definitions: Keywords are used by the manual tester to reflect the test environment.
Edit: Click to edit this execution definition's keywords.
Matching execution servers: Lists the active execution servers whose keyword lists match the keywords list of this execution definition. All keywords in the keywords list of the execution definition must be included in the keyword list of the execution server. Click the name of an execution server in the list to access the execution server list in Administration Locations.
Capturing Options: The following VMware LiveLink capturing options are available for VMware Lab Manager configurations. Note: Only VMware Lab Manager configurations are captured. LiveLink URLs are attached to execution definition results (as links on the Messages tab and as separate HTML files that contain the LiveLinks).
Never: Do not capture configurations.
Immediately on error: Once a failed test definition is completed, no further test definitions are executed and the configuration is captured.
After completing all test definitions: Upon failure conditions, continue test execution and capture the configuration after executing all tests of the execution definition.
Always: Capture the configuration with each run of the test execution.
Manual Testers: Lists all manual testers who have been assigned to this execution definition or folder. Click Edit to edit the list of manual testers.
SilkTest AUT Hostname: Lists all SilkTest AUT hosts that have been defined for this execution definition. Click Edit to edit the list of SilkTest AUT hosts.
Code Analysis Settings: Details code-analysis settings that have been defined for this execution definition. Click the Inactive link to enable code analysis for this execution definition.
For virtual execution on VMware Lab Manager configurations, the internal IPs of the affected machines within the configuration must be configured here.
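The keyword-matching rule for execution servers is a subset check: a server matches when its keyword list includes every keyword of the execution definition. The sketch below illustrates this; the function name, server names, and keywords are hypothetical examples, not Test Manager internals.

```python
def matching_servers(definition_keywords, servers):
    """Return the names of execution servers whose keyword lists
    cover all keywords of the execution definition (subset match)."""
    required = set(definition_keywords)
    return [name for name, keywords in servers.items()
            if required <= set(keywords)]

# Only exec01 carries all three required keywords, so only it matches.
servers = {
    "exec01": {"Win2003", "SilkTest", "SP2"},
    "exec02": {"Win2003", "SilkTest"},
}
print(matching_servers({"Win2003", "SilkTest", "SP2"}, servers))  # → ['exec01']
```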
Related Concepts Execution Definitions VMware Lab Manager Virtual Configurations Test Definition Execution Related Procedures Assigning Keywords to Execution Definitions Analyzing Code Coverage Configuring Deployment Environments
Name (Master Execution Definitions): Name of the master execution definition that the selected execution definition is dependent upon. The read-only Master Execution Definitions portion of the tab includes the Name of all master executions of the selected execution definition, along with the specific Condition of each master execution definition that triggers execution of the selected execution definition.
Condition (Master Execution Definitions): Condition of the master execution definition that must be met for the selected execution definition to be triggered.
Name (Dependent Execution Definitions): Name of the dependent execution definition that the selected execution definition serves as the master of. The Dependent Execution Definitions portion of the tab includes the Name of all execution definitions that are dependent on the selected execution definition, the specific Condition of the selected execution definition that triggers execution of each dependent execution definition, and the execution server where each dependent execution definition is to be executed.
Condition (Dependent Execution Definitions): Condition of the selected execution definition that must be met for the dependent execution definition to be triggered.
Execution Server / Manual Tester: Execution server where the dependent execution definition is to be run (or, in the case of a manual test execution, the manual tester who is to perform the manual test).
Actions: Actions that can be performed on the selected dependency (Edit or Delete).
Related Concepts Execution Dependency Configuration Execution Definitions Related Procedures Configuring Execution Dependencies
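The master/dependent triggering described above can be sketched as a condition match when the master run finishes. This is an illustrative sketch only: the function name, the condition labels, and the execution definition names are hypothetical, not Test Manager's actual API or condition set.

```python
def triggered_dependents(master_status, dependencies):
    """dependencies is a list of (condition, dependent_name) pairs
    configured for one master execution definition. A dependent is
    triggered when the master run finishes with a status matching
    its condition ('Any' matches every status)."""
    return [name for condition, name in dependencies
            if condition == "Any" or condition == master_status]

deps = [("Passed", "Smoke follow-up"),
        ("Failed", "Diagnostics"),
        ("Any", "Cleanup")]

# A failed master run triggers the Failed-conditioned and Any-conditioned
# dependents, but not the Passed-conditioned one.
print(triggered_dependents("Failed", deps))  # → ['Diagnostics', 'Cleanup']
```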
Check to receive a notification email each time an execution run finishes successfully.
Execution definition runs finishing with not passed test definitions: Check to receive a notification email each time an execution finishes with status not executed or failed.
Execution definition runs finishing with changed number of not passed test definitions: Check to receive a notification email each time the number of failed or not executed tests changes in comparison to the previous run, when an execution finishes.
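The three notification options above can be sketched as decisions made after each run, based on the current and previous counts of not passed test definitions. The function name, setting keys, and message strings are hypothetical illustrations, not Test Manager's notification implementation.

```python
def notifications(prev_not_passed, curr_not_passed, settings):
    """Decide which notification emails to send after a run finishes,
    according to the three check boxes described above."""
    sent = []
    if settings.get("on_success") and curr_not_passed == 0:
        sent.append("finished successfully")
    if settings.get("on_not_passed") and curr_not_passed > 0:
        sent.append("finished with not passed test definitions")
    if settings.get("on_changed") and curr_not_passed != prev_not_passed:
        sent.append("number of not passed test definitions changed")
    return sent

# With the second and third options checked, a run whose not passed count
# rose from 2 to 3 triggers both of those notifications.
print(notifications(prev_not_passed=2, curr_not_passed=3,
                    settings={"on_not_passed": True, "on_changed": True}))
```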
Actions: Actions that you can perform on the execution definition.
Delete Run Results: Click to delete the results of this run. When you delete the results for selected runs, Test Manager removes the runs from the Runs page. The runs are grayed out until the background process completes the deletion. Alternatively, press the DELETE key on your keyboard to delete the test run results.
View Manual Test Results: Click to view the Current Run page in read-only mode.
Run ID: Identifier of the execution definition run. Click to access the results of the run.
Status: Status summary of the run. A bar lists the number of passed, failed, and not executed test definitions. The run status of each assigned test definition is shown in the second section.
Keywords: Keywords assigned to the execution definition.
Executed By: Name of the execution server on which the run was executed. For manual test definitions the name of the person who executed the run is listed.
Errors: Number of errors that occurred in the run.
Warnings: Number of warnings that occurred in the run.
Product: The application under test.
Version: Version of the product. This information can be set in Administration Configuration Products.
Build: Build number of the product version. This information can be set in Administration Configuration Products.
Start Time: Time the run started.
Duration: Duration of the test run in h:mm:ss.
Start Type: Shows how the test run was started: manually, through a Web Service, or from a schedule.
Starter Name: Name of the schedule, tester, or Web Service user.
Start Scope: The scope specified in the Run dialog box.
The test definition runs section lists the test definition runs for the selected execution definition run. The section is paged, with fifty runs shown per page. Use the arrow buttons to navigate through the pages. The following items are shown for each run:
Item Description
Actions
If the test definition to which the run belongs is of a test-definition type that generates result files, click the icons to view or download the result files.
Create a new issue for this test definition: Click to open the New Issue dialog box and create a new issue for the test definition.
Run ID: Identifier of the test definition run. Click to open the Test Definition Run Results dialog box.
ID: Identifier of the test definition. This column is hidden by default.
Test Definition: Name of the test definition. Click to access the test definition in the Test Plan unit. The icon corresponds to the test type.
Start Time: Date and time the run started.
Status: Status summary of the run. For a single test definition a single status is shown. For a test package or suite node, a bar lists the number of passed, failed, and not executed test definitions.
Executed By: Name of the execution server on which the run was executed. For manual test definitions the name of the person who executed the run is listed.
Issues Found: Number of issues that are assigned to the test definition run. When no issues are assigned to the test definition run, the column is empty. Click the link to access the issue in the Issues page of the Test Plan unit.
Errors: Number of errors that occurred in the run.
Warnings: Number of warnings that occurred in the run.
Related Concepts Test Definition Execution Execution Definition Run Results Dialog Execution Definition Run Comparison Reports Related Procedures Analyzing Test Runs Changing the Status of a Test Execution Run Deleting Individual Test Run Results Deleting the Results of an Execution Definition Related Reference Activities Page Test Definition Run Results Dialog
Reload: Click to reload the Current Run page.
Synchronize Run: Click to update the tasks shown in the Current Run page. The following items are updated:
Assigned test definitions: Only when you start the run with scope Run all Tests and click Synchronize Run, test definitions that are newly assigned to the execution definition are shown, and test definitions that are no longer assigned to the execution definition are removed, if the test definitions are not already started.
Test properties: The name, the description, the attachments, and the other test properties of the assigned test definitions.
Test step properties: The name, the description, the attachments, and the other test step properties of the steps in the assigned test definitions.
Finish Run: Click this button to finish the test execution and to open the Finish Run dialog box. From the Build list box, choose the appropriate build. The build of the execution definition on which the manual test was started is preselected in the list. If there are test definitions in the execution definition run with status Not Executed, you can choose an action to perform from the Status list box. The following elements are available in the dialog box: one action removes all test definitions with status Not Executed from the execution definition run; another sets all test definitions with status Not Executed to the selected status. These actions are only available if the run includes tests with status Not Executed.
Select Build: Select the build number for the execution definition run from the list box.
The Execution Definition Run Details view displays the following information on the execution definition run, which has the active manual test run assigned:
The name of the execution definition.
The keywords assigned to the execution definition run.
The version of the execution definition's product.
The build of the execution definition's product.
The run type of the execution definition run.
The start type of the execution definition run.
The name of the tester, schedule, or Web Service user that started the execution definition run.
The start scope of the execution definition run, specified in the Run dialog box.
The start time of the execution definition run.
Note: You can show or hide the Execution Definition Run Details view by clicking the arrows in the top-right corner. The view is collapsed by default.

The Assigned Test Definitions view provides the following information for the manual test definition run:
Actions: Actions that can be performed during the run. Some of these actions are not available for data-driven tests.
Add Result File: Add a result file to the test definition.
New Issue: Create a new issue for the test definition.
Edit Test Definition: Edit the test definition in the Edit Test Definition dialog box. When you close the dialog box, Test Manager automatically synchronizes the test run.
#: Order of the test definition in the execution definition run.
ID: Identifier of the test definition. This column is hidden by default.
Test Definition: Name of the test definition. Click the name to view the test definition, or to perform an action on the test definition.
Status: Current status of the test. Click the status to change it.
Executed By: Name of the user who started the test run.

The Test Steps view provides the following information for each test step:
Actions:
Edit: Edit the test step in the Edit Step dialog box. When you close the dialog box, Test Manager automatically synchronizes the test run.
#: Order of the step in the test.
Step Name: Name of the step. Click to access the step in Test Plan > Steps.
Status: Execution status of the step. Click the status to change it.
Result: Result of the step. Click the text box to edit the result.

Tip: You can hide the Test Steps view by clicking the arrows in the top-right corner.

The Test Definition Details view displays the following information for the active manual test definition:
The name of the test definition.
The number of attachments attached to the test definition. Click the link to access the attachments.
The number of issues assigned to the test definition. Click the link to access the issues.
Result files generated for the test definition. Click the links to access the result files.
The description of the test definition.
The Step Details view displays the following information for the selected step:
The name of the step.
A description of the action the step performs.
The expected result of the step.
The result of the step, when the test run is finished.
The attachments to the step.
Tip: You can hide the Step Details view by clicking on the arrows in the top-right corner. If other execution definition runs are started while the Current Run page is open, a note displays, stating that newer runs are available. You can see information on those runs in the Activities page. For automated tests, the Current Run page shows the progress of the execution. Related Concepts Manual Test Definitions Test Definition Execution Calculating the Test Definition Status Related Procedures Executing Manual Tests Executing Manual Tests in the Current Run Page Related Reference Execution Unit Interface Execution Notifications Page
Run Dialog
The Run dialog box enables you to specify which test definitions you want to execute, based on filter criteria, and which product build the test should be run against. To open the Run dialog box, select an execution definition or an execution folder and click Run on the toolbar.
Item Description
Select this option to execute all test definitions.
Select this option to execute only test definitions that meet certain filter criteria, for example test definitions with status Failed, or test definitions that have not been executed since before a specified build number.
Test definitions that have had issues fixed since their last execution: Select this option to execute only those test definitions that have had issues advanced to the Fixed state since the test definition's last execution.
Set build for execution definition: Select a past build from the Set build for execution definition list box to run the test against a specific past build. This field defaults to the current build. This option is not available if the execution definition is configured to read the build number from a build information file. If an execution folder contains execution definitions with different product versions assigned to each, the build cannot be selected for the execution of the execution folder.
Run Type: Choose Run as Specified to run all selected tests with their own test type, or choose Run automated tests manually to re-run all selected tests manually.
Go to Activities page: Check this check box to advance to the Activities page after you define test definitions for execution.

Related Concepts Test Definition Execution Related Procedures Using the Manual Testing Client Executing Test Definitions Executing Individual Tests Executing Manual Tests Executing Manual Tests with the Manual Testing Client Executing Manual Tests in the Current Run Page
Details
Use the text formatting toolbar to format the Test Definition Description, Step Description, and Expected Result descriptions. Click Parameters to insert parameters into your descriptions. Use the navigation toolbar to manage your manual test execution and to navigate between test definitions. The following buttons are included in the Execute Test toolbars:

Edit: Edit the properties of the selected test definition.
Go To Issues: View the issues of the selected manual test definition in Test Manager, or assign new issues to the test definition.
Internal Issue: Add an internal issue to the selected test definition.
Next Test: Advance to the next test step in the manual test execution.
Previous Test: Return to the previous test step in the manual test execution.
Finish Run: Close the Execute Test dialog box when you have completed all test steps in the active manual test definition.
Add Test Step: Add a new test step to the end of the Test Steps list.
Insert Test Step: Insert a new test step above the selected test step in the Test Steps list.
Duplicate Test Step: Create a copy of the selected test step in the Test Steps list.
Delete Test Step: Delete the selected test step from the Test Steps list.
Move Test Step Up: Move the selected test step up one position in the Test Steps list.
Move Test Step Down: Move the selected test step down one position in the Test Steps list.
Bold: Apply bold formatting to the selected text.
Italics: Apply italic formatting to the selected text.
Underline: Apply underline formatting to the selected text.
Align Left: Align the selected text to the left side.
Align Center: Align the selected text to the center.
Align Right: Align the selected text to the right side.
Justify: Apply justified alignment to the selected text.
Bulleted List: Convert the selected text to a bulleted list.
Indent Left: Apply a left-side indent to the selected text.
Indent Right: Apply a right-side indent to the selected text.
Undo Change: Undo the last action you performed in a text description text box.
Redo Change: Redo the last action you performed in a text description text box.
Font: Apply a different font type to the selected text.
Font Size: Apply a different font size to the selected text.
Format: Apply a different pre-defined formatting style to the selected text, for example Heading 1 or Heading 2.
Parameters: Insert preconfigured Test Manager custom step properties, which are also called project parameters, into text descriptions. In normal mode, the dialog box displays the parsed values of the resolved parameters. In Edit mode, the dialog box displays the actual parameters.
Related Concepts Manual Testing Client Test Definition Parameters Test Definitions in the Manual Testing Client Related Procedures Using the Manual Testing Client Editing Test Definitions Within the Manual Testing Client Adding an Internal Issue with the Manual Testing Client
Run ID <assigned test definition> <execution definition in Last Executions> <assigned test
Description
Details
Details: Shows the details of the test definition run, including its Duration, Execution Path, the Execution Definition Run ID of the execution definition run that included the test definition run, and any Warnings/Errors. This tab also allows you to change the status of the test definition run, which is useful if you need to manually overrule the status of a test run. When a manual status change is performed, the details of the change are reflected in this tab's Status, Status Changed On, Status Changed By, Previous Status, and Status Change Comment fields. Uncheck the Hide Passed check box below the Assigned Test Definitions list in the Execution Definition Run Results dialog box to show all test definitions. By default, only test definitions that have not passed are shown; this enhances performance, because only a part of the test definitions has to be displayed, and the information presented is of more use to the viewer. All parent nodes are displayed with full status information.

Specific: Only displayed for SilkTest, SilkPerformer, and manual test definitions. This tab includes details that are specific to the selected test definition type. For example, when a SilkTest test definition is selected, this view includes the selected test case, test data, and any warnings that were displayed during the test run.

Files: Lists all files that were generated by this test run, along with file sizes. The names of SilkTest .rex files act as download links; once downloaded, these files can be viewed directly in a text editor. The upper table lists files that are associated with the test definition, such as result files or manually uploaded files for manual test definitions. The lower table lists files that are associated with the execution definition, for example execution log files or code analysis results. This tab also contains a Download All Files button, which downloads all result files generated by the test definition run as a zipped package.

Messages: Lists all messages that were generated by this test run, along with the severity of the messages.
Messages that are associated with an execution definition as a whole, and not with one of the individual test definitions, can be viewed in the Projects unit (Activities tab > Messages tab).

Success Conditions: Only displayed for automated test definitions. This tab shows all the success conditions that were defined for the test during the test planning process (Test Plan unit, Properties tab) and the result values from the execution run. Success conditions are used to determine whether a test succeeded or failed.

Data Driven: Only displayed for data-driven test definitions that use the option of having a single test definition for all data rows of the data set. This tab lists the status of each instance (data row) run of the test definition. Clicking an instance brings up another instance of the Test Definition Run Results dialog box with the run details of the selected instance.

Attributes: Any attributes that have been configured for the test definition.
Parameters: Any parameters that have been configured for the test definition.
The following table lists the UI elements that are used to step through the test definition results of an execution run. These elements are only visible when accessing the Test Definition Run Results dialog from an execution definition.
Item Description
Skip Passed: Used to determine which test definition run results are displayed when browsing with the Previous Result and Next Result buttons. Checking this option displays only test definitions with a status other than Passed.
< Previous Result: Jumps to the result details of the previous test definition in the selected execution definition run.
Next Result >: Jumps to the result details of the next test definition in the selected execution definition run.

Related Concepts Execution Dependency Configuration Related Procedures Viewing Test Execution Details Configuring Execution Dependencies Related Reference Activities Page Execution Runs Tab Test Plan Runs tab Execution Definition Run Results Dialog
Name: Product name
Statements: Total statements
Packages (histogram bar view): Total percentage of packages that are covered, number of covered packages (in green), and number of uncovered packages (in red)
Classes (histogram bar view): Total percentage of classes that are covered, number of covered classes (in green), and number of uncovered classes (in red)
Methods (histogram bar view): Total percentage of methods that are covered, number of covered methods (in green), and number of uncovered methods (in red)

Package level view displays a list of covered and not-covered classes for specific products and product builds. By clicking a class name in Package view you can drill down to view code-coverage information for the methods that are included in that class. The following attributes are displayed for selected packages in Product view, across multiple rows:
Item Description
Name: Package name
Statements: Total statements
Classes (histogram bar view): Total percentage of classes that are covered, number of covered classes (in green), and number of uncovered classes (in red)
Methods (histogram bar view): Total percentage of methods that are covered, number of covered methods (in green), and number of uncovered methods (in red)

Class level view displays a list of covered and not-covered methods for specific products and product builds. The following attributes are displayed for selected methods in Class view, across multiple rows:
Item Description
Covered: Covered status of the method. True indicates that the method is covered; False indicates that the method is not covered.
Note: When the Details tab includes more elements than can be displayed at once without impacting response time, elements are displayed in increments. Page number links at the bottom of the tab allow you to browse through the elements included on the tab one page at a time. To display all elements as a single list, select the [All] link. Related Concepts Code Coverage Analysis Related Procedures Viewing Code-Coverage Information for Packages Analyzing Code Coverage Related Reference Execution Deployment tab
Product for which code analysis information is required.
Version of the product for which code analysis information is required.
Enter criteria for filtering the packages. For example, entering the string published displays only packages that contain the string published in their names.
Packages: Product packages that are to be analyzed.
Classes: Classes from the package that are to be analyzed.
Add: Click this button to add the classes for code-coverage analysis.

Related Concepts Code Coverage Analysis Related Procedures Viewing Code-Coverage Information for Packages Analyzing Code Coverage Related Reference Execution Deployment tab
Date and time when issue details were updated.
Number of issues in the selected project, database, or product that have the status Open.
Number of issues in the selected project, database, or product that have the status Fixed.
Number of issues in the selected project, database, or product that have the status Verified.
Number of issues in the selected project, database, or product that have the status Closed.
Number of issues in the selected project, database, or product that have the status Deferred.
Related Concepts Issue Management SilkCentral Issue Manager Related Procedures Viewing Issue Statistics in Document View Tracking Issues
Issues tab
The Issues tab lists issues from both internal and external databases that have been configured for the selected project.
Item Description
Calendar: Allows you to specify a time frame for which issues should be reported. Click the time-frame date link to expand the calendar tool.
Update: Updates the Issues view based on calendar changes.
Related Concepts Issue Management SilkCentral Issue Manager Related Procedures Viewing Issue Statistics in Details View Specifying a Calendar Range Tracking Issues Related Reference Calendar Tool
Calendar Tool
The calendar tool provides the following features:

Forward and backward buttons move the time frame forward or backward in time at an interval roughly equivalent to the current time frame. For example, if the current time frame encompasses about one week, clicking the forward button advances the time frame one week into the future.
Increases the time frame by 50 percent so that more test executions are included in the list.
Decreases the time frame by 50 percent so that fewer test executions are included in the list.
Moves the selected time frame backward or forward one day.
Moves the selected time frame backward or forward one week.
Moves the selected time frame backward or forward one month.
Moves the selected time frame backward or forward one quarter.
Sets the past 7 days as the selected time frame.
Sets the past 31 days as the selected time frame.
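The shift and zoom behavior described above amounts to simple interval arithmetic. The following sketch is illustrative only, not Test Manager code; in particular, the assumption that zooming keeps the midpoint of the time frame fixed is ours:

```python
from datetime import datetime

def zoom(start: datetime, end: datetime, factor: float):
    """Scale the [start, end] time frame by `factor` around its midpoint.

    factor=1.5 models the "increase by 50 percent" button and
    factor=0.5 the "decrease by 50 percent" button. Centering the
    zoom on the midpoint is an assumption made for illustration.
    """
    span = end - start
    mid = start + span / 2
    half = span * factor / 2
    return mid - half, mid + half

def shift(start: datetime, end: datetime, steps: int):
    """Move the time frame forward (steps > 0) or backward (steps < 0)
    by an interval equal to the current time frame, as the forward and
    backward buttons do."""
    delta = (end - start) * steps
    return start + delta, end + delta

# A one-week frame (June 1 to June 9), zoomed out by 50 percent:
s, e = zoom(datetime(2009, 6, 1), datetime(2009, 6, 9), 1.5)
# The same style of frame, advanced by one full time frame:
s2, e2 = shift(datetime(2009, 6, 1), datetime(2009, 6, 8), 1)
```

Under this model, a one-week frame centered on June 5 zoomed to 150 percent spans May 30 to June 11, and shifting a June 1 to June 8 frame forward yields June 8 to June 15.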
Report name: Name of the report (customizable).
Report ID: System-defined identifier of the report.
Description: A description of the report (customizable).
Created On: Date the report was created (default reports are created upon creation of and connection to a database).
Created By: User who created the report (default reports are created by the user Admin).
Changed On: Date the report was last modified.
Changed By: User who last modified the report.
Renderer: Report template that is currently assigned to the report.
Default Tab: Tab you are directed to when you select this report from one of the context-sensitive report lists.
Edit: Click to open the Edit Report dialog box.
Add Subreport: Click to add a subreport to the report.

Report Templates: The available pre-installed report templates are:
Download Excel Report Template: You receive an MS Excel file with a sheet named DATA that contains the data (for example, in CSV format). This is the only affected sheet in the template, so you can specify information in adjoining sheets (for example, diagrams).
Download BIRT Report Template: You receive the report data as a generic (empty) BIRT report template. The data source is already configured.
Download as CSV: You receive the report data as a CSV file. Depending on your local settings, you receive , or ; as the delimiter character. The date is also formatted based on user settings.
Download as XML: You receive the report data as XML. The advantage of this approach over CSV is that you retain all subreport data.

Accessing data outside of Test Manager: You can call a specific URL that offers the report data, using the following format:
http://server/servicesExchange?hid=reportData&userName=<username>&passWord=<password>&reportFilterID=<ID of the report>&type=<csv|xml>
Related Concepts Report Generation Related Procedures Creating New Reports Editing Report Properties Creating Reports Generating Reports Managing Reports
Report tab
The Report tab is used to display data as a formatted report. If you have not yet assigned a template to your report, you can select one on the Report tab. A list box provides a selection of all available report templates. In addition to the many system-installed templates, any custom report templates that have been uploaded from Administration > Reports > Report Templates are also available here (see the SilkCentral Administration Module documentation for details on setting up and uploading custom report templates). Alternatively, you can download an existing template by selecting the Properties tab and then clicking the download link that corresponds to the report format you are working with (Excel, BIRT, CSV, or XML). From there you can customize the template to your needs.

Note: Reports are cached to improve the performance of reporting. Click the Update button to update the report data immediately.

Related Concepts Report Generation Related Procedures Creating New Reports Creating Reports Generating Reports Managing Reports
New Child Folder: Enables creation of new report folders. Click New Child Folder to define a name and optional description for a new folder. The new folder displays as a child of the currently selected node in the Reports tree.
New Child Report: Enables creation of new reports. Click New Child Report to define a new report using the Create New Report dialog box. A new report displays as a child of the currently selected node in the Reports tree.
Edit, Delete: Edit and delete reports.
Cut, Copy, Paste: Cut, copy, and paste reports within the Reports tree.
Move Up, Move Down: Move reports up or down within the Reports tree.
Recently-Viewed Reports: Lists MRU (most recently used) reports by date/time in descending order. Select a report name from the list to advance to that report. Each time a report is accessed (by clicking the Data, Chart, or Report tab), that report is added to the top of the list box. Accessing a report's Properties or Parameters tab does not result in that report being added to the Recently-Viewed Reports list. The list is empty for new users and for users who have not yet generated a report. The number of reports displayed in this list can be configured by your administrator; see the SilkCentral administrator help for details.

Related Concepts Report Generation Related Procedures Accessing MRU (Most Recently Used) Reports Managing Reports
General Reference
This section contains general reference topics provided with SilkCentral Test Manager. In This Section HTML Support for Description Text Boxes Test Manager description text-boxes support HTML-formatted text. Multi-Select Functionality for Test Plan Elements The Contents page (Test Plan Contents) and the Steps page in the Test Plan unit (Test Plan Steps) support standard Windows Explorer style multi-select functionality for child test plan elements. SQL Functions for Custom Reports This table lists all available function placeholders.
Requirements
Test containers
Test folders
Test definitions
Test steps (Description and Expected Results)
Execution folders
Execution definitions
Text style, font, and font size settings
Bold
Italics
Underline
Text alignment (left, center, right, and justified)
Numbered list
Bullet list
Indentation (decrease and increase)
Font color
Highlight color
Insert/Edit HTML link, Remove HTML link
Insert/Edit Image
Edit HTML Source
Clear Formatting
Related Procedures Creating New Reports Creating Requirements Attaching a File to a Requirement Adding Execution Definitions
The Contents page (Test Plan > Contents) and the Steps page (Test Plan > Steps) support the following keyboard shortcuts:

Up: Move selection up
Shift+Up: Extend selection up
Right: Deselect
Pos1: Select first item
End: Select last item
Shift+Pos1: Select up to first item
Shift+End: Select down to last item
Ctrl+A: Select All
Ctrl+X: Cut
Ctrl+C: Copy
Ctrl+V: Paste
Ctrl+N: New (Steps tab only)
Ins: Insert
Del: Delete
F2: Edit
The following mouse and keyboard combination functions are also available. Following these functions, actions like cut, copy, or paste can be performed on the selected nodes:
CLICK: Select a row and remember it as the current row. CTRL+CLICK: Toggle the selection status of the clicked row and remember it as the current row. SHIFT+CLICK: Select the span from the currently-selected row to a newly selected row. CTRL+SHIFT+CLICK: When a row is already selected, adds span from current row to the clicked row to the
selection. If current row is not selected, this function removes the span from the current row to the clicked row from the selection and selects the clicked row.
ALT+CLICK: When a manual test step is clicked, this opens the Edit dialog for the step (note that the displayed
selection in the background does not change until the dialog is closed). When pasting test steps, the steps are inserted into the list at the first selected row. When no steps are selected (CTRL+Click the last selected row to do this), steps are pasted to the end of the list.
Related Concepts Test Plan Tree Test Plan Management Related Procedures Copying, Pasting, and Deleting Test Plan Elements Managing Test Plans Related Reference Test Plan Contents Tab Test Plan Steps Page
$TODAY: Gives the current system date (on the database server). You can also write $TODAY-1 (for yesterday) or $TODAY-7 (for a week ago).
Returns the date (does not include the time).
Converts the given string to a database date.
$DAYS: Calculates the difference in days between the two given parameters. The two parameters can be a column within the table/view or $TODAY. For example, ${$DAYS[CreatedAt;$TODAY]} > 7 returns the rows created within the last week.
Returns the week number of the given parameter, which can be $TODAY or a column.
Returns the month of the year as a number of the given parameter, which can be $TODAY or a column.
Returns the year as a number of the given parameter, which can be $TODAY or a column.
The ID of the currently logged-in user.
The name of the currently logged-in user.
$PROJECTID: The ID of the currently selected project.
$PROJECTNAME: The name of the currently selected project.
$REPORTNAME: The name of the currently selected report.
$REPORTID: The ID of the currently selected report.
Sample Custom Report
Below is the code of the pre-installed Requirement with Child Requirements report. With this report, a selected requirement is shown with its requirement ID, and full details regarding the requirement's child requirements are displayed. Although not a custom report, this report is a helpful example because it makes use of the $PROJECTID function. It also includes two parameters, reqID (requirement ID) and reqProp_Obsolete_0 (show obsolete requirements).
SELECT r.ReqID, r.ReqCreated, r.ReqName, r.TreeOrder
FROM RTM_V_Requirements r
INNER JOIN TM_ReqTreePaths rtp ON (rtp.ReqNodeID_pk_fk = r.ReqID)
WHERE rtp.ParentNodeID_pk_fk = ${reqID|22322|Requirement ID}
  AND r.ProjectID = ${$PROJECTID}
  AND r.MarkedAsObsolete = ${reqProp_Obsolete_0|0|Show obsolete Requirements}
ORDER BY r.TreeOrder ASC
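As a further illustration, the date placeholders described above can be combined in the same style of query. This is a hypothetical sketch, not a pre-installed report: the RTM_V_Requirements view and its columns are borrowed from the sample above, while the seven-day window and the ${$TODAY-7} spelling of the offset form are assumptions based on the $TODAY-7 notation mentioned earlier.

```sql
-- Hypothetical custom report: requirements of the currently selected
-- project created within the last week. View and column names are taken
-- from the pre-installed sample; the ${$TODAY-7} offset spelling is an
-- assumption based on the $TODAY-7 notation described above.
SELECT r.ReqID, r.ReqName, r.ReqCreated
FROM RTM_V_Requirements r
WHERE r.ProjectID = ${$PROJECTID}
  AND r.ReqCreated >= ${$TODAY-7}
ORDER BY r.ReqCreated DESC
```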
Related Concepts Report Generation Related Procedures Creating New Reports Creating Reports Generating Reports Managing Reports
APIs
Refer to the Test Manager API Help for full details regarding Test Manager's APIs. Refer to the Test Manager API Specification for full details regarding Test Manager's interfaces.
Database Schemas
Refer to the Test Manager Database Model for full details regarding Test Manager's database schemas.
Index
activities page columns, displaying and hiding, 575 columns, reordering, 581 columns, resizing, 582 default settings, 583 filtering test runs, 577 filters, removing, 580 sorting test runs, 584 test runs, grouping, 579 agent computers, 37 all related issues reports, requirements, 160 APIs, 738 application server, 37 architecture, 36 overview, 36 attachments test plans, 246 432 433 deleting, 431 editing descriptions, 434 viewing, 435 attributes creating custom, 211 253 deleting custom, 254 editing custom, 255 automated tests converting manual tests, 100 executing, 234 531 average page time reports, performance trend, 178 average transaction busy time reports, performance trend, 179 Borland software quality, 34 browser settings, 41 builds information files, 570 calendar range, 558 CaliberRM baseline handling, 82 test definitions, 83 change notification, 57 disabling, 250 enabling, 214 251 changes viewing recent, 463 chart server, 37 charts displaying, 207 621 740
printing, 625 removing, 626 code analysis overview, 189 enabling, 190 execution definitions, 634 latest builds and versions, 192 packages, viewing, 636 reports, 208 635 results compilation, 193 code coverage trend reports, 175 code-change impact reports, 186 Concurrent Version System CVS, 63 configuring SilkTest plan test properties, 393 .Net Explorer test properties, 394 JUnit test properties, 395 manual test properties, 397 NUnit test properties, 398 SilkPerformer test properties, 399 SilkTest test properties, 400 Windows scripting test properties, 401 coverage modes, 69 378 custom measure reports, performance trend, 181 custom reports SQL functions, 151 736 data sources data-driven tests, 60 configuring Excel or CSV, 263 configuring JDBC, 261 deleting, 265 downloading Excel file from, 266 synchronizing, 267 uploading Excel files to, 268 data-driven test types, 91 data-driven tests downloading CSV data, 439 properties, 440 database server, 37 emails change notification, 75 execution dependencies, 119 execution definition run comparison reports, 168 execution definition run errors
reports, 172 execution definitions, 120 adding, 224 533 assigning test definitions, filter, 227 476 assigning test definitions, grid view, 226 473 assigning test definitions, manually, 225 475 copying, 534 data-driven, 547 deleting, 535 dependencies, deleting, 488 dependencies, editing, 489 dependent, adding, 230 486 details, viewing, 235 469 dynamic hardware provisioning, 120 editing, 536 locating test definitions, 471 removing test definition assignments, 472 results, deleting, 468 schedules, 124 228 493 497 schedules, global, 496 test runs, deleting, 467 testers, removing, 479 tree view, 545 updating, 459 upgrading from previous versions, 121 viewing assigned, 462 execution runs status, changing, 466 execution server, 37 external ID test packages, 97 external requirements management tools integration, 78 external tools manual tests, 101 filters global, 53 applying, 628 containers, folders, 456 creating, 220 244 630 creating advanced, 221 629 creating global, 212 270 deleting, 631 deleting global, 272 editing, 632 editing global, 273 folders copying, 562 adding, 569 child folders, pasting as, 567 cutting, 563 deleting, 564 editing, 565 pasting, 566 741
sorting, 568
front-end server, 36
glossary
    Test Manager, 46
grid view
    columns, displaying and hiding, 446
    columns, reordering, 452
    columns, resizing, 453, 455
    creating execution definitions, 445, 474
    default settings, 454
    filters, removing, 451
    test definitions, filtering, 447
    test definitions, grouping, 449
    test definitions, linking to, 450
help system
    typographical conventions, 27
    welcome, 29
history
    test plans, 458
IBM Rational ClearQuest, 61
installing & licensing
    Test Manager, 30
Issue Manager
    integration, 142
    SilkCentral Issue Manager, 31
issue tracking profiles, 61
    adding Bugzilla, 289
    adding IBM Rational ClearQuest, 294
    adding SilkCentral Issue Manager, 278
    adding StarTeam, 284
    Bugzilla, 288
    deleting profiles, 276, 282, 287, 292, 297
    editing Bugzilla, 291
    editing IBM Rational ClearQuest, 296
    editing SilkCentral Issue Manager, 281
    editing StarTeam, 286
    IBM Rational ClearQuest, 293
    mapping issue states, 280, 285, 290, 295
    SilkCentral Issue Manager, 277
    StarTeam, 283
issues
    activities tab, 576
    creating, 555
    deleting, 557
    external, 554
    statistics, details view, 551
    statistics, document view, 552
    synchronizing internal and external, 559
issues per component reports, 185
JUnit tests
    editing, 420
keywords
    assigning, 232, 481
    creating, 483
    folder execution, 121
    removing, 484
    reserved, 120
    virtual execution servers, 120
last executions
    deleting, 574
licensing
    access, 39
manual execution definitions, 128
manual testing
    step properties, custom, 56
Manual Testing Client, 135
    attachments, 508, 509, 510
    code analysis, 194
    code analysis, enabling, 517, 637
    connection parameters, 503
    executing tests, 518
    execution definitions, 513
    exporting execution packages, 520
    installing, 521
    internal issues, adding, 511
    launching, 521
    package build numbers, 514
    package status, changing, 512
    screengrabs, 507
    settings, 504
    test definitions, 524
    test definitions, editing, 515
    test results, uploading, 523
    uninstalling, 521
    upload preferences, 505
    working offline, 525
manual tests
    aborting, 526
    adding data source values, 437
    adding testers, 480
    automated tests, 442
    custom step properties, creating, 215, 257
    custom step properties, deleting, 258
    custom step properties, editing, 259
    executing, 527
    reports, 165
    steps, editing, 443
manual tests, current run
    executing, 529
method coverage comparison reports, 176
Microsoft Office Import Tool, 72
Microsoft Visual SourceSafe
    MSVSS, 63
NUnit tests
    editing, 421
obsolete requirements, 349
overall page time reports, performance trend, 183
overall transaction busy time reports, performance trend, 184
project management
    build information, 145
    settings, 210, 331
project overview reports, 154
projects
    CaliberRM, 367
    selecting, 572
recent changes filters, 86
reports
    bookmarking, 150
    context-sensitive, 153
    context-sensitive, execution, 601
    context-sensitive, execution definition, 597
    context-sensitive, requirements, 598, 606
    context-sensitive, test definitions, 599
    context-sensitive, test plans, 610
    creating, 199, 587, 603, 607, 611
    creating new, 149
    customizing BIRT templates, 204, 591
    linking to queried data, 150
    most recently viewed, 622
    parameters, 202, 623
    PDF, viewing, 617
    properties, 201, 624
    saving, 615
    SQL queries, 203, 589, 605, 609, 613
    subreports, adding, 205, 619
    subreports, deleting, 620
    templates, downloading, 592
    templates, removing, 614
    templates, uploading, 616
    viewing, 206, 618
reports, document
    requirements, 159
reports, progress
    requirements, 158
    test plans, 164
reports, status
    requirements, 157
    test plans, 163
requirement history, 74
requirements
    assign test definitions, 342
    attaching files to, 219, 335
    attaching links to, 336
    attachments, 68
    attachments, deleting from, 337
    attachments, editing descriptions, 338
    attachments, viewing, 339
    CaliberRM, 361
    collapsing or expanding tree view, 377
    creating, 217, 341
    creating child, 344
    editing, 345
    finding properties, 346
    history, 354
    IBM Rational RequisitePro, 362
    integration, 58
    integration, configuring custom properties, 356
    integration, deleting custom properties, 357
    integration, deleting property mapping, 371
    integration, disabling, 372
    integration, editing custom properties, 358
    integration, editing external properties, 369
    integration, editing property mapping, 373
    integration, removing, 374
    integration, synchronizing across tools, 375
    integration, viewing external properties, 370
    marking as obsolete, 349
    properties, custom, 55
    removing test definition assignments, 350
    replacing properties, 351
    synchronizing, 79
    synchronizing based on schedules, 375
    Telelogic DOORS, 364
    test coverage status, 70
    test plans, 222, 347
    tree view, 67
    types, 218, 340
run comparison reports, 167
schedules
    definite runs, 124
    definite runs, adding, 491
    definite runs, deleting, 494
    definite runs, editing, 495
    exclusions, 124
    exclusions, adding, 492
    exclusions, deleting, 498
    exclusions, editing, 499
schemas
    database, 739
Serena Version Manager
    PVCS, 63
setup and cleanup
    test definitions, 125
setup and cleanup definitions
    execution definitions, 229, 546
shortcuts
    multi-select, 734
SilkCentral Issue Manager, 61
SilkPerformer
    Performance Explorer, 33
    projects, 32
SilkPerformer tests
    attended tests, 542
    editing, 419
    projects, downloading, 540
    projects, opening, 543
    properties, editing, 541
    results, uploading, 544
    test results, 538, 539
SilkTest
    agent under test, 141
    automated test execution, 140
    data-driven tests, 139
    logs, 137
    test definitions, 105
    test plans, 410
    time-out settings, 138
SilkTest tests
    AUT host, adding, 478
    editing, 418
source control profiles
    adding CVS, 310
    adding MKS, 327
    adding MSVSS, 314
    adding PVCS, 305
    adding StarTeam, 301
    adding SVN, 319
    adding UNC, 323
    CVS, 309
    deleting, 299, 303, 308, 312, 317, 321, 325, 330
    editing CVS, 311
    editing MSVSS, 316
    editing PVCS, 307
    editing StarTeam, 302
    editing SVN profiles, 320, 329
    editing UNC profiles, 324
    MSVSS, 313
    overview, 63
    PVCS, 304
    StarTeam, 300
    SVN, 318, 326
    UNC, 322
StarTeam, 61
status calculation
    test definitions, 121
Subversion
    SVN, 64
success conditions
    editing, 422
    test plans, 93
test definition run comparison reports, 170
test definitions, 398
    assigning attributes, 242, 386
    assigning to requirements, 343
    attributes, 54
    calculating status, 126
    creating, 237, 405
    data-driven tests, 241, 438
    deleting attributes, 387
    editing, 239, 407
    editing attributes, 388
    executing trial runs, 408
    find/replace properties, 424
    Manual Testing Client, 102
    parameters, 94
    parameters, adding, 243, 391
    parameters, clearing, 392
    parameters, creating custom, 402
    parameters, editing, 390
    requirements, 348
    sorting on assigned tab, 353
Test Manager
    logging in, 43
    logging out, 44
Test Manager 8.0 reports, 155
test packages
    creating, 240, 404
    test plans, 96
test plans
    assigning requirements to test definitions, 245, 381
    containers, adding, 413
    containers, adding links, 412
    editing elements, 416
    folders, adding, 415
    generating, 73
    locating assigned requirements, 382
    management, 88
    removing requirement assignments, 383
    set default nodes, 429
    sorting requirements, 384
    test containers, editing, 427
    test folders, modifying, 428
    tree view, 89
    tree view, expanding, 457
tour
    user interface, Manual Testing Client, 129
    user interface, Test Manager, 24
Universal Naming Convention
    UNC, 64
updates
    build information, 146
Upload Manager
    issues, 108
    using, 460
virtual configurations
    VMware Lab Manager, 118
What's New
    Test Manager, 20
Windows Script Host
    JavaScript sample, 113
    log information, 112
    parameter usage, 111
    returning success information, 111
    sample result file, 113
    storing information in result file, 112
    structure of output.xml, 112
    supported script languages, 110
    switches, 111
    test properties, 110
    test types, 110
    VBScript sample, 114
Windows Scripting Host tests
    editing, 423