
Informatica Training

DESIGNER

A transformation is a repository object that generates, modifies, or passes data. The Designer provides a set of transformations that perform specific functions. Transformations in a mapping represent the operations the Informatica Server performs on the data. Data passes into and out of transformations through ports that you link in a mapping or mapplet.

Transformation
Transformations can be active or passive. An active transformation can change the number of rows that pass through it; a passive transformation does not. Transformations can also be connected or unconnected in the data flow. An unconnected transformation is not connected to other transformations in the mapping: it is called within another transformation and returns a value to that transformation.

Joiner Transformation

The Joiner transformation joins two related heterogeneous sources. The combination of sources can vary. The Joiner transformation uses a condition that matches one or more pairs of ports between the two sources.
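For example, a join condition is entered as one or more master/detail port pairs. A minimal sketch (the port names are illustrative; by default the Designer appends a 1 to duplicate port names dragged in from the second source):

ORDER_ID = ORDER_ID1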

Joiner transformation
You can use the following combinations of sources:
- Two relational tables in separate databases
- Two flat files in potentially different file systems
- Two different ODBC sources
- A relational table and an XML source
- A relational table and a flat file source
- Two instances of the same XML source

Creating a Joiner Transformation


In the Mapping Designer, choose Transformation-Create. Select the Joiner transformation.

Drag all the desired input/output ports from the first source into the Joiner transformation.

Edit transformation
Double-click the title bar of the Joiner transformation to open the Edit Transformations dialog box.

Select the Ports tab.

Port tab

Add default values for specific ports as necessary.

Setting the condition

Select the Condition tab and set the condition. Click the Add button to add a condition.

Defining the Join Type


A join is a relational operator that combines data from multiple tables into a single result set. You define the join type on the Properties tab of the transformation. The Joiner transformation supports the following types of joins: Normal, Master Outer, Detail Outer, and Full Outer.
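As a rough SQL analogy (a sketch only: MASTER, DETAIL, and the ID join column are hypothetical names, and the Informatica Server performs the join itself rather than issuing this SQL):

-- Normal: keeps only the rows that match in both sources
SELECT * FROM MASTER M INNER JOIN DETAIL D ON M.ID = D.ID
-- Master Outer: keeps all detail rows, discards unmatched master rows
SELECT * FROM DETAIL D LEFT OUTER JOIN MASTER M ON M.ID = D.ID
-- Detail Outer: keeps all master rows, discards unmatched detail rows
SELECT * FROM MASTER M LEFT OUTER JOIN DETAIL D ON M.ID = D.ID
-- Full Outer: keeps all rows from both sources
SELECT * FROM MASTER M FULL OUTER JOIN DETAIL D ON M.ID = D.ID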

Joiner in Mapping
The Joiner transformation requires two input transformations from two separate pipelines. An input transformation is any transformation connected to the input ports of the current transformation. In the example, the Aggregator transformation and the Source Qualifier transformation are the input transformations for the Joiner transformation.

Joiner Transformation Restrictions

You cannot use a Joiner transformation in the following situations:
- Both input pipelines originate from the same Source Qualifier transformation.
- Both input pipelines originate from the same Normalizer transformation.
- Both input pipelines originate from the same Joiner transformation.
- Either input pipeline contains an Update Strategy transformation.
- You connect a Sequence Generator transformation directly before the Joiner transformation.

Configuring the Joiner transformation requires setting the master and detail source, the type of join, and the condition of the join.

The join Types In Joiner Transformation

The Joiner transformation supports the following join types: Normal (default), Master Outer, Detail Outer, and Full Outer. An example of a partial mapping might show two Joiner transformations in the same target load order group and three pipelines to process the data in the correct order.

The condition appears in the join condition row.


The Properties tab lists the following settings and defaults:

Cache Directory              $PMCacheDir
Join Type                    Normal
Null Ordering in Master      Null is highest value
Null Ordering in Detail      Null is highest value
Tracing Level                Normal
Joiner Data Cache Size       2,000,000 bytes
Joiner Index Cache Size      1,000,000 bytes

Click OK, then choose Repository-Save to save changes to the mapping.

Source Qualifier transformation

The Source Qualifier represents the rows that the Informatica Server reads when it executes a session.

When you add a relational or a flat file source definition to a mapping, you need to connect it to a Source Qualifier transformation.

Task of Source Qualifier transformation


Transformation type: Active, Connected.

You can use the Source Qualifier to perform the following tasks:
- Join data originating from the same source database.
- Filter records when the Informatica Server reads source data.
- Specify an outer join rather than the default inner join.
- Specify sorted ports.
- Select only distinct values from the source.
- Create a custom query to issue a special SELECT statement for the Informatica Server to read source data.

Default Query of source qualifier

For relational sources, the Informatica Server generates a query for each Source Qualifier when it runs a session. The default query is a SELECT statement for each source column used in the mapping. The Informatica Server reads only the columns in the Source Qualifier that are connected to another transformation.
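For example, if only three ports of a hypothetical ORDERS source were connected downstream, the generated default query might look like:

SELECT ORDERS.ORDER_ID, ORDERS.DATE_ENTERED, ORDERS.TOTAL_AMOUNT FROM ORDERS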

Source Definition Connected to a Source Qualifier Transformation

To view the default query


To view the default query: from the Properties tab, select SQL Query, then click Generate SQL.

Click Cancel to exit. Note: If you do not cancel the SQL query, the Informatica Server overrides the default query with the custom SQL query.

Example of source qualifier transformation


For example, you might want to see all the orders for the month, including order number, order amount, and customer name. The ORDERS table includes the order number and amount of each order, but not the customer name. To include the customer name, you need to join the ORDERS and CUSTOMERS tables.
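A sketch of the resulting join query (CUSTOMER_ID and the other column names are illustrative):

SELECT ORDERS.ORDER_ID, ORDERS.ORDER_AMOUNT, CUSTOMERS.CUSTOMER_NAME
FROM ORDERS, CUSTOMERS
WHERE ORDERS.CUSTOMER_ID = CUSTOMERS.CUSTOMER_ID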

Setting the properties to source qualifier


Edit Transformation Tab
Properties include:
- Filter Condition
- Select Distinct
- Tracing Level
- Sorted Ports
- Post SQL
- Pre SQL

SQL Query
You can enter a custom query in the Source Qualifier transformation. From the Properties tab, select SQL Query; the SQL Editor displays. Click Generate SQL.
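A sketch of a custom query override (hypothetical table and columns; any valid SELECT statement can be entered):

SELECT ORDERS.ORDER_ID, ORDERS.TOTAL_AMOUNT
FROM ORDERS
ORDER BY ORDERS.ORDER_ID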


Joining source data


You can use one Source Qualifier transformation to join data from multiple relational tables. These tables must be accessible from the same instance or database server. Use the Joiner transformation for heterogeneous sources and to join flat files.

User-Defined Join

Join condition
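A user-defined join condition is entered without the WHERE keyword, for example (illustrative table and column names):

ORDERS.CUSTOMER_ID = CUSTOMERS.CUSTOMER_ID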

Entering a Source Filter


You can enter a source filter to reduce the number of rows the Informatica Server queries. Do not include WHERE in the source filter.

Filter condition
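For example, a source filter that reads only large orders might be (hypothetical column; note the absence of WHERE):

ORDERS.ORDER_AMOUNT > 1000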

Sorted Ports
For information about using Sorted Ports to optimize join performance, see the performance tuning documentation.


Aggregator Transformation
The Aggregator is an active transformation.

The Aggregator transformation allows you to perform aggregate calculations, such as averages and sums. The Aggregator transformation is unlike the Expression transformation in that you can use the Aggregator transformation to perform calculations on groups, whereas the Expression transformation permits you to perform calculations on a row-by-row basis only. You can use conditional clauses to filter rows, providing more flexibility than the SQL language. The Informatica Server performs aggregate calculations as it reads, and stores the necessary group and row data in an aggregate cache.

Aggregator Transformation
Double-click the Aggregator transformation to open the Edit Transformations dialog box.

Components of the Aggregator Transformation

- Aggregate expression
- Group by port
- Sorted input
- Aggregate cache

Aggregate expression

An aggregate expression can include conditional clauses and non-aggregate functions. It can also include one aggregate function nested within another aggregate function, such as:

MAX( COUNT( ITEM ))
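A conditional clause follows the same expression syntax; for example, a sketch that sums only the values meeting a condition (SALARY is a hypothetical port):

SUM( SALARY, SALARY > 1000 )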

Aggregate Functions:
The following aggregate functions can be used within an Aggregator transformation; you can nest one aggregate function within another:
AVG, COUNT, FIRST, LAST, MAX, MEDIAN, MIN, PERCENTILE, STDDEV, SUM, VARIANCE

Conditional Clauses
You can use conditional clauses in the aggregate expression to reduce the number of rows used in the aggregation. Setting a condition on the Aggregator transformation is done as follows:

Double-click the Aggregator transformation; the Edit Transformations dialog box opens.

Edit transformation

Port tab in aggregator transformation


Click the Ports tab in the Edit Transformations dialog box.

Output port for conditional clauses


Enable the output port for the conditional clause.

Output port

Expression tab in aggregator transformation

Click the expression tab; the Expression Editor opens, where you can enter any conditional clause.

Ports in aggregator transformation

Click the group by option for each column you want the Aggregator to use in creating groups.

Group port

Creating an Aggregator Transformation


1. In the Mapping Designer, choose Transformation-Create. Select the Aggregator transformation.
2. Enter a name for the Aggregator, click Create, and then click Done.
3. Drag the desired ports to the Aggregator transformation.
4. Double-click the title bar of the transformation to open the Edit Transformations dialog box.
5. Select the Ports tab.
6. Click the group by option for each column you want the Aggregator to use in creating groups.

Creating an Aggregator Transformation

7. Click Add and enter a name and data type for the aggregate expression port. Make the port an output port by clearing Input (I).
8. Add default values for specific ports as necessary.
9. Select the Properties tab.

Properties tab

- Cache Directory
- Tracing Level
- Sorted Input
- Aggregator Data Cache Size
- Aggregator Index Cache Size

Aggregator Transformation Tips


Use sorted input to decrease the use of aggregate caches.

Limit connected input/output or output ports.

Filter before aggregating.

Rank transformation
The Rank transformation allows you to select only the top or bottom rank of data. The Rank transformation differs from the MAX and MIN transformation functions in that it selects a group of top or bottom values, not just one value. For example, you can use Rank to select the top 10 salespersons in a given territory. The Rank transformation is an active transformation.

Sample Mapping with a Rank Transformation

How to Create a Rank Transformation


1. Choose Transformation-Create. Select the Rank transformation.
2. Click OK, and then click Done.
3. Link columns from an input transformation to the Rank transformation.
4. Click the Ports tab, and then select the Rank (R) option for the port used to measure ranks.
5. To create groups for ranked rows, select Group By for the port that defines the group.

The Properties of Rank Transformation


Rank transformation properties:
- Enter a cache directory.
- Select the top or bottom rank.
- Select the input/output port that contains values used to determine the rank.
- Select the number of rows falling within a rank.
- Define groups for ranks, such as the 10 least expensive products for each manufacturer.

Setting Properties to Rank Transformation


Click the Properties tab and select whether you want the top or bottom rank.

For the Number of Ranks option, enter the number of rows you want to select for the rank. Change the following properties, if necessary: Cache Directory, Top/Bottom, Tracing Level (default Normal), Rank Data Cache Size (default is 2,000,000 bytes).

Edit Transformation tab for the rank transformation


The Edit Transformations dialog box contains the following tabs: Ports, Properties, and Metadata Extensions.

Ports tab: port names and data types.

Properties tab: Top/Bottom, Number of Ranks, Tracing Level (default Normal), Rank Data Cache Size.

The Properties of rank transformation


An example configuration:

Top/Bottom                Top
Number of Ranks           1
Tracing Level             Normal
Rank Index Cache Size     2,000,000 bytes
Rank Data Cache Size      4,000,000 bytes

Example of rank transformation

Setting the properties to rank transformation


The Rank transformation includes input or input/output ports connected to another transformation in the mapping. It also includes variable ports and a rank port.

Click the Ports tab.

Port tab

Ports in a Rank Transformation


Give the field name as sal1.


Properties in rank transformation


Click the Properties tab to set the properties of the Rank transformation.

Sequence Generator Transformation


The Sequence Generator transformation generates numeric values. You can use the Sequence Generator to create unique primary key values or cycle through a sequential range of numbers. The Sequence Generator transformation is a connected transformation. The Informatica Server generates a value each time a row enters a connected transformation, even if that value is not used. When NEXTVAL is connected to the input port of another transformation, the Informatica Server generates a sequence of numbers. When CURRVAL is connected to the input port of another transformation, the Informatica Server passes the NEXTVAL value plus the Increment By value.

Sequence Generator Transformation


You can make a Sequence Generator reusable and use it in multiple mappings. You might reuse a Sequence Generator when you perform multiple loads to a single target. For example, if you have a large input file that you separate into three sessions running in parallel, you can use a Sequence Generator to generate primary key values. If you use different Sequence Generators, the Informatica Server might accidentally generate duplicate key values. Instead, you can use a reusable Sequence Generator for all three sessions to provide a unique value for each target row.

Tasks with a Sequence Generator transformation


Create keys.

Replace missing values.

Cycle through a sequential range of numbers.

Creating a Sequence Generator Transformation

In the Mapping Designer, select Transformation-Create. Select the Sequence Generator transformation.

Edit transformation

Double-click the title bar of the transformation to open the Edit Transformations dialog box.

Properties tab
Select the Properties tab. Enter settings as necessary.

Click OK.

To generate new sequences during a session, connect the NEXTVAL port to at least one transformation in the mapping. Choose Repository-Save.


Sequence Generator Ports


The Sequence Generator provides two output ports: NEXTVAL and CURRVAL.

Use the NEXTVAL port to generate a sequence of numbers by connecting it to a transformation or target. You connect the NEXTVAL port to a downstream transformation to generate the sequence based on the Current Value and Increment By properties. Connect NEXTVAL to multiple transformations to generate unique values for each row in each transformation. For example, you might connect NEXTVAL to two target tables in a mapping to generate unique primary key values.

NEXTVAL to Two Target Tables in a Mapping

You configure the Sequence Generator transformation as follows: Current Value = 1, Increment By = 1.

When you run the workflow, the Informatica Server generates the following primary key values for the T_ORDERS_PRIMARY and T_ORDERS_FOREIGN target tables:

Output for NEXTVAL in the sequence generator

T_ORDERS_PRIMARY TABLE:    T_ORDERS_FOREIGN TABLE:
PRIMARY KEY                PRIMARY KEY
1                          2
3                          4
5                          6
7                          8
9                          10

Sequence Generator and Expression Transformation

You configure the Sequence Generator transformation as follows: Current Value = 1, Increment By = 1.

Output

When you run the workflow, the Informatica Server generates the following primary key values for the T_ORDERS_PRIMARY and T_ORDERS_FOREIGN target tables:

T_ORDERS_PRIMARY TABLE:    T_ORDERS_FOREIGN TABLE:
PRIMARY KEY                PRIMARY KEY
1                          1
2                          2
3                          3
4                          4
5                          5

CURRVAL

CURRVAL is the NEXTVAL value plus the Increment By value.

You typically only connect the CURRVAL port when the NEXTVAL port is already connected to a downstream transformation.

When a row enters the transformation connected to the CURRVAL port, the Informatica Server passes the last-created NEXTVAL value plus one.

Connecting CURRVAL and NEXTVAL Ports to a Target

You configure the Sequence Generator transformation as follows: Current Value = 1, Increment By = 1. When you run the workflow, the Informatica Server generates the following values for NEXTVAL and CURRVAL:

Output:

NEXTVAL    CURRVAL
1          2
2          3
3          4
4          5
5          6

If you connect the CURRVAL port without connecting the NEXTVAL port, the Informatica Server passes a constant value for each row.

Only the CURRVAL Port to a Target


For example, you configure the Sequence Generator transformation as follows: Current Value = 1, Increment By = 1. When you run the workflow, the Informatica Server generates the following constant values for CURRVAL:

CURRVAL
1
1
1
1
1

Sequence Generator Transformation Properties

The Sequence Generator is unique among transformations because you cannot add, edit, or delete its default ports (NEXTVAL and CURRVAL).

Properties tab

Normalizer Transformation
- It is an active and connected transformation.
- It normalizes COBOL sources and relational sources, allowing the user to organize the data as required.
- It is primarily used for COBOL sources.

Tasks performed by Normalizer


- Breaks out repeated data within a record into separate records.
- Creates a unique identifier for each new record.
- Can be used with relational sources to create multiple rows from a single row of data.
- Generates the following columns when the transformation is created: 1) Generated Key and 2) Generated Column ID.
- The Designer generates a port for each REDEFINES clause to specify the generated key, which can be used as the primary key column in the target table.
- The Designer generates a port for each OCCURS clause to specify the positional index within an OCCURS clause; it can be used as the generated column ID to create a primary-foreign key relationship.

Workflow
A workflow is a set of instructions that describes how and when to run tasks related to the extraction, transformation, and loading of data.

Architecture

The PowerCenter server reads source data from the source, applies instructions from the metadata stored in the repository, and writes transformed data to the target.

Workflow Manager
The Workflow Manager workspace includes work folders, variables/events, tasks, and links.

New connection features: connection dialogs, replace connections, register server, and assign to workflows.

A workflow is executed by two server processes: the Load Manager (LM) and the Data Transformation Manager (DTM).

Load manager
1. Locks the workflow and reads workflow properties.
2. Reads the parameter file and expands workflow variables.
3. Creates the workflow log file.
4. Runs workflow tasks.
5. Distributes sessions to worker servers.
6. Starts the DTM to run sessions.
7. Runs sessions from master servers.
8. Sends post-session email if the DTM terminates abnormally.

Data Transformation manager


1. Fetches session and mapping metadata from the repository.
2. Creates and expands session variables.
3. Creates the session log file.
4. Verifies connection object permissions.
5. Runs pre-session shell commands.
6. Runs pre-session stored procedures and SQL.
7. Creates and runs mapping, reader, writer, and transformation threads to extract, transform, and load data.
8. Runs post-session stored procedures and SQL.
9. Runs post-session shell commands.
10. Sends post-session email.

Create a Workflow


New Objects (Tasks)


Tasks are now the default units of work for building the workflow. Global tasks are reusable across workflows. Local tasks are independent and self-contained within workflows.

Global Tasks: Sessions, Commands, Email.

Local Tasks: Sessions, Commands, Email, Decision, Assignment, Timer, Control, Event Raise, Event Wait, Worklet.

Sessions (Updates)

General

Updated parameters

The first page allows options for treating conditional links attached to the object with AND/OR functionality. It also controls the option to fail the parent (container) if the task fails or does not run. Disabling a task in a workflow allows the task to be skipped instead of having to remove it.

Sessions (Updates)

Updated connection resolution

Properties
An addition to this page is the option to choose a different connection to resolve $Source and $Target used in lookups. **Limitation - you can use $DBConnectionVariable, but it will only resolve if the parameter is evaluated with the appropriate Source or Target Connection. If used when sourcing and targeting flat files, the parameter will not resolve.

Sessions (Updates)
Drop-down for common settings

Config Object
This area is where typical memory allocation, log file options, and error-handling strategies are set. The upgrade here is the concept of maintaining common settings in a config object, which is set up separately and chosen from a drop-down list.

Config Object
Setup for reusable Configuration object

Config Object
By creating a session configuration object you can cut down on repetitively assigning memory and log options. When migrating sessions you can optionally migrate configurations or maintain environment-specific settings to be reassigned.

Sessions (Updates)
Filter

Sources
This area has primarily just been subdivided into sub tabs. The main addition is the filter drop-down that allows viewing either all source object settings or an individual source object. Instead of a filelist option, the source is either direct or indirect (filelist).

Sessions (Updates)
Filter

Targets
Same subdivisions as Sources. However, the Properties sub tab is where the Target Options are defined. The Writers sub tab preserves the legacy operation of writing to a flat file while using a database-specific target (as opposed to the new flat-file target).

Source Target Connections


Connection dialog

Connections
The drop-down list allows you to select any available connection as well as specifically define the use of a Connection Variable.

Sessions (Updates)
Choice of reusable or local command

Components
The area where commands or email unique to this object can be defined. You can alternately select a reusable task to use as well.

Non-Reusable Commands
Option for local or reusable

Name of command object

Components
Regardless of whether it is reusable or non-reusable, it is necessary to name the object, since there is potential to promote it.

Non-Reusable Commands

Error Control for multiple commands/tasks

Components
The Properties tab allows for error control for commands/tasks.

Non-Reusable Commands

Command editor

Components
The Command tab allows for editing of commands (obviously).

Reusable Commands

Reusable Command Selection Dialog

Components
A non-reusable command can be promoted to reusable, or else you can select one from the available tasks.

Reusable Commands

Limitation

Components
Limitation: a non-reusable or reusable command can resolve Server Variables, but only from within the Components tab of the session. A command can NOT resolve Server Variables if used in an external Command Task. Understand this before promoting.

Sessions (Updates)
Instance filter; filter for modifiable transformations

Transformations
The transformations area preserves all the previous capabilities. The update is the addition of the main instance filter and another filter button that displays only the transformations where overrides can be performed. Also, this is where you can input the NEW pre/post SQL commands.

Sessions (Updates)
Promote to reusable

Metadata column

Metadata Extensions
Metadata extensions are available to individual objects/tasks or can be made reusable and global. These columns cannot be auto-populated with functions, but they do allow extensibility for more reporting options (e.g., Developer Name, Version, Creation Date, etc.).

Target

Target Properties
- Target database connections: specify the location of the target.
- Truncate Target Tables: truncate the tables before loading.
- Deadlock Retry: configure the server to retry when writing to targets.
- Drop and Recreate Indexes: use pre- and post-session commands to optimize the query.
- Bulk Loading: specify bulk loading when loading to DB2, Microsoft SQL Server, Oracle, or Sybase databases.

Command Tasks

Command
The command object can be created globally under the Task Developer. It can also be promoted here from within a mapping.

Command Tasks

Event Wait

Assignment

Email Tasks
Email text creation dialog; built-in variables

Email
The Email task is very similar to the Command task, since it can be either created in the Task Developer or promoted from a mapping. The properties tab allows for an expression editor for text creation utilizing the built-in variables.

Links and Conditions


Link condition

Definition
Links and their underlying conditions are what provide process control to the workflow. When an attached link condition resolves to TRUE, then the attached object may begin processing. There can be no looping, and links can only execute once per workflow. However, more complex branching and decisions can be made by combining multiple links to a single object or branching into decision-type paths. Each link has its own expression editor and can utilize upstream resolved object variables or user-defined variables for its own evaluation.
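For example, a link condition that lets the downstream object start only after an upstream session succeeds might look like this (s_load_orders is a hypothetical session name):

$s_load_orders.Status = SUCCEEDED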

Links and Conditions

Object Variables

The default set of object variables from a session can provide more information than just a status of Completed. More complex evaluation can be done for ErrorCode, StartTime, SrcSuccessRows, etc. In addition to the default object variables, user-defined variables can be created and populated via parameter files or changed in the workflow via Assignment tasks. Also, any upstream task that has completed can have its variables utilized in downstream link conditions.

Workflow Variables
Variable          Task Type       Datatype
Condition         Decision task   Integer
EndTime           All tasks       Date/time
ErrorCode         All tasks       Integer
ErrorMsg          All tasks       Nstring*
FirstErrorCode    Session task    Integer
FirstErrorMsg     Session task    Nstring*
PrevTaskStatus    All tasks       Integer
SrcFailedRows     Session task    Integer
SrcSuccessRows    Session task    Integer
StartTime         All tasks       Date/time
Status**          All tasks       Integer
TgtFailedRows     Session task    Integer
TgtSuccessRows    Session task    Integer
TotalTransErrors  Session task    Integer

** Supported Status returns: ABORTED, DISABLED, FAILED, NOTSTARTED, STARTED, STOPPED, SUCCEEDED

* Variables of type Nstring can have a maximum length of 600 characters.

Workflow Variables
Edit Variables

User-defined Variables
Variables are created at the container level, much like mappings (Workflows=Mappings, Worklets=Mapplets). Once created, values can be passed to objects within the same container for evaluation. (An Assignment task can modify/calculate variables.)

Workflow Variables

Pre-Defined Variable

User Defined Variables

User-defined Variables
A user-defined variable can assist in more complex evaluations. In the above example, an external parameter file contains the number of expected rows. This in turn is evaluated against the actual rows successfully read from an upstream session. $ signifies and is reserved for pre-defined variables. User-defined variables should maintain the $$ naming.
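A sketch of such a link condition, comparing a session's object variable against the user-defined variable (the session and variable names are illustrative):

$s_load_orders.SrcSuccessRows = $$ExpectedRows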

Link decision

Parameter File
Workflow variables:
[folder name.WF:workflow name]

Worklet variables:
[folder name.WF:workflow name.WT:worklet name]

Worklet variables in nested worklets:


[folder name.WF:workflow name.WT:worklet name.WT:worklet name...]

Session parameters, plus mapping parameters and variables:


[folder name.WF:workflow name.ST:session name] or [folder name.session name] or [session name]
Format
The format for declaring variables is slightly different going forward, but legacy naming is preserved with the foldername.session form.
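A minimal sketch of a parameter file using these headings (the folder, workflow, session, variable, and connection names are all illustrative):

[MyFolder.WF:wf_daily_load]
$$ExpectedRows=1000
[MyFolder.WF:wf_daily_load.ST:s_load_orders]
$DBConnectionSource=Oracle_DEV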

Assignment Task

Edit Variables

Usage
The Assignment task allows workflow variables to be manipulated, assigned, or calculated and stored (either for the session run or as persistent) for downstream evaluation and condition resolution. The Expression section allows for the selection of the appropriate variable to manipulate, along with the assignment through an Expression Editor. In order to use it, it is necessary to first create a variable through the Edit Variables dialog.

Event Task
Edit Events

Usage
Events are similar to variables. They are repository tags that are a little more flexible (cleaner) than dealing with indicator files (although an Event Wait can be used with indicator files). Before utilizing the new functionality, it is necessary to create these Event tags within the container in which they will be used. Events cannot be monitored across workflows or worklets (even if a worklet is part of the workflow).

Event Task
Event Raise

Event Wait

Usage
If using Event tags, then an Event Raise is used in conjunction with an Event Wait. In the above example, two branches are executed in parallel. The second session of the lower branch will remain in stasis until the upper branch completes, triggering the event. The lower branch's Event Wait task recognizes the event and allows the second session to start.

Event Raise

Usage
To configure the Event Raise task, the drop-down box allows selection of the appropriate user-defined Event tag. This creates an entry in the repository for a matching Event Wait to look for.

Event Wait

Indicator File

User Defined Event

Usage
The Event Wait allows configuration for an Event Raise (user-defined event) or a check for the existence of an indicator file.

Event Wait

Resume/Restart Support; Flat-file Cleanup

Usage
The properties section of the Event Wait task allows for further definition of behavior. If your workflow has failed/suspended after the Event Raise but before the Event Wait has resolved, then Enable Past Events is able to recognize that the Event has already happened. If working with indicator files, you have the ability to either delete the file or allow it to stay in case some downstream Event Waits are also keying on that file.

Decision Task

Usage
The Decision task allows for True/False-based branching of process ordering. The Decision task can house multiple conditions, and therefore downstream links can be evaluated simply on the Decision being True or False.

**Note: it is possible to base the decision on SUCCEEDED or FAILED of the previous task; however, if the workflow is set to suspend on error, then that branch is suspended and the decision won't trigger on a FAILED condition.
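A sketch of the pattern (dec_check_rows is a hypothetical Decision task whose condition might be $s_load_orders.SrcSuccessRows > 0; the two outgoing links then evaluate its result):

$dec_check_rows.Condition = TRUE
$dec_check_rows.Condition = FALSE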

Control Task

Usage
The Control task is utilized in a branching manner to provide a level of stoppage during the workflow. Consider if too many sessions have too many failed rows: the options allow for different levels of action, from failing at the object level to aborting the whole workflow.

Timer Task

Usage
The Timer task has two main ways to be utilized. The first way is by absolute time, that is, time evaluated by server time or a user-defined variable (that contains the date/time stamp to start).

Timer Task

Usage
The second usage is by relative time, which offers options of time calculated from when the process reached this (Timer) task, from the start of the container of this task, or from the start of the absolute top-level workflow.

WORKLET
A worklet is an object that represents a set of tasks. It can contain any task available in the Workflow Manager. You can run worklets inside a workflow; the workflow that contains the worklet is called the parent workflow. Create a worklet when you want to reuse a set of workflow logic in several workflows.

Creating a Reusable Worklet

Creating a Non-Reusable Worklet


To create a non-reusable worklet:
1. In the Workflow Designer, open a workflow.
2. Choose Tasks-Create.
3. Select Worklet for the Task type.
4. Enter a name for the worklet.
5. Click Create. The Workflow Designer creates the worklet and adds it to the workspace.
6. Click Done.

Configuring Worklet Properties


- Worklet variables. Use worklet variables to reference values and record information. You use worklet variables the same way you use workflow variables. You can assign a workflow variable to a worklet variable to override its initial value.
- Events. To use the Event-Wait and Event-Raise tasks in the worklet, you must first declare an event in the worklet properties.
- Metadata extensions. Extend the metadata stored in the repository by associating information with repository objects.

Adding Tasks in Worklets


After you create a new worklet, add tasks by opening the worklet in the Worklet Designer. A worklet must contain a Start task. The Start task represents the beginning of a worklet. When you create a worklet, the Worklet Designer automatically creates a Start task for you.

To add tasks to a non-reusable worklet:
1. Create a non-reusable worklet in the Workflow Designer workspace.
2. Right-click the worklet and choose Open Worklet. The Worklet Designer opens so you can add tasks in the worklet.
3. Add tasks in the worklet by using the Tasks toolbar or choose Tasks-Create in the Worklet Designer.
4. Connect tasks with links.

Using Worklet Variables


Worklet variables are similar to workflow variables. A worklet has the same set of pre-defined variables as any task. You can also create user-defined worklet variables. Like user-defined workflow variables, user-defined worklet variables can be persistent or non-persistent. You cannot use variables from the parent workflow in the worklet.

Workflow Monitor

Workflow Monitor Task View
