
Oracle BI Applications 7.9: Implementation for Oracle EBS


Activity Guide

D55409GC10 Edition 1.0 October 2008 D56566

Author
Jim Sarokin

Copyright 2008, Oracle. All rights reserved.

Disclaimer
This document contains proprietary information and is protected by copyright and other intellectual property laws. You may copy and print this document solely for your own use in an Oracle training course. The document may not be modified or altered in any way. Except where your use constitutes "fair use" under copyright law, you may not use, share, download, upload, copy, print, display, perform, reproduce, publish, license, post, transmit, or distribute this document in whole or in part without the express authorization of Oracle. The information contained in this document is subject to change without notice. If you find any problems in the document, please report them in writing to: Oracle University, 500 Oracle Parkway, Redwood Shores, California 94065 USA. This document is not warranted to be error-free.

Technical Contributors and Reviewers


Dan Hilldale Mitravinda Kolachalam Manmohit Saggi Phillip Scott Kasturi Shekhar Albert Walker Jr.

Editors
Raj Kumar Daniel Milne Joyce Raftery

Restricted Rights Notice
If this documentation is delivered to the United States Government or anyone using the documentation on behalf of the United States Government, the following notice is applicable:
U.S. GOVERNMENT RIGHTS
The U.S. Government's rights to use, modify, reproduce, release, perform, display, or disclose these training materials are restricted by the terms of the applicable Oracle license agreement and/or the applicable U.S. Government contract.

Trademark Notice
Oracle is a registered trademark of Oracle Corporation and/or its affiliates. Other names may be trademarks of their respective owners.

Publisher
Giri Venugopal

Contents

Practice 2-1: Matching Oracle Business Analytics Warehouse Components ... 5
Solutions 2-1: Matching the Oracle Business Analytics Warehouse Components ... 7
Practice 2-2: Locating the Oracle Business Analytics Warehouse Components ... 8
Solutions 2-2: Locating Oracle Business Analytics Warehouse Components ... 9
Practice 4-1: Configuring the Training Environment ... 11
Practice 5-1: Exploring Oracle BI ETL Metadata ... 27
Solutions 5-1: Exploring Oracle BI ETL Metadata ... 35
Practice 6-1: Working with Informatica Designer ... 39
Practice 7-1: Creating and Running an Informatica Workflow ... 47
Practice 8-1: Exploring a Prebuilt SDE Mapping ... 51
Practice 8-2: Exploring a Prebuilt SIL Mapping ... 57
Practice 9-1: Exploring the DAC ... 63
Solutions 9-1: Exploring the DAC ... 73
Practice 10-1: Configuring Common Areas and Dimensions Before Running a Full Load ... 77
Practice 10-2: Configuring General Ledger Account Hierarchies ... 80
Practice 10-3: Mapping Oracle GL Natural Accounts to Group Account Numbers ... 85
Practice 10-4: Creating a New Metric Based on a New Group Account Number ... 89
Practice 11-1: Customizing DAC Metadata ... 93
Practice 13-1: Creating a Custom SDE Mapping ... 99
Practice 13-2: Creating a Custom SIL Mapping ... 103
Practice 13-3: Adding DAC Tasks and Running Customized ETL ... 107
Practice 14-1: Adding a New Dimension in the OBAW ... 115
Practice 14-2: Creating an SDE Mapping to Load the Dimension Staging Table ... 118
Practice 14-3: Creating an SIL Mapping to Load the Dimension Table ... 122
Practice 14-4: Creating an SDE Mapping to Load the Fact Staging Table ... 125
Practice 14-5: Creating an SIL Mapping to Load the Fact Table ... 127
Practice 14-6: Adding DAC Tasks and Running Customized ETL ... 130


Lesson 2: Oracle Business Intelligence Applications Architecture Overview

Practice 2-1: Matching Oracle Business Analytics Warehouse Components


Goals: To match the core Oracle Business Analytics Warehouse components with their corresponding functions
Scenario: You have received your Oracle Business Analytics Warehouse software, and you begin by validating your knowledge of its components.
Outcome: You will have a list of the components and terms related to the software you are about to deploy.
Time: 10-15 minutes

Instructions:
1. Match the Oracle Business Analytics Warehouse component names on the left to their descriptions on the right. Write the appropriate letter of the description in the blank.


Component
1. Mapping Designer
2. Oracle Business Analytics Warehouse Database
3. Analytical System
4. Informatica Designer
5. DAC
6. Informatica Repository
7. Dimensional Schema
8. Fact Table
9. Transformations
10. Batch
11. Repository Manager
12. DW Database Server Machine
13. Administrator Workstation Machine
14. Dimension Table
15. ETL
16. Transactional System
17. Mappings

Description
a. Contains the components used to create and administer the data warehouse
b. Extract, transform, and load data
c. Program that runs ETL to load the data warehouse
d. Transform data between source and target
e. Online analytical processing
f. Relational tables, custom-built for the Oracle Business Analytics Warehouse, that store mappings, transformations, and other metadata
g. Enables you to administer the Informatica Repository
h. Enables you to create and modify the Informatica mappings, transformations, and target tables
i. Online transaction processing
j. Contains the data warehouse
k. Format for data that allows for effective querying
l. Table in a dimensional schema that stores descriptions; has a single primary key
m. Central table in a dimensional schema; the only table with multiple joins to other tables
n. Used to run, schedule, manage, and configure ETL
o. Set of instructions for retrieving data, performing computations, and loading data
p. Used to create mappings
q. Database composed of dimensional schemas that stores data warehouse data


Solutions 2-1: Matching the Oracle Business Analytics Warehouse Components


Answers:
1. Match the Oracle Business Analytics Warehouse component names on the left to their descriptions on the right. Write the appropriate letter of the description in the blank.
Component (answer letter)
1. Mapping Designer: p
2. Oracle Business Analytics Warehouse Database: q
3. Analytical System: e
4. Informatica Designer: h
5. DAC: n
6. Informatica Repository: f
7. Dimensional Schema: k
8. Fact Table: m
9. Transformations: d
10. Batch: c
11. Repository Manager: g
12. DW Database Server Machine: j
13. Administrator Workstation Machine: a
14. Dimension Table: l
15. ETL: b
16. Transactional System: i
17. Mappings: o

Description
a. Contains the components used to create and administer the data warehouse
b. Extract, transform, and load data
c. Program that runs ETL to load the data warehouse
d. Transform data between source and target
e. Online analytical processing
f. Relational tables, custom-built for the Oracle Business Analytics Warehouse, that store mappings, transformations, and other metadata
g. Enables you to administer the Informatica Repository
h. Enables you to create and modify the Informatica mappings, transformations, and target tables
i. Online transaction processing
j. Contains the data warehouse
k. Format for data that allows for effective querying
l. Table in a dimensional schema that stores descriptions; has a single primary key
m. Central table in a dimensional schema; the only table with multiple joins to other tables
n. Used to run, schedule, manage, and configure ETL
o. Set of instructions for retrieving data, performing computations, and loading data
p. Used to create mappings
q. Database composed of dimensional schemas that stores data warehouse data


Practice 2-2: Locating the Oracle Business Analytics Warehouse Components


Goals: To determine the recommended location for each core component
Scenario: You have received your Oracle Business Analytics Warehouse software, and you begin by validating your knowledge of its components.
Outcome: You will have a list of the component locations for the Oracle Business Analytics Warehouse.
Time: 10-15 minutes

Instructions:
1. Using the list provided, place a check mark in the appropriate column for the recommended locations for setup and installation of each item.
Component                                                     ETL Servers   ETL Clients   OBAW Database   ETL Repositories
DAC Client
DAC Server
DAC Repository
Oracle Business Analytics Warehouse Database
Informatica Integration Services
Informatica Repository Service
Informatica Workflow Manager
Informatica Client (Repository Manager, Designer, and so on)
Informatica Repository
Oracle Business Analytics Warehouse Tables


Solutions 2-2: Locating Oracle Business Analytics Warehouse Components


Answers:
1. Using the list provided, place a check mark in the appropriate column for the recommended locations for setup and installation of each item.
Note: The check marks in the original solution table were lost in extraction; the recommended locations below follow from the component names and the practices in this guide (for example, the DAC client is co-located with the PowerCenter client).

Component                                                     Recommended location
DAC Client                                                    ETL Clients
DAC Server                                                    ETL Servers
DAC Repository                                                ETL Repositories
Oracle Business Analytics Warehouse Database                  OBAW Database
Informatica Integration Services                              ETL Servers
Informatica Repository Service                                ETL Servers
Informatica Workflow Manager                                  ETL Clients
Informatica Client (Repository Manager, Designer, and so on)  ETL Clients
Informatica Repository                                        ETL Repositories
Oracle Business Analytics Warehouse Tables                    OBAW Database


Lesson 4: Installing Oracle BI Applications

Practice 4-1: Configuring the Training Environment


Goals: To configure the training environment before you populate and customize the Oracle Business Analytics Warehouse
Scenario: The Oracle Business Intelligence platform and Oracle BI Applications have already been installed in this training environment. This includes Informatica PowerCenter, the Data Warehouse Administration Console (DAC), two database schemas (DAC and INFA) that were created during the installation process, and one database schema (BIAPPS) that is included as part of the training environment:
DAC: contains DAC repository tables
INFA: contains Informatica repository tables
BIAPPS: contains Oracle E-Business Suite source data
In this practice you perform additional, post-installation configuration of the DAC and Informatica in your environment. Understanding these tasks is also a useful debugging aid, because most configuration issues arise from the steps performed in this practice.
Time: 40-50 minutes

Instructions:
1. To review the steps that were used to install the Oracle BI components, double-click the Installing shortcut on the desktop to view demonstrations of the Oracle BI Applications and Informatica installation processes. When you are done with the demonstrations, close the browser.
2. Modify the Informatica PowerCenter initialization file to disable validation of data code pages.
a. In Windows Explorer, navigate to C:\Informatica\PowerCenter8.1.1\client\bin.
b. Double-click powrmart.ini to open the file using Notepad.
c. Locate the following entry at the end of the file:

[Code Pages]
ValidateDataCodePages=Yes

d. Disable the validation of data code pages by changing the value to No. Validation is disabled here because the code page is identical for the source and target databases in this training environment; it is not necessary to validate the code page when data moves between compatible source and target databases:

[Code Pages]
ValidateDataCodePages=No
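If you prefer to script this edit (for example, when preparing several training machines), the change is a one-line rewrite. The following is a minimal sketch, not part of the course materials; it assumes a powrmart.ini-style file of plain Name=Value lines, and the helper name is made up:

```python
# Sketch: force ValidateDataCodePages to No in powrmart.ini-style text.
# The [Code Pages] section uses plain "Name=Value" lines, so a simple
# line-by-line rewrite is enough; no ini parser is required.

def disable_code_page_validation(ini_text: str) -> str:
    """Return ini_text with any ValidateDataCodePages line set to No."""
    out_lines = []
    for line in ini_text.splitlines():
        if line.strip().startswith("ValidateDataCodePages="):
            out_lines.append("ValidateDataCodePages=No")
        else:
            out_lines.append(line)  # all other lines pass through untouched
    return "\n".join(out_lines)

if __name__ == "__main__":
    sample = "[Code Pages]\nValidateDataCodePages=Yes"
    print(disable_code_page_validation(sample))
```

Read the file, pass its text through the helper, and write it back; the manual Notepad edit in steps a-e achieves exactly the same result.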

e. Select File > Save to save the powrmart.ini file after the modification.
f. Select File > Exit.
3. Copy source files and lookup files. You need to copy source files and lookup files from the Oracle Business Intelligence Applications installation directory to the Informatica directory on the Informatica PowerCenter Services machine. In this training environment, everything is installed on one machine, so you copy between directories.
a. Navigate to C:\OracleBI\dwrep\Informatica\SrcFiles and copy all the source files in the directory.
b. Paste all the source files into C:\Informatica\PowerCenter8.1.1\server\infa_shared\SrcFiles. Replace any existing files.
c. Copy all the lookup files from C:\OracleBI\dwrep\Informatica\LkpFiles and paste them into C:\Informatica\PowerCenter8.1.1\server\infa_shared\LkpFiles.
4. Identify the machine name. You will be using this name throughout this course.
a. On the desktop, right-click the My Computer icon and select Properties.
b. Select the Computer Name tab and make a note of the machine name, referred to hereafter as <machine name>.

c. Click OK to close the System Properties window.
5. Restore the Oracle Business Intelligence prebuilt repository. An Informatica repository file named Oracle_BI_DW_Base.rep is installed into the OracleBI\dwrep\Informatica\Repository directory during the Oracle Business Intelligence Applications installation. You use the restore option in the Informatica PowerCenter Administration Console to load this prebuilt Oracle_BI_DW_Base repository.
a. Navigate to C:\OracleBI\dwrep\Informatica\Repository.
b. Copy the file Oracle_BI_DW_Base.rep and paste it into C:\Informatica\PowerCenter8.1.1\server\infa_shared\Backup.


c. Double-click the Services icon on the desktop and verify that the service Informatica Services 8.1.1 SP4 is started. If it is not started, start it and wait a minute before continuing with the next step.
d. Select Start > Programs > Informatica PowerCenter 8.1.1 > Services > Launch Admin Console.
e. Log in as admin with password admin.
f. In the left pane, select the Oracle_BI_DW_Base repository service that was created during the installation process.
g. In the right pane, select the Properties tab.
h. Click Edit in the General properties area.
i. Set the Operating Mode to Exclusive and click OK.
j. Click Yes when prompted to restart the repository service.
k. Verify that Complete is selected, and then click OK.
l. After a few minutes, verify that the service is running.

m. Select Actions > Delete Contents.
n. In the Delete contents for Oracle_BI_DW_Base dialog box, enter Administrator as the repository username and Administrator as the password, and then click OK.
o. Wait for a Succeeded message. The repository now has no content.

p. Select Actions > Restore Contents.


q. In the Restore Contents dialog box, select Oracle_BI_DW_Base from the Select backup file list and select the Restore as new check box.

r. Click OK to start the restore process. This takes about 5-10 minutes to complete.
s. Verify that the restore completes successfully. The screenshot shows a partial view.
t. Click Save to save the log file to the desktop.
u. Click Close.
6. Promote the repository to a global repository. When a repository is restored, it becomes a stand-alone repository. After restoring the repository, you need to promote it to a global repository.
a. In the Properties tab, click Edit in the General properties area.


b. Change the OperatingMode value to Normal.

c. Click OK. If prompted, enter Administrator for the repository username and password.
d. Verify that the repository service Oracle_BI_DW_Base is enabled and running in normal mode.

7. Set PowerCenter Integration Services custom properties.
a. In the left pane, select Oracle_BI_DW_Base_Integration_Service.
b. Display the Properties tab.
c. In the right pane, scroll down to the Custom properties area and click Edit.
d. Create the custom properties in the table below by clicking Add to display new Name and Value fields.

Name                        Value
SiebelUnicodeDBFlag         No
ServerPort                  4006
overrideMpltVarWithMapVar   Yes
SiebelUnicodeDB             biapps@orcl obaw@orcl

e. Verify the results. Be sure to leave a space between biapps@orcl and obaw@orcl.

f. Click OK. You should receive a message that the integration service properties were updated.
g. Log out of the Administration Console and close the browser.
8. Set up the DAC client.
a. Copy Hibernate libraries to the appropriate DAC directories. To run the DAC client or DAC server, you need libraries from an open source software product called Hibernate. Hibernate libraries are not installed as part of Oracle Business Intelligence Applications 7.9.5; they must be downloaded from the Hibernate Web site. Oracle recommends that you download Hibernate Core Package Version 3.2.x GA or later. Newer versions of Hibernate Core Package 3.2 are now available (for example, Hibernate Core Package Version 3.2.5 GA); DAC is supported on the libraries of these versions also. You can download the Hibernate Core Package from http://www.hibernate.org. For this training, the Hibernate libraries have already been downloaded to your training environment.
i. Navigate to C:\PracticeFiles\hibernate-3.2.
ii. Copy the Hibernate files from the C:\PracticeFiles\hibernate-3.2 directory to the C:\OracleBI\DAC directory as described in the following table. Copy only the files described in the table; you do not need to copy any of the other files in the C:\PracticeFiles\hibernate-3.2 directory.
Files                            Copy from                           Copy to
*.jar                            \hibernate-3.2\lib                  \DAC\lib
hibernate3.jar                   \hibernate-3.2                      \DAC\lib
hibernate-mapping-3.0.dtd        \hibernate-3.2\src\org\hibernate    \DAC
hibernate-configuration-3.0.dtd  \hibernate-3.2\src\org\hibernate    \DAC

b. Install JDBC drivers for DAC database connectivity. You must install the appropriate JDBC driver in the DAC\lib directory to enable DAC database connectivity. In this training environment you copy the driver from the Oracle Database directory.
i. Navigate to C:\oracle\product\11.1.0\db_1\sqldeveloper\jdbc\lib.
ii. Copy ojdbc14.jar.
iii. Paste ojdbc14.jar into C:\OracleBI\DAC\lib.
c. Open the DAC config.bat file and verify the connection configuration for the DAC repository:
i. In the C:\OracleBI\DAC directory, right-click config.bat and select Edit.
ii. Verify that the JAVA_HOME variable points to the directory where the Java SDK is installed. In this training environment it should be set to JAVA_HOME=C:\jdk1.5.0_12. Make sure there are no spaces in the path reference.
iii. Verify that the DAC_HOME variable points to the directory where DAC is installed. In this training environment it should be set to DAC_HOME=C:\OracleBI\DAC.
iv. Close config.bat.
9. Enable DAC client communication with Informatica PowerCenter. The DAC client uses the Informatica pmrep and pmcmd command-line programs to communicate with Informatica PowerCenter. The DAC client uses pmrep to synchronize DAC tasks with Informatica workflows and to keep the DAC task source and target table information up-to-date.
a. Install the Informatica pmcmd and pmrep command-line programs.
i. The pmrep program is installed in the Informatica PowerCenter Client and Informatica PowerCenter Services bin directories. Because of the requirement to co-locate the DAC client with the PowerCenter client, the pmrep program is already available on the machine for the DAC client to use.
ii. The pmcmd program is installed in the PowerCenter Services bin directories. In this training environment, PowerCenter Services 8.1.1 SP4 has been installed on the same machine as the DAC client and PowerCenter Client 8.1.1 SP4.
Copy the pmcmd.exe program from the C:\Informatica\PowerCenter8.1.1\server\bin directory to the C:\Informatica\PowerCenter8.1.1\client\bin directory.
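The config.bat checks in step 8c (JAVA_HOME and DAC_HOME set, with no spaces in either path) can be sketched as a small parser. The helper name is hypothetical; only the variable names and the no-spaces rule come from the practice:

```python
# Sketch: verify JAVA_HOME and DAC_HOME in DAC config.bat-style text.
# Lines of interest look like:  set JAVA_HOME=C:\jdk1.5.0_12
# A value containing a space (e.g. "C:\Program Files\...") is rejected,
# since spaces in these paths are a common DAC configuration pitfall.

def check_config_bat(text: str) -> dict:
    """Return {var: value} for JAVA_HOME/DAC_HOME lines that are set
    to a non-empty, space-free value."""
    ok = {}
    for line in text.splitlines():
        line = line.strip()
        if line.lower().startswith("set "):
            name, _, value = line[4:].partition("=")
            if name in ("JAVA_HOME", "DAC_HOME") and value and " " not in value:
                ok[name] = value
    return ok

if __name__ == "__main__":
    sample = "set JAVA_HOME=C:\\jdk1.5.0_12\nset DAC_HOME=C:\\OracleBI\\DAC"
    print(check_config_bat(sample))
```

If either variable is missing from the result, reopen config.bat and correct it before continuing.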

b. Set environment variables for the DAC client. In order for the DAC client to be able to use the pmrep and pmcmd programs, you need to define the path of the Informatica domain file, domains.infa.
i. Verify that the domains.infa file is located in C:\Informatica\PowerCenter8.1.1.
ii. On the desktop, right-click My Computer and select Properties.
iii. Select Advanced > Environment Variables.
iv. Click New to create a new system variable.
v. Name the variable INFA_DOMAINS_FILE.
vi. Set the variable value to C:\Informatica\PowerCenter8.1.1\domains.infa. The path should include the name of the file.

vii. Click OK to close the dialog box.
viii. In the Environment Variables dialog box, under System variables, select the Path system variable and click Edit.
ix. In the Edit System Variable dialog box, add the directory path to the Informatica PowerCenter binaries to the end of the Path environment variable. The screenshot shows only a partial view. Be sure to include the semicolon before the directory path:


;C:\Informatica\PowerCenter8.1.1\client\bin

x. Click OK to close the Edit System Variable dialog box.
xi. Click OK to close the Environment Variables dialog box.
xii. Click OK to close the System Properties dialog box.
c. Verify that the DAC client is able to use pmrep and pmcmd.
i. From a Windows command prompt, execute pmrep and then pmcmd. The test is successful if the pmrep and pmcmd prompts appear.


ii. Close the command window.
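The Path edit above amounts to appending one directory with a semicolon separator (Windows uses ';' between PATH entries, and comparisons are case-insensitive). A minimal sketch of that logic; the helper name is made up:

```python
# Sketch: append the Informatica client bin directory to a Windows-style
# PATH value, inserting the separating semicolon and skipping duplicates.

INFA_BIN = r"C:\Informatica\PowerCenter8.1.1\client\bin"

def append_to_path(path_value: str, new_dir: str = INFA_BIN) -> str:
    entries = [p for p in path_value.split(";") if p]
    # Windows paths compare case-insensitively, so normalize before checking
    if new_dir.lower() in (p.lower() for p in entries):
        return path_value  # already present; leave the value unchanged
    return path_value.rstrip(";") + ";" + new_dir

if __name__ == "__main__":
    print(append_to_path(r"C:\Windows;C:\Windows\system32"))
```

Running the helper twice is safe: the second call detects the existing entry and returns the value unchanged, which is the behavior you want when revisiting the Environment Variables dialog.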

10. Create a DAC connection. A DAC connection is a stored set of login details that enables you to log in to the DAC client and connect to the DAC repository.
a. To start the DAC client, select Start > Programs > Oracle Business Intelligence > Oracle DAC > DAC Client.
b. In the Login dialog box, select Configure.
c. In the Configuring dialog box, select Create Connection, and then click Next.
d. Enter the appropriate connection details as specified in the table:
Name             DAC
Connection type  Oracle (Thin)
Instance         orcl
Database Host    <machine name>
Database Port    1521

e. Click Test Connection.
f. Enter dac as the Table owner name and password.
g. Click Test and verify that the connection was successfully established.
h. Click Close.
i. Select Apply.
j. Click Finish to save the connection details and return to the login dialog box.
k. If necessary, select the DAC connection from the Connection list, enter dac as the table owner name and password, and click Login. The DAC client opens.
Note: Typically, when you log in to DAC and connect to a DAC repository for the first time, DAC detects that the DAC schema does not exist in the database and asks whether you want to create a repository. For this training, the DAC schema and repository tables have already been created in your training environment, so you are not prompted to create the DAC repository.
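Behind the Oracle (Thin) connection type, the Instance, Database Host, and Database Port fields compose a standard Oracle JDBC thin URL, which the DAC client can use via the ojdbc14.jar driver copied earlier. A sketch of the composition (the helper is illustrative; the DAC client builds this internally):

```python
# Sketch: compose an Oracle JDBC thin URL from the connection fields
# used in the DAC connection dialog: host, port, and instance (SID).
# Thin URLs with a SID take the form jdbc:oracle:thin:@host:port:sid.

def oracle_thin_url(host: str, port: int, sid: str) -> str:
    return "jdbc:oracle:thin:@%s:%d:%s" % (host, port, sid)

if __name__ == "__main__":
    # "localhost" stands in for your actual <machine name>.
    print(oracle_thin_url("localhost", 1521, "orcl"))
    # -> jdbc:oracle:thin:@localhost:1521:orcl
```

Seeing the URL spelled out makes connection-test failures easier to diagnose: a wrong host, port, or SID each corrupts a different segment of the string.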

11. Import DAC metadata. Typically, the next step would be to use the DAC client to import DAC metadata into the DAC repository schema. As part of this process, you would specify the source system applications (Oracle 11.5.10, Siebel 8.0, and so on) for which you import the ETL metadata. Because importing DAC metadata is a time-consuming process, this step has been omitted from the training. The DAC metadata for the Oracle 11.5.10 source systems has already been imported into the DAC repository in your training environment. For more information on this process, refer to the section on importing the DAC metadata in the Oracle Business Intelligence Applications Fusion Edition Installation and Configuration Guide.
12. Create the Oracle Business Analytics Warehouse tables. Before starting this procedure, you must create a database schema with a role named SSE_ROLE for the data warehouse. A database schema named OBAW with the role SSE_ROLE has already been created for you in this training environment. The Oracle Business Analytics Warehouse tables are created by the DAC client, which uses ODBC connections to the Oracle Business Analytics Warehouse database for this procedure. An ODBC connection named OBAW has already been created for you in this training environment.
a. In the DAC client, select Tools > ETL Management > Configure.
b. In the Sources dialog box, select Oracle as the database platform for both the target data warehouse and the source transactional database.
c. Click OK to display the Data Warehouse Configuration Wizard.
d. Select the Create Data Warehouse Tables check box.
e. Click Next. The Data Warehouse tab becomes active.
f. Enter the details of the database schema in which you want to store the data warehouse, as specified in the following table. Leave the Container field blank. If you leave the Container field blank, DAC creates a container by default for the source business applications (Oracle 11.5.10, Siebel 8.0, and so on) that you selected when you imported the seed data into the DAC metadata repository.
Container          (leave empty for all containers)
Table Owner        obaw
Password           obaw
ODBC Data Source   obaw
Data Area          obaw_data
Index Area         obaw_index

g. Click Start. The Run Status tab displays information about the process.
h. After a minute or two, verify that a Success message is displayed, indicating that the data warehouse tables have been created. If you want to see log information about the process, use the following log files:
\OracleBI\DAC\config\generate_ctl.log: A log of the schema definition process, including details of any conflicts between containers.
\OracleBI\DAC\config\createtables.log: A log of the ddlimp process.
If a Failure message is displayed, the data warehouse tables have not been created. Use the log information in \OracleBI\DAC\config\generate_ctl.log to diagnose the error; the createtables.log file is not generated in this case. Ask your instructor if you need assistance.
i. Click Finish.
13. Configure the connection between the DAC server and the DAC repository. Because the training environment uses Windows, you can use the DAC client to configure a DAC server that runs in the same \DAC folder.
a. In the DAC client, select Tools > DAC Server Management > DAC Server Setup. The DAC repository that you connect to using the DAC client is the one that stores the DAC server repository connection information that you specify in this procedure.

b. Click Yes in the Server Setup message to display the Server Configuration dialog box.
c. In the Repository Connection Information tab, enter the appropriate information. Because the DAC server is running on the same machine as the DAC client, click Populate from preconfigured client connection to populate the fields with connection details from the DAC client.
d. Click OK and verify the server configuration information:
Connection type    Oracle (Thin)
Instance           orcl
Database Host      <machine name>
Database Port      1521
Table owner name   dac
Password           dac

e. Click Test Connection to make sure the DAC repository connection works. You should receive the message Connection was successfully established!
f. Click Close.
g. Click Save. (For this training, skip the Email Configuration.)
14. Set DAC system properties. You need to set these properties to ensure proper integration between the DAC client, the DAC server, and Informatica.
a. In the DAC client, click the Setup button to display the Setup view.
b. Display the DAC System Properties tab.
c. Set values for the following properties:
InformaticaParameterFileLocation   C:\Informatica\PowerCenter8.1.1\server\infa_shared\SrcFiles
Main Informatica Repository        Oracle_BI_DW_Base
Repository Name                    DAC
DAC Server Host                    <machine name>

d. Click Save.
15. Register the Informatica Services in the DAC.
a. Click the Informatica Servers tab.
b. Register the Informatica Integration Services service by modifying the record with Name = Oracle_BI_DW_Server. Select Oracle_BI_DW_Server in the top pane and then modify the fields in the Edit subtab. Accept all defaults except for the following:
Server Hostname   <machine name>
Password          Administrator

c. Click Save.
d. Register the Informatica repository service by modifying the record with Name = INFORMATICA_REP_SERVER. Accept all defaults except for the following:
Server Hostname   <machine name>
Password          Administrator

e. Click Save. 16. Start the DAC server, restart the DAC client, and test the Informatica connections. a. Close the DAC client. b. Restart the Informatica Services 8.1.1 SP4 service. c. Start the DAC server by selecting Start > Programs > Oracle Business Intelligence > Oracle DAC > Start DAC Server. The command window flashes. d. Restart the DAC client by selecting Start the DAC server by selecting Start > Programs > Oracle Business Intelligence > Oracle DAC > DAC Client. e. Log in to the DAC connection as dac with the password dac. f. Confirm that the DAC client is connected to the DAC server by looking at the DAC server monitor icon in the upper-right corner of the DAC client. The icon color should be orange, indicating that the DAC client is connected to the DAC server and the DAC server is idle. When you mouse over the icon, you should see the message DAC Server is idle. It may take a moment for the connection to be established. Wait until the icon turns orange before proceeding to the next step. g. Click Setup to open the Setup view. h. Click the Informatica Servers tab. i. Select Oracle_BI_DW_Server in the upper pane. j. Click Test Connection in the lower pane to test the connection. You should receive a message Connection to Oracle_BI_DW_Server successfully established! k. Click OK. l. Select INFORMATICA_REP_SERVER. m. Click Test Connection to test the connection. You should receive a message Connection to INFORMATICA_REP_SERVER successfully established! n. Click OK. 17. Set the transactional and data warehouse physical data sources in the DAC. a. Click the Physical Data Sources tab. The Physical Data Sources tab displays a pre-created record for the data warehouse with name DataWarehouse and one or more records for the OLTP sources. The records that are created by DAC for the OLTP sources depend on the business application source systems you selected when importing the DAC metadata. 
In this training environment, you should see the ORA_11_5_10 and SEBL_80 sources.
b. Select DataWarehouse in the top pane.
c. Select the Edit subtab and accept all defaults except for the following:
Instance: orcl
Table Owner: obaw
Table Owner Password: obaw
DB Host: <machine name>
Port: 1521
Default Index Space: OBAW_INDEX

d. Test the connection. You should receive the message Connection to DataWarehouse successfully established!
e. Click OK.

f. Click Save.
g. Select ORA_11_5_10.
h. Select the Edit subtab and accept all defaults except for the following:
Connection type: Oracle (Thin)
Instance: orcl
Table Owner: biapps
Table Owner Password: biapps
DB Host: <machine name>
Port: 1521

i. Test the connection. You should receive the message Connection to ORA_11_5_10 successfully established!
j. Click OK.
k. Click Save.
l. Leave the DAC client open for the next practice.
18. Configure the relational connections in Informatica Workflow Manager.
a. Verify that the Informatica Services 8.1.1 SP4 service is started.
b. To log in to Workflow Manager, select Start > Programs > Informatica PowerCenter 8.1.1 > Client > PowerCenter Workflow Manager.
c. In the Welcome window, deselect Show this message at startup and click OK.
d. Select Repositories in the Repository Navigator.
e. In the menu, select Repository > Add to display the Add Repository dialog box.
f. In the Add Repository dialog box, enter Oracle_BI_DW_Base in the Repository field and Administrator in the Username field.
g. Click OK to save the details.
h. Select Repository > Connect to display the Connect to Repository dialog box.
i. Enter Administrator as the password.
j. If the Connection Settings area is not displayed, click More.
k. Use the Domain list to select the domain (Domain_<machine name>). Note: If no domain is visible in the list, click Add to display the Add Domain dialog box. In the Add Domain dialog box, specify the name of the domain that was created when you installed Informatica PowerCenter Services (for example, Domain_<machine name>), the fully qualified host name for the gateway host (for example, <machine name>), and the port for the gateway port (for example, 6001).


l. Click Connect. The Repository Navigator in the left pane should look similar to the screenshot.

m. Select Connections > Relational to display the Relational Connection Browser. You need to create a connection for each transactional (OLTP) database and a connection for the Oracle Business Analytics Warehouse (OLAP) database.
n. Click New to display the Select Subtype dialog box, select Oracle, and click OK.
o. Enter the details for the DataWarehouse connection object as specified in the following table. You must specify DataWarehouse exactly as it appears in the Physical Data Sources tab in the DAC Setup view.
Name: DataWarehouse
User Name: obaw
Password: obaw
Connect String: orcl

p. Click OK.
q. Click New again and repeat the steps to create the ORA_11_5_10 connection object. Again, you must specify the OLTP connection ORA_11_5_10 exactly as it appears in the Physical Data Sources tab in the DAC Setup view.
Name: ORA_11_5_10
User Name: biapps
Password: biapps
Connect String: orcl
r. Click OK.
s. Click Close.
t. Select Repository > Exit to close Workflow Manager.
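The requirement emphasized above, that each Informatica relational connection must carry exactly the same name as its DAC physical data source, can be sketched as a simple check. This is an illustrative Python model using the two names from this practice, not part of the DAC or Informatica tooling:

```python
# DAC physical data source names (from the Physical Data Sources tab).
dac_sources = {"DataWarehouse", "ORA_11_5_10"}

# Informatica relational connection objects created in Workflow Manager.
informatica_connections = {"DataWarehouse", "ORA_11_5_10"}

# At run time, the DAC passes these names to Informatica, so every DAC
# source must have an identically named Informatica connection object.
missing = dac_sources - informatica_connections
assert not missing, f"Create Informatica connections named: {missing}"
```

If a name differs even by case or spacing, the ETL run fails when the DAC hands the connection name to Informatica.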


Lesson 5: Understanding the ETL Process

Practice 5-1: Exploring Oracle BI ETL Metadata


Goal: To explore some of the prebuilt Oracle Business Intelligence ETL metadata to gain a high-level understanding of some of the key elements, processes, and naming conventions
Scenario: You have installed and configured the software components required to run ETL for the Oracle Business Analytics Warehouse. Before loading and customizing the data warehouse, you explore some of the prebuilt metadata used to perform ETL.
Outcome: You will be able to relate the fundamental steps of the Oracle Business Analytics Warehouse ETL process to prebuilt metadata in the Informatica repository and the DAC repository.
Time: 20-30 minutes

Instructions:
In this exercise, you use Informatica tools and the DAC client to explore some of the prebuilt metadata used in the Oracle Business Analytics Warehouse ETL process. At this point, do not concern yourself with specific details about the metadata, the Informatica tools, or the DAC client; these are all covered in more detail in subsequent lessons. Rather, focus on the metadata objects and their general use in the ETL process.
1. Start the Informatica Designer tool and connect to the Informatica repository.
a. Verify that the service Informatica Services 8.1.1 SP4 is started.
b. Select Start > Programs > Informatica PowerCenter 8.1.1 > Client > PowerCenter Designer.
c. If the Welcome dialog box appears, deselect Show this message at startup and click OK.
d. Double-click Oracle_BI_DW_Base.
e. Enter Administrator as the username and password and click Connect.
2. Explore an SDE adaptor folder.
a. Expand the SDE_ORA11510_Adaptor folder in the Repository Navigator. This folder contains the Informatica repository source dependent extract (SDE) metadata for the Oracle E-Business Suite application, version 11.5.10.


b. Expand the Sources folder.
c. Expand the OLTP and OLAP subfolders. These subfolders contain the Informatica repository source definition objects for the transactional (OLTP) and data warehouse (OLAP) databases for this adaptor. Source definition objects provide detailed descriptions of tables or files that provide source data in mappings. (For this training, you can ignore the other subfolders in the Sources folder.)
d. Expand the Targets folder. The Targets folder contains the Informatica repository target definition objects for this adaptor. Target definition objects provide detailed descriptions of objects or files that contain target data in mappings.
e. Expand the Transformations folder. The Transformations folder contains the Informatica repository transformation objects for this adaptor. Transformation objects are used in mappings to generate, modify, or pass data.
f. Expand the Mapplets folder. The Mapplets folder contains the Informatica repository mapplet objects for this adaptor. Mapplets are reusable objects that are used in mappings in the same way as transformations to generate, modify, or pass data.
g. Expand the Mappings folder. The Mappings folder contains the Informatica repository mapping objects for this adaptor. Mappings contain a set of mapplets, source definitions, target definitions, and transformations used to extract, transform, and load data.
3. Explore an SDE mapping.
a. Expand the Mappings folder and locate the mapping SDE_ORA_GLRevenueFact. Based on the name of this mapping, what do you surmise is the purpose of this mapping?

b. Expand the mapping SDE_ORA_GLRevenueFact. What is the target table for this mapping?

c. Is this table located in the transactional database or the data warehouse?


d. What type of table is this?

e. Notice that there are no source instances for this mapping. Non-Siebel SDE adapter folders typically do not expose sources directly in the mappings. Instead, they use the concept of Business Component mapplets. These are extract mapplets that may contain relational, application, or flat file sources. You learn more about this in the next lesson, Working with Informatica Designer, and in Lesson 8, Exploring SDE and SIL Mappings.
f. Expand the Transformation Instances folder and notice that there are two mapplets defined for this mapping: mplt_BC_ORA_GLRevenueFact and mplt_SA_ORA_GLRevenueFact. These mapplets are transformation objects in the SDE mapping. They are used to extract data from an Oracle source, make any necessary transformations to the data, and output the data to the next transformation object in the SDE mapping. You learn more about this in the next lesson, Working with Informatica Designer, and in Lesson 8, Exploring SDE and SIL Mappings.
4. Explore SDE mapplets.
a. Expand SDE_ORA11510_Adaptor > Mapplets to examine the two mapplets for the SDE_ORA_GLRevenueFact mapping.
b. Scroll to locate the mapplet mplt_BC_ORA_GLRevenueFact.
c. Expand the Source Instances folder and notice that there are four source definitions for this mapplet: RA_CUSTOMER_TRX_ALL, RA_CUSTOMER_TRX_LINES_ALL, RA_CUST_TRX_LINE_GL_DIST_ALL, and RA_SALESREPS_ALL. Are these sources located in the transactional database or the data warehouse database?

d. Expand the Transformation Instances folder. This folder contains the additional transformation objects for this mapplet.
e. Locate and expand the mapplet mplt_SA_ORA_GLRevenueFact.
f. Notice that there are no source instances for this mapplet. That is because this mapplet contains lookup transformations, which can look up data in a flat file or a relational table, view, or synonym.
g. Expand the Transformation Instances folder. Notice that there are two lookup transformations for this mapplet: LKP_CUSTLOC_CUST_LOC_ID and LKP_LOC_CURRENCY.
5. Explore the SILOS folder.
a. Expand Oracle_BI_DW_Base > SILOS. This folder contains the Informatica repository source independent load (SIL) metadata. Notice that the SILOS folder contains all of the same subfolders contained in an SDE_*_Adapter folder.

b. Expand the Sources folder. Are the sources listed here in the transactional database or the data warehouse?

6. Explore an SIL mapping.
a. Expand the Mappings folder and locate the mapping SIL_GLRevenueFact. Based on the name of this mapping, what do you surmise is the purpose of this mapping?

b. Expand SIL_GLRevenueFact > Target Instances. What is the target table for this mapping?

c. Is this table located in the transactional database or the data warehouse?

d. What type of table is this?

e. Expand SIL_GLRevenueFact > Source Instances. Notice that one of the sources for this mapping is W_GL_REVN_FS. Is this table located in the transactional database or the data warehouse?

f. What type of table is this?

g. Which SDE mapping populates this table?

h. Expand the Transformation Instances folder and notice that there are many more transformation objects associated with an SIL mapping than with an SDE mapping. You explore these transformations in more detail in Lesson 8: Exploring SDE and SIL Mappings.
i. Can you think of any instances where a Source Independent Loading mapping would include a source table in the transactional database?

7. Explore workflows in Informatica Workflow Manager.
a. In Informatica PowerCenter Designer, select Tools > Workflow Manager.
b. In the Repository Navigator, expand SDE_ORA11510_Adapter > Workflows > SDE_ORA_GLRevenueFact to view the components of the SDE_ORA_GLRevenueFact workflow. Notice that there are two tasks for this workflow: a Start task and a task named SDE_ORA_GLRevenueFact.
c. Right-click the SDE_ORA_GLRevenueFact task (not the workflow) and select Properties.
d. If necessary, click the Object tab.
e. Notice that the mapping for this task is SDE_ORA_GLRevenueFact. This workflow contains instructions on how to execute the task for the mapping SDE_ORA_GLRevenueFact, which extracts and moves data from source tables to the W_GL_REVN_FS fact staging table.

f. Click Cancel to close the Properties dialog box.
g. Notice that there is another workflow named SDE_ORA_GLRevenueFact_Full. This workflow also contains instructions on how to execute the task for the SDE_ORA_GLRevenueFact mapping, which extracts and moves data from source tables to the W_GL_REVN_FS fact staging table. A table can be loaded in full mode or incremental mode. Full mode refers to data loaded for the first time or data that is truncated and then loaded. Incremental mode refers to new or changed data being added to the existing data.
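The difference between the two load modes can be sketched as follows. This is an illustrative Python model with made-up rows keyed by a unique identifier, not the actual Informatica logic:

```python
# Existing warehouse rows and the latest source extract (made-up data).
warehouse = [("1", 100), ("2", 200)]
extract = [("2", 250), ("3", 300)]  # row 2 changed, row 3 is new

def full_load(extract_rows):
    # Full mode: the target is truncated, then everything in the
    # extract is loaded as if for the first time.
    return list(extract_rows)

def incremental_load(existing_rows, extract_rows):
    # Incremental mode: changed rows replace their existing versions
    # and new rows are appended, matching on the unique identifier.
    merged = dict(existing_rows)
    merged.update(dict(extract_rows))
    return sorted(merged.items())
```

After a full load, only the extracted rows exist; after an incremental load, unchanged history is preserved alongside the updated and new rows.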

h. In the Repository Navigator, expand SILOS > Workflows > SIL_GLRevenueFact to view the components of the SIL_GLRevenueFact workflow. Notice that there are two tasks for this workflow: a Start task and a task named SIL_GLRevenueFact.
i. Right-click the SIL_GLRevenueFact task (not the workflow) and select Properties.
j. If necessary, click the Object tab.
k. Notice that the mapping for this task is SIL_GLRevenueFact. This workflow contains instructions on how to execute the task for the SIL_GLRevenueFact mapping, which loads data from the fact staging table and other data warehouse tables into the W_GL_REVN_F fact table.

l. Click Cancel to close the Properties dialog box.
m. Notice that there is another workflow named SIL_GLRevenueFact_Full.
n. Select Repository > Exit to close Workflow Manager.
o. Select Repository > Exit to close Designer.

8. Return to the DAC client, which should still be open from the previous set of practices. If not, follow the instructions below to start the DAC client.
a. Select Start > Programs > Oracle Business Intelligence > Oracle DAC > DAC Client.
b. Log in to the DAC connection with table owner name dac and password dac (these should be the default login values). It may take a few seconds for the DAC client to open.
9. Explore DAC SDE tasks.
a. If necessary, click Design in the upper-left corner of the DAC client to open the Design view.
b. If necessary, select the Oracle 11.5.10 container in the list.
c. Click the Tasks tab in the upper-right corner (not the Tasks subtab).

d. Click the Query button.
e. In the Name field, enter SDE_ORA_GLRevenueFact.
f. Click Go and verify that the SDE_ORA_GLRevenueFact task is returned by the query and is the only task visible in the upper window.
g. If necessary, click the Edit subtab.
h. Notice that the task phase for this task is Extract Fact, which uses SDE workflows and mappings to extract data from source transactional tables and to load fact staging tables in the Oracle Business Analytics Warehouse (OBAW).
i. Notice that the execution type for this task is Informatica. Most DAC tasks are Informatica task types, which call and run Informatica workflows in the Informatica repository. Based on your exploration of workflows in Informatica Workflow Manager, which Informatica workflows are called and run by this DAC task?

j. Which Informatica mapping is executed by the workflows that are called and run by this DAC task?

k. Based on your exploration of this SDE_ORA_GLRevenueFact mapping in Informatica Designer, what is the target table for this DAC task?

l. To verify your answer, click the Target Tables subtab.
m. Click the Source Tables subtab to view the source tables for this DAC task.
10. Explore DAC SIL tasks.
a. Click the Query button.
b. In the Name field, enter SIL_GLRevenueFact.
c. Click Go and verify that the SIL_GLRevenueFact task is returned by the query and is the only task visible in the upper window. Recall that this is the SIL mapping you explored in Informatica Designer earlier in this practice.
d. If necessary, click the Edit subtab.
e. Notice that the task phase for this task is Load Fact, which uses SIL workflows and mappings to load data from fact staging tables into fact tables in the OBAW.
f. Notice that the execution type for this task is Informatica. Most DAC tasks are Informatica task types, which call and run Informatica workflows in the Informatica repository. Based on your exploration of workflows in Informatica Workflow Manager, which Informatica workflows are called and run by this DAC task?

g. Which Informatica mapping is executed by the workflows that are called and run by this DAC task?


h. Based on your exploration of this SIL_GLRevenueFact mapping in Informatica Designer, what is the target table for this DAC task?

i. To verify your answer, click the Target Tables subtab.
j. Click the Source Tables subtab to view the source tables for the SIL_GLRevenueFact DAC task.
k. What is the primary source table for the SIL_GLRevenueFact task?

l. What type of table is this?

m. Which DAC task loads this table?

n. Select File > Close to close the DAC client.


Solutions 5-1: Exploring Oracle BI ETL Metadata


Answers:
3.a. Expand the Mappings folder and locate the mapping SDE_ORA_GLRevenueFact. Based on the name of this mapping, what do you surmise is the purpose of this mapping? This source dependent extract (SDE) mapping extracts Oracle source data that is used to load the general ledger revenue fact table.
3.b. Expand the mapping SDE_ORA_GLRevenueFact. What is the target table for this mapping? W_GL_REVN_FS
3.c. Is this table located in the transactional database or the data warehouse? Data warehouse
3.d. What type of table is this? Fact staging table
4.c. Expand the Source Instances folder and notice that there are four source definitions for this mapplet: RA_CUSTOMER_TRX_ALL, RA_CUSTOMER_TRX_LINES_ALL, RA_CUST_TRX_LINE_GL_DIST_ALL, and RA_SALESREPS_ALL. Are these sources located in the transactional database or the data warehouse database? Transactional database
5.b. Expand the Sources folder. Are the sources listed here in the transactional database or the data warehouse? Data warehouse
6.a. Expand the Mappings folder and locate the mapping SIL_GLRevenueFact. Based on the name of this mapping, what do you surmise is the purpose of this mapping? The purpose of this mapping is to load the general ledger revenue fact table in the data warehouse.
6.b. Expand SIL_GLRevenueFact > Target Instances. What is the target table for this mapping? W_GL_REVN_F
6.c. Is this table located in the transactional database or the data warehouse? Data warehouse
6.d. What type of table is this? Fact table
6.e. Expand SIL_GLRevenueFact > Source Instances. Notice that one of the sources for this mapping is W_GL_REVN_FS. Is this table located in the transactional database or the data warehouse? Data warehouse
6.f. What type of table is this? Fact staging table
6.g. Which SDE mapping populates this table? SDE_ORA_GLRevenueFact
6.i. Can you think of any instances where a Source Independent Loading mapping would include a source table in the transactional database? Source Independent Loading mappings are used to process data in the staging tables and other data warehouse tables and load it into the ultimate target tables, so typically these mappings would not include a source table in the transactional database.
9.i. Notice that the execution type for this task is Informatica. Most DAC tasks are Informatica task types, which call and run Informatica workflows in the Informatica repository. Based on your exploration of workflows in Informatica Workflow Manager, which Informatica workflows are called and run by this DAC task? SDE_ORA_GLRevenueFact and SDE_ORA_GLRevenueFact_Full
9.j. Which Informatica mapping is executed by the workflows that are called and run by this DAC task? SDE_ORA_GLRevenueFact
9.k. Based on your exploration of this SDE_ORA_GLRevenueFact mapping in Informatica Designer, what is the target table for this DAC task? W_GL_REVN_FS
10.f. Notice that the execution type for this task is Informatica. Most DAC tasks are Informatica task types, which call and run Informatica workflows in the Informatica repository. Based on your exploration of workflows in Informatica Workflow Manager, which Informatica workflows are called and run by this DAC task? SIL_GLRevenueFact and SIL_GLRevenueFact_Full
10.g. Which Informatica mapping is executed by the workflows that are called and run by this DAC task? SIL_GLRevenueFact
10.h. Based on your exploration of this SIL_GLRevenueFact mapping in Informatica Designer, what is the target table for this DAC task? W_GL_REVN_F
10.k. What is the primary source table for the SIL_GLRevenueFact task? W_GL_REVN_FS
10.l. What type of table is this? Fact staging table

10.m. Which DAC task loads this table? SDE_ORA_GLRevenueFact
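The answers above trace a single two-stage flow: the SDE task extracts OLTP data into the W_GL_REVN_FS staging table, and the SIL task loads that staging table into the W_GL_REVN_F fact table. A minimal Python sketch of that handoff (made-up rows; the ROW_WID surrogate-key column is included for illustration):

```python
# Made-up OLTP rows: (transaction identifier, revenue amount).
oltp_source = [("trx1", 500.0), ("trx2", 750.0)]

def sde_extract(source_rows):
    # SDE stage: extract from the transactional tables into the
    # staging table, keeping the source key as INTEGRATION_ID.
    return [{"INTEGRATION_ID": rid, "REVENUE": amt} for rid, amt in source_rows]

def sil_load(staging_rows):
    # SIL stage: load staging rows into the fact table, independent of
    # which source system produced them; assign a surrogate key.
    return [dict(row, ROW_WID=i + 1) for i, row in enumerate(staging_rows)]

w_gl_revn_fs = sde_extract(oltp_source)  # staging table contents
w_gl_revn_f = sil_load(w_gl_revn_fs)     # fact table contents
```

Because the SIL stage reads only the staging table, the same SIL mapping can serve any source system whose SDE adapter fills that staging table.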


Lesson 6: Working with Informatica Designer

Practice 6-1: Working with Informatica Designer


Goal: To use Informatica Designer tools to build a source dependent extract mapping
Scenario: The primary goal of this practice is to become familiar with Informatica Designer and its tools. To accomplish this, you create a source dependent extract (SDE) mapping that extracts data from a source table and moves the data into a fact staging table. Thus, the secondary goal of this practice is to become familiar with some ETL mapping components and the steps to create them. In this set of practices, you use custom tables provided specifically for this training. These tables have very small datasets, so you can focus less on the data being moved by ETL and more on Informatica Designer tools and the steps for building mappings. In the practices for the next lesson, you use Informatica Workflow Manager to run the mapping and to verify the results. Later in this course, in lessons 14 and 15, you build custom SDE and SIL mappings and use the DAC to run the mappings and to verify the results.
Outcome: When you complete this practice, you will have an SDE mapping that extracts data from the REVN table and loads it into the WC_RVF_FS staging table.
Time: 30-40 minutes

Instructions:
1. Open Informatica Repository Manager and create a custom Informatica repository folder. All modifications to the Oracle_BI_DW_Base repository should be done in custom folders.
a. Select Start > Programs > Informatica PowerCenter 8.1.1 > Client > PowerCenter Repository Manager.
b. If the Welcome dialog box appears, deselect Show this message at startup and click OK.
c. Double-click Oracle_BI_DW_Base.
d. Enter Administrator as the username and password and click Connect.
e. Select Folder > Create.
f. Name the new folder CUSTOM_SDE and click OK.
g. Click OK to confirm that the folder has been successfully created.
2. Import the source for the SDE mapping.
a. Select Tools > Designer to open Informatica PowerCenter Designer.
b. Right-click the CUSTOM_SDE folder and select Open. Alternatively, you can double-click the CUSTOM_SDE folder to open it.
c. Verify that the CUSTOM_SDE folder is bolded and that the toolbar displays CUSTOM_SDE (Oracle_BI_DW_Base) in the drop-down field in the upper-left corner.
d. Source Analyzer should already be open in the workspace. If not, select Tools > Source Analyzer.

e. Select Sources > Import from Database. The Import Tables dialog box appears.
f. Select ETL_LAB_OLTP (Oracle in OraDb11g_home1) from the ODBC data source list. This ODBC data source has already been created for your training environment.
g. Enter etl_lab_oltp for the username, owner name, and password, and click Connect. Use lowercase for all three values.
h. Expand ETL_LAB_OLTP > TABLES.
i. Select the REVN table.
j. Click OK. REVN is displayed in the Source Analyzer workspace.
k. In the Navigator, expand CUSTOM_SDE > Sources and verify that the ETL_LAB_OLTP source appears and contains the REVN table.
l. Select Repository > Save.
m. In the Output window, verify that there is a message indicating that the source ETL_LAB_OLTP:REVN is inserted.
3. Preview the source data.
a. In the Source Analyzer window, right-click the REVN table and select Preview Data.
b. Select ETL_LAB_OLTP (Oracle in OraDb11g_home1) for the ODBC data source, enter etl_lab_oltp for the username, owner name, and password, and click Connect. Use lowercase for all three values. The Preview Data window appears.

Note that REVN has five rows of revenue data with a primary key (ROW_ID), foreign keys for product and person data, and a LAST_UPDATE_DATE column. Recall that LAST_UPDATE_DATE is used in incremental loads to identify changed records. LAST_UPDATE_DATE is compared to the DAC parameter $$LAST_EXTRACT_DATE to determine which records have been updated since the last ETL run.
c. Click Close to close the Preview Data window.
4. Import the target for the SDE mapping.
a. Select Tools > Target Designer.
b. Select Targets > Import from Database.
c. Select the ETL_LAB_DW (Oracle in OraDb11g_home1) ODBC data source to connect to the target database.

d. Enter etl_lab_dw for the username, owner name, and password, and click Connect. Use lowercase for all three values.
e. Expand ETL_LAB_DW > TABLES.
f. Select the WC_RVF_FS table.
g. Click OK. WC_RVF_FS is displayed in the Target Designer workspace.
h. In the Navigator, expand CUSTOM_SDE > Targets and verify that the WC_RVF_FS target is visible.
i. Select Repository > Save.
j. In the Output window, verify that there is a message indicating that the target WC_RVF_FS is inserted.
k. In the Target Designer window, examine the WC_RVF_FS table definition. Notice that this staging table has the following required columns:
- INTEGRATION_ID: Stores the primary key or the unique identifier of a record as in the source table.
- DATASOURCE_NUM_ID: Stores the data source from which the data is extracted.
It also has placeholders for the source columns that will be mapped to this target when you build the SDE mapping.
5. In the steps that follow, you create a source dependent extract mapping named SDE_C_RevenueFact. This mapping extracts and moves data from the source table, REVN, to the target fact staging table, WC_RVF_FS. The data does not undergo any major transformations in this mapping as it is moved from the source to the target fact staging table. The C in the naming convention indicates that this is a custom mapping. The completed mapping will look similar to this screenshot:

At all times, ensure that you are working in the CUSTOM_SDE folder. Verify that the CUSTOM_SDE folder is bolded and that the Designer toolbar displays CUSTOM_SDE (Oracle_BI_DW_Base) in the drop-down field in the upper-left corner.
6. Create a mapplet to extract the revenue data from the source.
a. Select Tools > Mapplet Designer.
b. Select Mapplets > Create.
c. Name the mapplet mplt_BC_C_RevenueFact and click OK.
7. Create parameters to pass values for the last extract date and the initial extract date to the mapplet. These parameters are used to identify records that have changed since the last ETL run. The values for these parameters are passed from the DAC during ETL run time. Because you will use Informatica Workflow Manager, and not the DAC, to run and verify the SDE mapping in the next practice, you hard-code the value for the $$LAST_EXTRACT_DATE parameter. Please note that hard-coding the value is for training purposes only and is not the recommended practice.

a. Select Mapplets > Parameters and Variables.
b. Click the Add a new variable to this table button.
c. Enter the following values:
Name: $$LAST_EXTRACT_DATE
Type: Parameter
Data type: Date/time
Initial Value: 05/02/2003 21:02:44

d. Click the Add a new variable to this table button again.
e. Enter the following values:
Name: $$INITIAL_EXTRACT_DATE
Type: Parameter
Data type: Date/time
Initial Value: <leave blank>

f. Click OK.
8. Add a source definition to the mapplet.
a. In the Repository Navigator, expand CUSTOM_SDE > Sources > ETL_LAB_OLTP.
b. Drag the REVN source into the Mapplet Designer. By default, a source qualifier transformation named SQ_REVN is created.
c. If desired, select Layout > Zoom Percent to change the layout of the mapplet in the Mapplet Designer.
9. Add a mapplet output transformation to the mapplet.
a. Select Transformation > Create to open the Create Transformation dialog box.
b. In the list, select Mapplet Output.
c. Name the transformation MAPO_REVN_EXTRACT.
d. Click Create.
e. Click Done.
f. Drag MAPO_REVN_EXTRACT to the right of SQ_REVN.
g. Drag each column from SQ_REVN to a blank row in MAPO_REVN_EXTRACT.
h. Check your work. Your mapplet should look similar to the screenshot:

10. Generate the default SQL query for the source qualifier.
a. Double-click the SQ_REVN source qualifier.
b. Click the Properties tab.

c. For the Sql Query transformation attribute, in the Value field, click the down arrow to open the SQL Editor.
d. Click inside the SQL: field.
e. Click Generate SQL to display the default query.
f. Modify the default SQL query to include a WHERE clause that compares the LAST_UPDATE_DATE column to the $$LAST_EXTRACT_DATE parameter. Hint: You can enter the port and variable manually, or double-click the corresponding objects in the left pane to add them to the query:
WHERE (REVN.LAST_UPDATE_DATE > TO_DATE('$$LAST_EXTRACT_DATE', 'MM/DD/YYYY HH24:MI:SS'))

g. Click OK to close the SQL Editor.
h. Click Apply in the Edit Transformations dialog box.
i. Click OK to close the Edit Transformations dialog box.
11. Validate the mapplet.
a. Select Layout > Arrange.
b. In the Select Outputs dialog box, verify that MAPO_REVN_EXTRACT is selected and click OK.
c. If necessary, select View > Output to view the Output window.
d. Click the Validate tab of the Output window.
e. Select Mapplets > Validate. You should receive the message Mapplet mplt_BC_C_RevenueFact is VALID.
f. If your mapplet is valid, select Repository > Save to update the repository. If you receive a message that the mapplet is not valid, review the steps in this practice and try to troubleshoot. If you need assistance, ask your instructor.
12. Create an SDE mapping.
a. Select Tools > Mapping Designer.
b. Select Mappings > Create.
c. Name the mapping SDE_C_RevenueFact and click OK.
13. Create a parameter to pass a value for the DATASOURCE_NUM_ID to the mapping. DATASOURCE_NUM_ID is a unique number assigned by the DAC to a data source so that the data can be identified in the data warehouse. By default, the unique number assigned to the Oracle E-Business Suite source is 4. This value is passed from the DAC during ETL run time. Because you will use Informatica Workflow Manager and not the DAC to run and verify the SDE mapping in the next practice, you hard-code the value for the $$DATASOURCE_NUM_ID

parameter. Please note that hard-coding this value is for training purposes only and is not the recommended practice.
a. Select Mappings > Parameters and Variables.
b. Click the Add a new variable to this table button.
c. Enter the following values:
Name: $$DATASOURCE_NUM_ID
Type: Parameter
Data type: integer
Initial Value: 4

d. Click OK.
14. Add the mapplet to the mapping.
a. In the Repository Navigator, expand CUSTOM_SDE > Mapplets.
b. Drag mplt_BC_C_RevenueFact from the Navigator into the Mapping Designer workspace.
15. Create a new Expression transformation and add it to the mapping. An Expression transformation is used to calculate values in a single row before writing to the target. In this step, you use the $$DATASOURCE_NUM_ID parameter that you created in an earlier step to place a value of 4 in DATASOURCE_NUM_ID.
a. Select Transformation > Create.
b. Select Expression from the transformation type list.
c. Enter EXPTRANS as the transformation name.
d. Click Create.
e. Click Done.
f. In mplt_BC_C_RevenueFact, use Shift-click to select all five ports, ROW_ID, PROD_ID, PERSON_ID, REVF, and LAST_UPDATE_DATE, and drag them to the EXPTRANS transformation.
g. Double-click EXPTRANS.
h. Click the Ports tab.
i. Click the Add a new port to this transformation button.
j. Give the new port the following attributes:
Name: DATASOURCE_NUM_ID
Data type: Decimal
Prec: 10
Port Type: Output

k. If necessary, select the DATASOURCE_NUM_ID port and use the up arrow to move it to the top of the port list. l. For the DATASOURCE_NUM_ID port, click the down arrow button in the Expression column to open Expression Builder. m. Delete the existing formula (DATASOURCE_NUM_ID). n. Click the Variables tab in the left pane.

o. Expand the Mapping Parameters folder and double-click the $$DATASOURCE_NUM_ID variable to add it to the expression. When you run ETL in the next practice, the DATASOURCE_NUM_ID port will be populated with the initial value (4) defined for the $$DATASOURCE_NUM_ID variable. p. Click OK. q. Click Apply and OK. 16. Add and link the target definition. Target definitions vary by mapping. In this practice, the target for data extracted from the REVN source is the WC_RVF_FS fact staging table in the ETL_LAB_DW database. a. In the Repository Navigator, expand CUSTOM_SDE > Targets and select WC_RVF_FS. b. Drag WC_RVF_FS into the Mapping Designer and place it to the right of EXPTRANS. c. Drag ROW_ID from EXPTRANS to INTEGRATION_ID in the WC_RVF_FS target definition. INTEGRATION_ID stores the primary key or the unique identifier of a record. d. Drag DATASOURCE_NUM_ID, PROD_ID, PERSON_ID, REVF, and LAST_UPDATE_DATE from EXPTRANS onto their corresponding ports in the WC_RVF_FS target definition. 17. Validate the mapping. a. Select Layout > Arrange. b. Select WC_RVF_FS as the target and click OK. The mapping should look similar to the screenshot:

c. If necessary, select View > Output to view the Output window. d. Click the Validate tab of the Output window. e. Select Mappings > Validate. You should receive the message Mapping SDE_C_RevenueFact is VALID. f. If your mapping is valid, select Repository > Save. If you receive a message that the mapping is not valid, review the steps in this practice and try to troubleshoot. If you need assistance, ask your instructor. g. Leave Informatica Designer open for the next practice.

Lesson 7: Working with Informatica Workflow Manager

Practice 7-1: Creating and Running an Informatica Workflow


Goals: To use Informatica Workflow Manager to create a workflow for the SDE mapping you created in the previous practice

Scenario: In the practice for the previous lesson, you created a source dependent extract (SDE) mapping that extracts data from a source table and moves the data into a fact staging table. Now you use Informatica Workflow Manager to create a workflow to run the mapping. Please note that the DAC is used to run ETL mappings in production environments, whereas Informatica client tools are used for testing. Again, as in the practice for the previous lesson, you use custom tables provided specifically for this training.

Outcome: You have a workflow associated with the SDE mapping and have loaded the WC_RVF_FS fact staging table.

Time: 20-30 minutes

Instructions:
1. Verify that the Informatica Services 8.1.1 SP4 service is started. 2. Start the Informatica Workflow Manager. If the Informatica Designer is still open from the previous practice, select Tools > Workflow Manager. If the Informatica Designer is not open, follow these steps: a. Select Start > Programs > Informatica PowerCenter 8.1.1 > Client > PowerCenter Workflow Manager. b. Double-click Oracle_BI_DW_Base. c. Enter Administrator as the username and password and click Connect. 3. Add a database connection to the target database. a. Select Connections > Relational to open the Relational Connection Browser. b. Click New. c. In the Select Subtype dialog box, select Oracle and click OK. d. In the Connection Object Definition dialog box, enter the following values:
Name: etl_lab_dw
User Name: etl_lab_dw
Password: etl_lab_dw
Connect String: ORCL

e. Click OK to return to the Relational Connection Browser.


4. Add a database connection to the source database. a. Click New in the Relational Connection Browser. b. In the Select Subtype dialog box, select Oracle and click OK. c. In the Connection Object Definition dialog box, enter the following values:
Name: etl_lab_oltp
User Name: etl_lab_oltp
Password: etl_lab_oltp
Connect String: ORCL

d. Click OK to return to the Relational Connection Browser. e. Click Close to close the Relational Connection Browser. 5. Create a workflow for the SDE mapping you created. This workflow contains instructions on how to execute the session task for the mapping SDE_C_RevenueFact, which extracts and moves the data from the source table REVN to the target fact staging table WC_RVF_FS. a. Verify that the CUSTOM_SDE folder is open. If it is not open, double-click CUSTOM_SDE to open it or right-click CUSTOM_SDE and select Open. b. Select Tools > Workflow Designer. c. Select Workflows > Create. d. In the Create Workflow dialog box, enter the name s_SDE_C_RevenueFact. e. Click OK. f. Select Tasks > Create to add a session task to the workflow. g. In the Create Task dialog box, select the Session task type. h. Enter the name s_SDE_C_RevenueFact. i. Click Create. j. In the Mappings dialog box, select the SDE_C_RevenueFact mapping and click OK. k. Click Done. l. In the Workflow Designer window, drag the s_SDE_C_RevenueFact session off of the Start task. m. Select Tasks > Link Task, then click the Start task, and drag the link to the s_SDE_C_RevenueFact session task. 6. Edit the session task properties. a. In the Workflow Designer workspace, double-click the s_SDE_C_RevenueFact session task. b. In the General tab, select Fail parent if this task fails and Fail parent if this task does not run. c. Click the Properties tab. d. For the $Source connection value attribute, click the down arrow in the Value field to open the Connection Browser. e. In the Connection Browser, select ETL_LAB_OLTP and click OK. f. For the $Target connection value attribute, click the down arrow in the Value field to open the Connection Browser.

48

Oracle BI Applications 7.9: Implementation for Oracle EBS

Lesson 7: Working with Informatica Workflow Manager

g. In the Connection Browser, select ETL_LAB_DW and click OK. Note: As you will see in subsequent practices, in a production environment you would use connection variables generated by the DAC to designate source and target connections. Because you are in a test environment, you are manually selecting the relational connections you created earlier. h. Click the Config Object tab of the Edit Tasks dialog box. i. In the Error handling section, for the Stop on errors attribute, enter a value of 1. j. Click the Mapping tab. k. In the left pane, select the mplt_BC_C_RevenueFact.SQ_REVN source. l. In the Connections setting in the right pane, click the down-arrow button in the Value field to edit the source connection. m. In the Relational Connection Browser, select ETL_LAB_OLTP, and then click OK. n. In the left pane, select the WC_RVF_FS target. o. In the Connections setting in the right pane, click the down-arrow button in the Value field to edit the target connection. p. In the Relational Connection Browser, select ETL_LAB_DW, and then click OK. q. In the Properties section, change the Target load type attribute from Bulk to Normal. r. In the Properties section, scroll down and select the Truncate target table option. s. Click Apply. t. Click OK. 7. Validate the workflow. a. Select Layout > Arrange > Horizontal. b. If necessary, select View > Output to view the Output window. c. Click the Validate tab of the Output window. d. Select Workflows > Validate. You should receive the message Workflow s_SDE_C_RevenueFact is VALID. e. If your workflow is valid, select Repository > Save. If you receive a message that the workflow is not valid, review the steps in this practice and try to troubleshoot. If you need assistance, ask your instructor. 8. Before running the workflow, use SQL*Plus to query the WC_RVF_FS target table and confirm that there is no data.
WC_RVF_FS is the staging table in the ETL_LAB_DW database that will be populated by running the s_SDE_C_RevenueFact workflow you just created. a. Select Start > Programs > Oracle OraDb11g_home1 > Application Development > SQL Plus. b. Enter ETL_LAB_DW as the username and press Enter. c. Enter ETL_LAB_DW as the password and press Enter. d. At the SQL> prompt, execute the following query:
SELECT COUNT(*) FROM WC_RVF_FS;

e. Verify that the query returns a count of zero. f. Leave SQL*Plus open. 9. Execute the s_SDE_C_RevenueFact workflow to load the data into WC_RVF_FS. a. Return to Informatica Workflow Manager.
b. In the Repository Navigator, expand CUSTOM_SDE > Workflows, right-click s_SDE_C_RevenueFact, and select Start Workflow. Informatica PowerCenter Workflow Monitor should open when you start the workflow. 10. Monitor the progress of the workflow in Workflow Monitor. a. Monitor the progress and verify that the workflow and the session task both have a status of Succeeded before continuing. If the workflow returns a status of Failed, right-click the s_SDE_C_RevenueFact workflow object in Workflow Monitor, select Get Workflow Log, and try to troubleshoot. If the session task returns a status of Failed, right-click the s_SDE_C_RevenueFact session task object in Workflow Monitor, select Get Task Log, and try to troubleshoot. Troubleshooting may require changes to the workflow in Workflow Manager or changes to the SDE mapping in Designer. If you troubleshoot successfully, try re-running the workflow. If you need assistance, ask your instructor. b. When both the workflow and the session complete successfully, close Workflow Monitor, Workflow Manager, and Designer. 11. Use SQL*Plus to query the WC_RVF_FS table to verify that the data has been loaded. a. Return to SQL*Plus, which should still be open. b. At the SQL> prompt, execute the following query:
SELECT COUNT(*) FROM WC_RVF_FS;

c. Verify that the query now returns a count of five, confirming that five rows of data were loaded. d. Close SQL*Plus.
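Optionally, before closing SQL*Plus, you can inspect the loaded rows themselves rather than just the count. A query along these lines should return the five loaded rows; the column list follows the ports you mapped into the target in the previous practice and is assumed to match the WC_RVF_FS table definition:

```sql
-- Inspect the rows loaded by the s_SDE_C_RevenueFact workflow.
-- The column list follows the ports mapped into WC_RVF_FS in the
-- previous practice and is assumed to match the table definition.
SELECT INTEGRATION_ID,
       DATASOURCE_NUM_ID,
       PROD_ID,
       PERSON_ID,
       REVF,
       LAST_UPDATE_DATE
FROM   WC_RVF_FS;
```

Every row should show a DATASOURCE_NUM_ID of 4, the value you hard-coded in the mapping parameter.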

Lesson 8: Exploring SDE and SIL Mappings

Practice 8-1: Exploring a Prebuilt SDE Mapping


Goals: To explore a prebuilt source dependent extract (SDE) mapping in the Oracle Informatica repository

Scenario: The main goal of this practice is to explore the anatomy of a typical prebuilt SDE mapping in the Informatica repository. Extract mappings generally consist of a source table or business component, an expression transformation, and a staging table. Business components are packaged as mapplets, which reside in source-specific folders within the repository. Business components are used to extract data from the source system. An understanding of the anatomy of prebuilt mappings will assist you in the configuration and customization of ETL mappings. This practice explores the function and components of a typical SDE mapping. Note that individual mappings may differ substantially from the examples provided here.

Time: 10-15 minutes

Instructions:
1. Open Informatica Repository Manager and create a new Informatica repository folder. a. Select Start > Programs > Informatica PowerCenter 8.1.1 > Client > PowerCenter Repository Manager. b. Double-click Oracle_BI_DW_Base. c. Enter Administrator as the username and password and click Connect. d. Select Folder > Create. e. Name the new folder Explore_SDE and click OK. f. Click OK to confirm that the folder has been successfully created. g. Select Explore_SDE in the Repository Navigator. h. Select Tools > Designer to open Designer. i. Verify that the new Explore_SDE folder is visible in the Repository Navigator in Designer. j. Close Repository Manager. 2. Copy a mapping to the new folder. a. Verify that the Explore_SDE folder is open in Designer. The folder is open if it is bolded and appears as Explore_SDE (Oracle_BI_DW_Base) in the list in the upper-left corner of the Designer toolbar. b. Expand the Explore_SDE folder. Notice that at this point it contains empty subfolders. c. Expand SDE_ORA11510_Adaptor > Mappings. d. Select the mapping SDE_ORA_GLRevenueFact. e. Select Edit > Copy. f. Select the Explore_SDE folder. g. Select Edit > Paste.
h. Click Yes to confirm that you want to copy SDE_ORA_GLRevenueFact. You are copying this mapping to a custom folder as a precaution to ensure that you do not inadvertently make changes to the mapping in the SDE_ORA11510_Adaptor folder while you explore it. i. Expand Explore_SDE > Mappings and confirm that the SDE_ORA_GLRevenueFact mapping is copied to the Explore_SDE folder.

j. Expand the subfolders under Explore_SDE and notice that all of the related repository objects for this mapping were also copied to the folder: sources, targets, transformations, and mapplets. k. Select Repository > Save. 3. Explore the SDE_ORA_GLRevenue mapping. a. Select Tools > Mapping Designer to open Mapping Designer in the workspace. b. If necessary, expand Explore_SDE > Mappings. c. Drag SDE_ORA_GLRevenueFact into Mapping Designer. This mapping extracts revenue data from tables in the Oracle E-Business Suite source system and stores the data in the W_GL_REVN_FS fact staging table in the Oracle Business Analytics Warehouse. d. If desired, select Layout > Zoom Percent and select a percent to modify the mapping layout in the Mapping Designer workspace. e. Note that there are four components in the mapping and that the data flow is from left to right. The following table provides a high-level explanation of the function of each component in the mapping. You explore each component in detail later in this practice.
mplt_BC_ORA_GLRevenueFact (Mapplet): Extracts revenue transactions from the source system
Exp_W_GL_REVN_FS_Integration_Id (Expression): Calculates values in a single row before writing to the target
mplt_SA_ORA_GLRevenueFact (Mapplet): Converts source-specific data elements into standard formats and then stores them in a staging table
W_GL_REVN_FS (Target Definition): Stores revenue transactions extracted from the source system

4. Explore the parameters and variables for the mapping.

a. Select Mappings > Parameters and Variables to open the Declare Parameters and Variables dialog box. b. Notice there are two parameters listed for this mapping: $$DATASOURCE_NUM_ID and $$TENANT_ID. Values for these parameters are passed by the DAC during run time. As you will see, these parameters are included in expressions for ports in the Exp_W_GL_REVN_FS_Integration_Id expression transformation. c. Click Cancel to close the Declare Parameters and Variables dialog box without making any changes. 5. Explore the mplt_BC_ORA_GLRevenueFact mapplet. a. Right-click mplt_BC_ORA_GLRevenueFact and select Open Mapplet to open it in the Mapplet Designer. This mapplet extracts revenue data from tables in the Oracle E-Business Suite source system. b. If desired, select Layout > Zoom Percent and select a percent to modify the layout in the Mapplet Designer workspace. c. Note that there are three component types in the mapplet and that the data flow is from left to right. The three component types are: source definition, source qualifier, and mapplet output transformation. d. Notice that there are four source definitions for this mapplet. Recall that source definitions represent tables or files that provide source data. Source definitions are imported into the Informatica repository via the Source Analyzer. In this example, there are four tables imported from the Oracle E-Business Suite source: RA_CUSTOMER_TRX_LINES_ALL, RA_CUST_TRX_LINE_GL_DIST_ALL, RA_CUSTOMER_TRX_ALL, and RA_SALESREPS_ALL. e. Double-click RA_CUSTOMER_TRX_ALL to open the Edit Transformations dialog box. f. Click the Ports tab. Notice that all ports are output ports, which means they pass data to the next object in the mapplet. Source definitions provide data, so they contain only output ports. g. Click Cancel to close the Edit Transformations dialog box. h. Double-click the SQ_GL_REVENUE_EXTRACT source qualifier to open the Edit Transformations dialog box. 
When you add a relational or a flat file source definition to a mapplet or mapping, you need to connect it to a source qualifier transformation. The source qualifier transformation represents the rows that the Informatica Integration Service reads when it runs a session. In this example, all four source definitions connect to a single source qualifier. i. Click the Ports tab. Notice that all ports are input/output. Input/output ports receive data and pass it unchanged. j. Click the Properties tab. k. For the Sql Query transformation attribute, click the down arrow in the Value field to open the SQL Editor to display the SQL statement that is used to retrieve the data from the sources. You can use the SQL Editor to view, modify, or validate the SQL. l. Scroll to the bottom of the SQL statement and notice that the SQL contains the $$LAST_EXTRACT_DATE parameter. Recall that this parameter is used to identify records that have changed since the last ETL run and that the value for this parameter is passed from the DAC during run time. (You define parameters and variables for mapplets similar to how you define them for mappings: by selecting Mapplets > Parameters and Variables to open the Declare Parameters and Variables dialog box.)
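For orientation, the incremental extract predicates in these source qualifier overrides generally follow the pattern sketched below. This is a simplified illustration, not the actual shipped query; in particular, the date format mask is an assumption:

```sql
-- Simplified sketch of an incremental extract filter (not the full
-- prebuilt query). $$LAST_EXTRACT_DATE is replaced by the DAC at
-- run time, so only rows changed since the last ETL run are read.
SELECT RA_CUSTOMER_TRX_ALL.CUSTOMER_TRX_ID,
       RA_CUSTOMER_TRX_ALL.LAST_UPDATE_DATE
FROM   RA_CUSTOMER_TRX_ALL
WHERE  RA_CUSTOMER_TRX_ALL.LAST_UPDATE_DATE >
       TO_DATE('$$LAST_EXTRACT_DATE', 'MM/DD/YYYY HH24:MI:SS')
```

The same technique appears throughout the prebuilt SDE mappings: the DAC records the timestamp of each successful extract and supplies it as $$LAST_EXTRACT_DATE on the next run.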
m. Click Cancel to close the SQL Editor. n. Click the Sources tab. Notice that this tab lists the source definitions associated with this source qualifier. o. Click Cancel to close the Edit Transformations dialog box. p. Double-click MAPO_GL_REVENUE_EXTRACT to open the Edit Transformations dialog box. q. Click the Ports tab. Notice that all ports are input. Input ports receive data. In this example, the MAPO_GL_REVENUE_EXTRACT mapplet output transformation is the target of this mapplet and receives data from the SQ_GL_REVENUE_EXTRACT source qualifier. The MAPO_GL_REVENUE_EXTRACT mapplet output transformation passes output from the mapplet to the next transformation in the mapping (Exp_W_GL_REVN_FS_Integration_Id Expression in this example). r. Click Cancel to close the Edit Transformations dialog box. 6. Explore the Exp_W_GL_REVN_FS_Integration_Id expression transformation. An expression transformation is used to calculate values in a single row before writing to the target. a. Select Tools > Mapping Designer. Alternatively, you can click the Mapping Designer icon on the toolbar. The SDE_ORA_GLRevenueFact mapping should still be open. b. Double-click the Exp_W_GL_REVN_FS_Integration_Id expression transformation to open the Edit Transformations dialog box. c. Click the Ports tab. Notice that port type varies by port. While most of the ports are input/output ports that receive data and pass it unchanged, there are also input ports, variable ports, which are used to store values across rows, and output ports with expressions. d. Scroll to locate the DATASOURCE_NUM_ID port. Notice the value in the Expression field. DATASOURCE_NUM_ID is an output port that gets its value from the $$DATASOURCE_NUM_ID parameter. Recall that this value is a unique identifier for each data source and is passed by the DAC during run time. e. Scroll to locate the TENANT_ID port. f. Click the down arrow in the Expression field to open Expression Editor. 
Notice that the parameter $$TENANT_ID is used in an IIF conditional expression. g. Notice the Functions, Ports, and Variables tabs in the left pane, and the Numeric and Operator keypads. You can use these to build expressions in the Expression Editor. h. Click Cancel to close Expression Editor. i. Click Cancel to close the Edit Transformations dialog box. 7. Explore the mplt_SA_ORA_GLRevenueFact mapplet. a. Select Tools > Mapplet Designer. Alternatively, you can click the Mapplet Designer icon on the toolbar. b. If necessary, expand Explore_SDE > Mapplets. c. Drag the mplt_SA_ORA_GLRevenueFact mapplet into the Mapplet Designer. This is a Source Adapter mapplet, which converts source-specific data elements into standard formats and then stores them in a staging table. The source independent load (SIL) mapping then uses an Analytic Data Interface (ADI) mapplet to pick up these records, which are already transformed into standard format.

d. Note that there are five components in the mapplet and that the data flow is from left to right. The components are two unconnected lookup procedures, a mapplet input transformation, an expression transformation, and a mapplet output transformation. e. Notice the LKP_LOC_CURRENCY and LKP_CUSTLOC_CUST_LOC_ID lookup transformations. These are unconnected lookup transformations. An unconnected lookup transformation is a stand-alone transformation that is not linked to other transformations in a mapping or mapplet. A lookup transformation is called in an expression that uses the :LKP reference qualifier. f. Double-click the LKP_LOC_CURRENCY lookup transformation to open the Edit Transformations dialog box. g. Click the Properties tab and notice that the lookup table name is W_LOC_CURR_CODE_TMP. This lookup transformation retrieves the local currency code from W_LOC_CURR_CODE_TMP. h. Click Cancel to close the Edit Transformations dialog box. i. Double-click the MAPI_SAI_GL_REVENUE input transformation. j. Click the Ports tab and notice that all ports are output. This transformation receives the output of the Exp_W_GL_REVN_FS_Integration_Id expression transformation in the SDE_ORA_GLRevenueFact mapping and passes the data to the EXP_SAI_GL_REVENUE expression transformation in this mapplet. The ports have been renamed with the INP_ prefix. k. Click Cancel to close the Edit Transformations dialog box. l. Double-click the EXP_SAI_GL_REVENUE expression transformation. m. Click the Ports tab. Scroll through the ports and notice that if the input data is transformed, the data is passed to the expression transformation as input only. After the data is transformed, it is output through a new port, which is prefixed with EXT_. n. Locate the EXT_CUST_SOLD_TO_ID port and click the down arrow in the Expression field to open the Expression Editor. 
Notice that the :LKP reference qualifier is used in the expression to return values from the LKP_CUSTLOC_CUST_LOC_ID unconnected lookup transformation. Notice also that the TO_CHAR function is used to convert the data. o. Click Cancel to close the Expression Editor. p. Click Cancel to close the Edit Transformations dialog box. q. Double-click the MAPO_SAI_GL_REVENUE output transformation. r. Click the Ports tab and notice that all ports are input ports with the EXT_ prefix. These ports exactly match the input ports of an Analytic Data Interface (ADI) mapplet in the corresponding SIL mapping. s. Click Cancel to close the Edit Transformations dialog box. 8. Explore the target definition. a. Select Tools > Mapping Designer or click the Mapping Designer icon on the toolbar. The SDE_ORA_GLRevenueFact mapping should still be open. b. Double-click the W_GL_REVN_FS target definition. c. Click the Ports tab and notice that all ports are input ports. d. Click Cancel. e. Notice that the EXT_* output ports of the mplt_SA_ORA_GLRevenueFact mapplet are used to populate the ports in the W_GL_REVN_FS target definition. This is the fact staging table in the Oracle Business Analytics Warehouse. This table is one of the source definitions in the corresponding SIL mapping, SIL_GLRevenueFact. f. Select Repository > Exit to close Informatica PowerCenter Designer.

Practice 8-2: Exploring a Prebuilt SIL Mapping


Goals: To explore a prebuilt source independent load (SIL) mapping in the Oracle Informatica repository

Scenario: The main goal of this practice is to explore the anatomy of a typical prebuilt SIL mapping in the Informatica repository. SIL mappings select data from Oracle Business Analytics Warehouse (OBAW) staging tables, perform transformations, and load data into OBAW tables. SIL mappings differ from SDE mappings with regard to source and target tables. SIL mappings are source independent in that they do not select data from the transactional source, but rather from the warehouse staging tables. Whereas the targets of SDE mappings are OBAW staging tables, the targets of SIL mappings are the final OBAW tables. Also, SIL mappings are typically more complex than SDE mappings, because the data undergoes more transformation before being passed to the target. An understanding of the anatomy of prebuilt SIL mappings will assist you in the configuration and customization of ETL mappings. This practice explores the function and components of a typical SIL mapping. Note that individual mappings may differ substantially from the examples provided here.

Time: 15-20 minutes

Instructions:
1. Open Informatica Repository Manager and create a new Informatica repository folder. a. Select Start > Programs > Informatica PowerCenter 8.1.1 > Client > PowerCenter Repository Manager. b. Double-click Oracle_BI_DW_Base. c. Enter Administrator as the username and password and click Connect. d. Select Folder > Create. e. Name the new folder Explore_SIL and click OK. f. Click OK to confirm that the folder has been successfully created. g. Select Explore_SIL in the Repository Navigator. h. Select Tools > Designer to open Designer. i. Verify that the new Explore_SIL folder is visible in the Repository Navigator in Designer. j. Close Repository Manager. 2. Copy a mapping to the new folder. a. Verify that the Explore_SIL folder is open in Designer. The folder is open if it is bolded and appears as Explore_SIL (Oracle_BI_DW_Base) in the list in the upper-left corner of the Designer toolbar. b. Expand the Explore_SIL folder. Notice that at this point it contains empty subfolders. c. Expand SILOS > Mappings. d. Select the mapping SIL_GLRevenueFact. e. Select Edit > Copy.
f. Select the Explore_SIL folder. g. Select Edit > Paste. h. Click Yes to confirm that you want to copy SIL_GLRevenueFact. Again, you are copying this mapping to a custom folder as a precaution to ensure that you do not inadvertently make changes to the mapping in the SILOS folder while you explore it. i. Expand Explore_SIL > Mappings and confirm that the SIL_GLRevenueFact mapping is copied to the Explore_SIL folder.

j. Expand the subfolders under Explore_SIL and notice that all of the related repository objects for this mapping were also copied to the folder: sources, targets, transformations, and mapplets. k. Select Repository > Save. 3. Explore the SIL_GLRevenueFact mapping. a. Open Mapping Designer in the workspace. b. If necessary, expand Explore_SIL > Mappings. c. Drag SIL_GLRevenueFact into Mapping Designer. This mapping moves data from a fact staging table in the Oracle Business Analytics Warehouse (OBAW) into the W_GL_REVN_F fact table in the OBAW. The data undergoes significant changes in this mapping (through several mapplets, lookups, and transformations) as it is moved from the source definition tables to the target table. d. If desired, select Layout > Zoom Percent and select a percent to modify the mapping layout in the Mapping Designer workspace. e. Notice that the SIL mapping has many of the same transformation objects as the SDE mapping you explored in the previous practice: source definition, source qualifier, mapplet, lookup procedure, expression, and target definition. In addition, the SIL mapping contains the following transformation objects: filter and update strategy. In the remainder of this practice, you explore the objects and the general flow of the SIL mapping. 4. Explore the source definitions. Notice that the source definitions are all located in the OBAW. They include dimension tables, which are populated by dimension SIL mappings, and the W_GL_REVN_FS fact staging table, which is populated by the SDE_ORA_GLRevenueFact mapping that you explored in the previous practice.
5. Explore the Sq_W_GL_REVN_FS source qualifier transformation. This source qualifier transformation performs the same basic function as the source qualifier in the SDE mapping. The source qualifier transformation represents the row set retrieved from the source objects before undergoing subsequent transformations. The Sql Query transformation attribute, accessed on the Properties tab, contains the SQL statement that is used to retrieve data from the sources. If you would like to explore this source qualifier transformation in more depth, double-click it to open the Edit Transformations dialog box. 6. Explore the MPLT_GET_ETL_PROC_WID mapplet. This is a reusable mapplet that looks up and retrieves the ETL_PROC_WID from the W_PARAM_G table in the OBAW. ETL_PROC_WID stores the ID of the ETL process information, which uniquely identifies each ETL run. This ID is also displayed as the Process ID on the Current Run / Run History screen in the DAC. The value for ETL_PROC_WID is passed by the DAC during run time to the $$ETL_PROC_WID parameter in the MPLT_GET_ETL_PROC_WID mapplet. If you would like to explore the processing for this mapplet in more depth, open it in Mapplet Designer. 7. Explore the Lkp_W_GL_REVN_F lookup procedure. This cached lookup transformation is responsible for looking up the required DATASOURCE_NUM_ID and INTEGRATION_ID columns in the target table, W_GL_REVN_F. Together, these two columns are the primary key that uniquely identifies rows in the target table. These columns are compared against the corresponding staging area columns to detect new records or identify changed records. Rows that have the same INTEGRATION_ID and DATASOURCE_NUM_ID will be updated in the target table. The rest of the rows will be inserted. The SQL statement of this lookup transformation is overridden with a join with the staging area table, W_GL_REVN_FS, to minimize the number of records to be cached by this lookup. 
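Conceptually, the override can be sketched as follows. This is a simplified illustration of the technique, not the SQL shipped with the mapping:

```sql
-- Simplified sketch of the lookup SQL override technique: join the
-- target fact table to the staging table so the lookup caches only
-- rows that could possibly match this ETL run, instead of the
-- entire W_GL_REVN_F table.
SELECT F.INTEGRATION_ID,
       F.DATASOURCE_NUM_ID
FROM   W_GL_REVN_F F,
       W_GL_REVN_FS FS
WHERE  F.INTEGRATION_ID = FS.INTEGRATION_ID
  AND  F.DATASOURCE_NUM_ID = FS.DATASOURCE_NUM_ID
```

A staging row that finds a match in this cache is an update candidate; a row with no match is a new record to insert.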
If you would like to explore this lookup procedure in more depth, double-click it to open the Edit Transformations dialog box. 8. Explore the Exp_W_GL_REVN_F_Update_Flg expression transformation. a. Double-click the Exp_W_GL_REVN_F_Update_Flg expression transformation to open the Edit Transformations dialog box. The description explains that this expression transformation evaluates the value of the Update Flag port, UPDATE_FLG, which is used to take actions such as insert and delete in the update strategy transformation, which is located downstream in this mapping. The table lists the possible values for the UPDATE_FLG port:
I: Insert new record
B: Insert new record and mark for soft delete
U: Update existing record
D: Update existing record and mark for soft delete
X: Reject

b. Click the Ports tab and select the UPDATE_FLG port. c. Read the description for an understanding of the evaluation logic of the port. d. Click the down arrow in the Expression field to open the Expression Editor and view the evaluation logic for the port. e. Click Cancel to close the Expression Editor. f. Click Cancel to close the Edit Transformations dialog box.
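As a loose sketch of the pattern only: the shipped expression is considerably more elaborate (it also handles the soft-delete values B and D), and the port names LKP_INTEGRATION_ID and CHANGED_FLG here are hypothetical, but the evaluation follows a nested IIF structure along these lines:

```
-- Hypothetical sketch in the Informatica expression language.
-- LKP_INTEGRATION_ID and CHANGED_FLG are illustrative port names,
-- not the actual ports in the shipped transformation.
IIF(ISNULL(LKP_INTEGRATION_ID), 'I',    -- no matching target row: insert
    IIF(CHANGED_FLG = 'Y', 'U', 'X'))   -- changed: update; unchanged: reject
```

The downstream update strategy and filter transformations then act on the flag value that this expression assigns to each row.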

Oracle BI Applications 7.9: Implementation for Oracle EBS


Lesson 8: Exploring SDE and SIL Mappings

9. Explore the Fil_W_GL_REVN_F filter transformation. This filter transformation uses an IIF formula to filter out records that have an UPDATE_FLG value of X. To view the filter condition logic, open the transformation, click the Properties tab, and click the down arrow in the Value field of the Filter Condition transformation attribute.

10. Explore the mplt_Curcy_Conversion_Rates mapplet. This mapplet is responsible for retrieving the correct exchange rates for a given date wherever a currency conversion to any of the three possible global currencies is involved. Currency conversions are required because your business might have transactions involving multiple currencies. The OBAW stores amounts in document currency (the currency of the transaction), local currency (the currency in which accounting entries are recorded), and global currencies. Out of the box, Oracle Business Intelligence Applications provides three global currencies, which are the common currencies used by the OBAW. For every monetary amount extracted from the source, the load mapping loads the document currency and local currency amounts into the target table. It also loads the exchange rates required to convert the document amount into each of the three global currencies. Thus, the target table has two amount columns and three exchange rate columns. You will learn more about configuring global currencies in Lesson 10: Configuring Analytical Applications. If you would like to explore the processing for this mapplet in more depth, open it in Mapplet Designer.

11. Explore the EXP_Custom expression transformation. This expression transformation is used when you need to customize the mapping by adding columns to an existing fact or dimension table. For additional columns to appear in the data warehouse, they must first be passed through the ETL process. Both the prebuilt ETL mappings and the OBAW tables are extensible.
Oracle BI Applications provides a methodology to extend the prebuilt ETL mappings to include additional columns and load the data into existing OBAW tables. The EXP_Custom expression transformation is part of that methodology: it has a single placeholder column, X_CUSTOM, which marks a safe path through the mapping. All extension logic, including any additional transformations you add, should follow the same route through the mapping as X_CUSTOM. This process of customizing and extending the ETL mappings and the OBAW is covered in depth in Lessons 12-14.

12. Explore the MPLT_LKP_W_CUSTOMER_FIN_PROFL_D mapplet. This mapplet performs a lookup to resolve the CUSTOMER_FIN_PROFL_WID column, which is the key to the customer accounts dimension in W_GL_REVN_F. If you would like to explore the processing for this mapplet in more depth, open it in Mapplet Designer.

13. Explore the mplt_SIL_GLRevenueFact mapplet. This mapplet is responsible for transforming specific types of columns in the target table W_GL_REVN_F. Work done in this mapplet includes code-name pair resolution, dimension surrogate key resolution, system columns generation, and so forth. If you would like to explore the processing for this mapplet in more depth, open it in Mapplet Designer.

14. Explore the Upd_W_GL_REVN_F_Ins_Upd update strategy transformation. This transformation decides whether incoming rows should be inserted or updated in the target table based on the value of the UPDATE_FLG port. Recall that the value of this port was set for each row in the Exp_W_GL_REVN_F_Update_Flg expression transformation. To view the update strategy logic, open Upd_W_GL_REVN_F_Ins_Upd, click the Properties tab, and open the expression for the Update Strategy Expression attribute. The logic is as follows: If
UPDATE_FLG for a row is set to I (insert new record) or B (insert new record and mark for soft delete), the record is flagged for insertion. If UPDATE_FLG is set to D (update existing record and mark for soft delete) or U (update existing record), the record is flagged for update. All other records are flagged for rejection.

15. Explore the W_GL_REVN_F target definition. This fact table stores all the revenue transactions for the Oracle BI Financial Analytics application in the OBAW. All transaction amounts are stored in document currency and local currency, and the table also maintains three global currency exchange rates. The ACCT_DOC_TYPE_WID port in this table is the foreign key to the W_XACT_TYPE_D table; it distinguishes the transaction type of the record. The DOC_STATUS_WID port is the foreign key to the W_STATUS_D table; it is used to determine the status of the record: open, cleared, posted, or unposted.

16. That completes your exploration of a typical SIL mapping. Close all open Informatica clients.
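As a recap of the currency handling described in step 10, the storage model can be sketched in a few lines of Python. The field names here (DOC_AMT, LOC_AMT, GLOBAL1_EXCHANGE_RATE, and so on) are illustrative assumptions for the sketch, not necessarily the actual OBAW column names.

```python
# Illustrative sketch of the two-amounts-plus-three-rates storage model from
# step 10: the fact row carries the document and local amounts plus three
# global exchange rates, so any global-currency amount can be derived later.
# Field names are assumptions for illustration only.

def global_amounts(fact_row):
    """Derive the three global-currency amounts from the document amount."""
    doc_amt = fact_row["DOC_AMT"]
    return [round(doc_amt * fact_row[rate], 2)
            for rate in ("GLOBAL1_EXCHANGE_RATE",
                         "GLOBAL2_EXCHANGE_RATE",
                         "GLOBAL3_EXCHANGE_RATE")]

row = {"DOC_AMT": 100.0,   # amount in document (transaction) currency
       "LOC_AMT": 92.0,    # amount in local (accounting) currency
       "GLOBAL1_EXCHANGE_RATE": 1.0,
       "GLOBAL2_EXCHANGE_RATE": 0.92,
       "GLOBAL3_EXCHANGE_RATE": 110.0}
print(global_amounts(row))  # [100.0, 92.0, 11000.0]
```

Storing the rates alongside the amounts, rather than three precomputed amounts, keeps the fact row compact while still allowing conversion to each configured global currency at query time.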

Lesson 9: Working with the Data Warehouse Administration Console

Practice 9-1: Exploring the DAC


Goal: To explore the Oracle BI Data Warehouse Administration Console (DAC) to become familiar with its functionality, objects, and properties

Scenario: The DAC provides a framework for the entire life cycle of data warehouse implementations. It allows you to create, configure, execute, and monitor modular data warehouse applications in a parallel, high-performing environment. All DAC repository objects are associated with a source system container, which stores application objects in a hierarchical framework that defines a data warehouse application. The DAC allows you to view and modify the repository application objects based on the source system container you specify.

Time: 20-30 minutes
Instructions:
1. Open the DAC client.
a. Select Start > Programs > Oracle Business Intelligence > Oracle DAC > DAC Client.
b. Log in to the DAC connection using dac as the table owner name and password.

2. Explore the Design view.
a. If necessary, click Design on the toolbar to navigate to the Design view. The DAC Design view provides access to functionality related to creating and managing subject areas.
b. If necessary, select the Oracle 11.5.10 source system container in the list. When the Design view is active, the Source System Container list appears to the right of the view buttons. It allows you to select the source system container that holds the repository objects that correspond to a specific source system.
c. The navigation tree is visible on the left side of the DAC window. It displays all the metadata corresponding to the selected source system container. You cannot change the metadata for preconfigured containers. If you want to customize the metadata in a preconfigured container, you must first make a copy of the container. You learn how to create a custom container and the associated metadata in Lesson 11: Customizing DAC Metadata and Running an Execution Plan.
d. Notice that the root nodes in the navigation tree correspond to the tabs in the top pane of the DAC window on the right: Subject Areas, Tables, Indices, and so forth.
e. Expand the Subject Areas node in the navigation tree.
f. If necessary, click the Subject Areas tab in the top pane. Notice that the subject areas listed in the navigation tree correspond to the subject areas listed in the top pane. These are all the subject areas associated with the selected source system container.
g. Double-click Tables in the navigation tree. This expands the navigation tree to display all the tables corresponding to this container, automatically selects the Tables tab in the top pane, and displays all the tables in list mode.
h. Double-click any one of the tables in the navigation tree. The table is displayed in single-record mode in the list.
i. Double-click Tables in the navigation tree to return to list mode.
j. Click the Name column to sort the table records by name.
k. If desired, drag column headings to reorder them.
l. Click the Query button.
m. In the Name field, enter W_GL_REVN_F and click Go. Querying is a way to locate one or more records that meet your specified criteria. Query functionality is available in every DAC screen. You can use query commands and operators to define your query criteria.
n. Right-click anywhere in the list to view the right-click menu. The commands available in right-click menus depend on which tab is active. Do not select any right-click commands. Close the right-click menu.

3. Explore source system folders.
a. Click the Source System Folders tab.
b. Notice there are three logical folders, Extract, Load, and Post Load, listed for the Oracle 11.5.10 container and that each logical folder points to a physical folder. Where are the physical folders located?

4. Explore subject areas.
a. Double-click Subject Areas in the navigation tree or click the Subject Areas tab. The Subject Areas tab lists all the subject areas associated with the selected source system container. A subject area is a logical grouping of tables related to a particular subject or application context, together with the tasks required to load those tables. Subject areas are assigned to execution plans, which can be scheduled for full or incremental loads.
b. Notice that some of the DAC interface tabs have common elements, such as columns or subtabs. The following table provides a description of some of these common elements:

Name: A column that specifies the name of the repository, warehouse, or transactional database object
Inactive: A column that indicates whether an object is inactive. Inactive objects do not participate in the ETL process.
Owner: A column that specifies the source system container in which the object was created
Edit: A subtab that allows you to edit an object that is selected in the top pane (preconfigured objects cannot be edited)
Description: A subtab that displays a description of the object selected in the top pane, which you can view or edit

c. Notice some of the additional properties associated with each subject area. The following table provides a description of these properties:

Configuration Tag Tasks Only: Indicates whether configuration tag tasks are the only tasks associated with this subject area that will participate in the ETL process. If this check box is selected, only the tasks associated with the configuration tag will be chosen by the DAC when the subject area is assembled.
Last Designed: Indicates the date and time the subject area was last assembled

d. Click the Financials - Revenue subject area.
e. If necessary, click the Edit subtab. Notice that some of the subject area properties are displayed in this tab.
f. Click the Description subtab to view a brief description of the subject area.
g. Click the Tasks subtab. This tab displays the tasks associated with the selected subject area and allows you to add, remove, and inactivate tasks for a custom container. It includes the following properties:

Parent Group: Displays the task group name if the task belongs to a task group
Phase: Identifies the task phase of the ETL process
Autogenerated: Indicates whether the task was generated by the DAC's task generation process
Is Group: Indicates whether the task is a task group

h. How many of the tasks associated with the Financials - Revenue subject area are a task group? Hint: Use the query feature.

i. Click the Tables subtab. This tab displays the tables that are associated with the selected subject area. It allows you to add or remove tables for custom containers.
j. Click the Configuration Tags subtab. This tab displays the configuration tags that are associated with this subject area. It includes the following properties:

Include Tasks: Indicates whether the configuration tag tasks will be executed with the selected subject area
Context Disabled: Indicates (if checked) that the configuration tag is globally disabled (set as Inactive in the Configuration Tags parent tab)

5. Explore tables.
a. Click the Tables tab in the top pane. The Tables tab lists all the tables associated with the selected source system container. Tables in the DAC are physical database tables defined in the database schema; these can be transactional database tables or data warehouse tables. Table types can be fact, dimension, hierarchy, aggregate, and so on, as well as flat files that can be sources or targets. The Tables tab allows you to view and edit existing tables and to create new tables for custom containers.
b. Notice the properties associated with each table. The following table provides a description of properties specific to tables:

Table Type: Indicates the type of table: file, source, dimension, aggregate, and so on
Warehouse: Indicates whether the table is a warehouse table. If the warehouse flag is not selected, the schema creation process will not include this table.
Image Suffix: Indicates the suffix for image tables; applicable to Siebel source tables only
Is MultiSet: Indicates whether the table is a MultiSet table; applicable only to Teradata databases
Has Unique Primary Index: Indicates whether the table has a Unique Primary Index; applicable only to Teradata databases

c. Click the Query button.
d. Enter W_GL_REVN_F in the Name field and click Go. Recall that this is the revenue fact table.
e. Click the Related Tables subtab. Related tables participate in the ETL process in addition to the tables that are associated with this table.
f. Click the Columns subtab. This tab lists the columns associated with the selected table. It also allows you to add columns and foreign key column relationships to the selected table for custom containers.
g. Click the Multi-Column Statistics subtab. This tab is applicable to Teradata databases only.
h. Click the Indices (RO) subtab. This tab displays a read-only list of indices that belong to the selected table.
i. Click the Source for Tasks (RO) subtab. This tab displays a read-only list of tasks that use the selected table as a source.
j. Click the Target for Tasks (RO) subtab. This tab displays a read-only list of tasks that use the selected table as a target.
k. Click the Conditional for Tasks (RO) subtab. This tab displays a read-only list of tasks that are optional tasks for the selected table.

6. Explore indices.
a. Click the Indices tab in the top pane. The Indices tab displays a list of all the indices associated with the selected source system container. It is recommended that you do not register any indices for source tables. During the ETL process, when a table is going to be truncated, all the indices defined in the repository are dropped before the data is loaded and re-created automatically after the data is loaded. Dropping and re-creating indices in this way improves ETL performance. In addition, the preconfigured workflows have the bulk load option turned on, and a bulk load will fail if there are indices on the table. Therefore, it is important to keep the index definitions in sync with the database. For example, if you create an index on the database and it is not registered in the repository, the index will not be dropped and the load will fail.
b. Notice the properties associated with each index.
The following table provides a description of properties specific to indices:

Table Name: Table for which an index is created
Index Usage: Usage of the index: ETL or Query. An ETL index is typically used during the ETL process. A Query index is an index used only during the reporting process. You should have a clear understanding of when and where the index will be used at the time of registering the index.
# Unique Columns: For unique indices, the number of columns that will be unique
Is Unique: Indicates whether the index is unique
Is Clustered: Indicates whether the index is clustered. There can be only one clustered index per table.
Is Bitmap: Indicates whether the index is of the bitmap type
Allow Reverse Scan: Applicable only for DB2-UDB databases. The index will be created with the Allow Reverse Scan option.
Always Drop & Create: Indicates whether the index will be dropped and created regardless of whether the table is being loaded using a full load or incremental load
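The drop-and-recreate behavior described in step 6.a can be sketched as a simple control flow. This is an illustrative sketch: the function, the StubDB class, and the index name are all invented for demonstration; the DAC itself issues the corresponding DDL against the warehouse database.

```python
# Illustrative sketch of the DAC's index handling around a truncate-and-load:
# indices registered in the repository are dropped before the bulk load and
# re-created afterwards, and the table is then analyzed. An index that exists
# only on the database (not registered) would never be dropped, so the bulk
# load would fail. All names below are illustrative assumptions.

class StubDB:
    """Minimal stand-in for the warehouse; records the order of operations."""
    def __init__(self):
        self.log = []
    def truncate(self, table):
        self.log.append(("truncate", table))
    def drop_index(self, index):
        self.log.append(("drop_index", index))
    def bulk_load(self, table):
        self.log.append(("bulk_load", table))
    def create_index(self, index):
        self.log.append(("create_index", index))
    def analyze(self, table):
        self.log.append(("analyze", table))

def load_with_index_management(table, registered_indices, db):
    db.truncate(table)
    for idx in registered_indices:   # only *registered* indices are dropped
        db.drop_index(idx)
    db.bulk_load(table)              # bulk load fails if any index remains
    for idx in registered_indices:
        db.create_index(idx)
    db.analyze(table)                # refresh statistics after re-creation

db = StubDB()
load_with_index_management("W_GL_REVN_F", ["W_GL_REVN_F_U1"], db)  # hypothetical index name
print(db.log)
```

The ordering recorded in the log (truncate, drop, load, create, analyze) is the point: registration in the repository is what places an index inside this managed sequence.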

c. Index properties are specific to different database types. You can use the list to display database-specific properties.
d. Click the Columns subtab. This tab displays the list of columns that make up the index. It includes the following properties:

Position: Indicates the position of the column in the index
Sort Order: Indicates whether the sort order is ascending or descending

e. Click the Databases subtab. This tab lists the database types that apply to the selected index. If no database type is indicated, the index will not be created.

7. Explore tasks.
a. Click the Tasks tab in the top pane. The Tasks tab lists all the tasks associated with the selected source system container. A task is a unit of work for loading one or more tables. A task comprises source and target tables, a phase, an execution type, truncate properties, and commands for full or incremental loads. When you assemble a subject area, the DAC assigns tasks to it. Tasks that are automatically assigned to the subject area by the DAC are indicated by the Autogenerated flag in the Tasks subtab of the Subject Areas tab. You learn more about assembling subject areas in Lesson 11: Customizing DAC Metadata and Running an Execution Plan.
b. Query for the SDE_ORA_GLRevenueFact task.
c. Notice the Command for Incremental Load and Command for Full Load properties. A table can be loaded in full mode or incremental mode. Full mode refers to data loaded for the first time or data that is truncated and then loaded. Incremental mode refers to new or changed data being added to the existing data. The load commands for this task are SDE_ORA_GLRevenueFact and SDE_ORA_GLRevenueFact_Full. Recall that these commands correspond to the Informatica workflows that execute the tasks for the Informatica mapping SDE_ORA_GLRevenueFact.
d. Notice the folder name, Extract. Recall that Extract is the logical name of the source system folder that points to the physical folder SDE_ORA11510_Adaptor in the Informatica repository (Oracle_BI_DW_Base).
e. Notice the Primary Source and Primary Target properties. These are the logical database connections for the primary source database and primary target database, respectively. You set up these connections when you configured the DAC in Practice 4-1: Configuring the Training Environment.
f. In which phase of the ETL process does this task occur?

g. Is this task executed via an external program, SQL file, Informatica, or a stored procedure?

h. Are there any pre- or post-SQL commands executed with this task?

i. If this task fails, will dependent tasks be executed?


Oracle BI Applications 7.9: Implementation for Oracle EBS 67

Lesson 9: Working with the Data Warehouse Administration Console

j. Are the source tables for this task located in the transactional database or the data warehouse?

k. Which of the following is not a source table for this task?
- RA_CUSTOMER_TRX_ALL
- RA_CUSTOMER_TRX_LINES_ALL
- RA_CUST_TRX_LINE_GL_DIST_ALL
- W_GL_REVN_F

l. What is the target table for this task?

m. What type of table is the target table?

n. Query for the SIL_GLRevenueFact task.
o. In which phase of the ETL process does this task occur?

p. Is this task executed via an external program, SQL file, Informatica, or a stored procedure?

q. Are the source tables for this task located in the transactional database or the data warehouse?

r. Which source table for this task is the target table for the SDE_ORA_GLRevenueFact task?

s. What is the primary source table for this task and what kind of table is it?

t. What is the target table for this task?

u. What type of table is the target table?

v. True or false? The target table is truncated regardless of whether a full or incremental load is run.
w. What happens to indices when the target table is truncated?

8. Explore task groups.
a. Click the Task Groups tab. A task group is a group of tasks that you define because you want to impose a specific order of execution. A task group is considered to be a special task.
b. Select TASK_GROUP_Extract_BusinessLocationDimension in the list.
c. Click the Child Tasks subtab. This tab lists all the tasks that belong to the selected task group. The execution order identifies the order in which the tasks are executed.
d. Click the Source Tables (RO) subtab. This tab displays a read-only list of the tables from which the task group gets data. It also identifies the tasks to which the tables belong.
e. Click the Target Tables (RO) subtab. This tab displays a read-only list of the tables into which the task group loads data. It also identifies the tasks to which the tables belong.

9. Explore configuration tags.
a. Click the Configuration Tags tab. A configuration tag is an object that controls the inclusion of tasks in subject areas. When a task is tagged, it is not eligible to be included in the collection of tasks for any subject area unless the tag is part of the subject area definition (the Include Tasks property).
b. Scroll to the bottom of the list and select the Oracle Extract Value Set Hierarchies configuration tag.
c. Click the Subject Areas subtab to view the subject areas that belong to a configuration tag or to add subject areas to a configuration tag for custom containers.
d. Notice the Configuration Tag Tasks Only field. This field indicates whether configuration tag tasks are the only tasks associated with this subject area that will participate in the ETL process. If this check box is selected, only the tasks associated with the configuration tag will be chosen by the DAC when the subject area is assembled.
e. Click the Tasks subtab, which lists the tasks associated with the configuration tag selected in the top window.

10. Explore source system parameters.
a. Click the Source System Parameters tab. The Source System Parameters tab lists all the source system parameters associated with the selected source system container. It allows you to edit existing parameters and to configure new ones for custom containers.
b. Scroll through the source system parameters to get a sense of the parameters and their values. For example, notice the $$DATASOURCE_NUM_ID parameter, which you worked with in
earlier practices. It is used to populate DATASOURCE_NUM_ID in Informatica mappings during run time.

11. Explore the Setup view.
a. Click Setup on the toolbar to navigate to the Setup view. The Setup view provides access to functionality related to setting up DAC system properties, Informatica servers, database connections, and email notification. You already worked with some of these properties when you configured the DAC in Practice 4-1: Configuring the Training Environment.
b. Click the DAC Systems Properties tab. This tab enables you to configure various properties that determine the behavior of the DAC server. The following table lists a few of the properties and their functions. For more information about DAC systems properties, refer to the Oracle Business Intelligence Data Warehouse Administration Console Guide.
Analyze Frequency (in days): For DAC metadata tables, the frequency (in days) at which the DAC client updates the table and index statistics for the DAC repository. The value must be numerical.
Auto Restart ETL: When set to True, an ETL that is running when the DAC server abnormally terminates will continue running when the DAC server is restarted. When set to False, such an ETL will not automatically restart when the DAC server restarts; the ETL status is updated to Failed, and an administrator must manually restart the ETL.
DAC Server Host: The host name of the machine where the DAC server resides. You cannot use an IP address for this property. The DAC server and a given DAC repository have a one-to-one mapping; that is, you can run only one DAC server against any given DAC repository. Thus, in the repository you must specify the network host name of the machine where the DAC server is to be run. This property also accepts the value localhost. However, this value is provided for development and testing purposes and should not be used in a production environment.
InformaticaFileParameterLocation: The directory where the Informatica parameter file is stored
Repository Name: A unique name for the DAC repository

c. Click the Informatica Servers tab. This tab enables you to register one or more Informatica servers and one Informatica Repository server, and to specify how many workflows can be executed in parallel on each server. The DAC server load-balances across the servers.
d. Click the Physical Data Sources tab. This tab provides access to the connection properties for the physical data sources. In this tab, you can view and edit existing physical data source connections and create new ones.
e. Click the Email Recipients tab. This tab enables you to set up a list of email addresses that will be notified about the status of the ETL process.
12. Explore the Execute view.
a. Click Execute on the toolbar to navigate to the Execute view. The Execute view provides access to functionality that allows you to run, schedule, and monitor execution plans.
b. If necessary, click the Execution Plans tab in the top pane to view the existing execution plans.
c. Select the Financials_Oracle 11.5.10 execution plan.
d. Does the Financials_Oracle 11.5.10 execution plan always execute a full load?

e. Is the Financials_Oracle 11.5.10 execution plan active?

f. What is the number of prune days assigned to this execution plan?

Recall that the DAC subtracts the number of prune days from the LAST_REFRESH_DATE of a given source and supplies this as the value for the $$LAST_EXTRACT_DATE parameter.
g. Will the tables associated with this execution plan be analyzed?

h. Is the Financials - Revenue subject area associated with this execution plan?

i. In which Informatica folder are the SDE mappings located? Hint: Click the Parameters tab.

j. Which tasks must be completed before this execution plan is run?

k. Which tasks must be completed after this execution plan is run?

l. Click the Ordered Tasks subtab.
m. Does the SIL_EmployeeDimension task execute before or after the SIL_GLRevenueFact task?

n. How do you determine this?

o. What is the primary source for the SIL_GLRevenueFact task?

p. What is the primary target for the SIL_GLRevenueFact task?

q. In which task phase of the ETL process is SIL_GLRevenueFact executed?

r. Which ETL mappings (commands) are used to run the SIL_GLRevenueFact task?

s. If necessary, select the SIL_GLRevenueFact task in the Ordered Tasks subtab.
t. Click Details.
u. How many tasks are immediate successors to the SIL_GLRevenueFact task?

v. What is the target table for the SIL_GLRevenueFact task?

w. Is W_GL_REVN_FS a source table for the SIL_GLRevenueFact task?

x. Close Details.
y. Leave the DAC open. You will learn about the remaining Execute tabs (Current Run, Run History, Scheduler) in the practices for Lesson 11: Customizing DAC Metadata and Running an Execution Plan.
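The prune-days arithmetic recalled above (step 12.f) can be sketched in a couple of lines of Python. The parameter names mirror the DAC's, but the function itself is an illustrative sketch, not DAC code.

```python
from datetime import datetime, timedelta

# Illustrative sketch: the DAC subtracts the prune days from the source's
# LAST_REFRESH_DATE to produce $$LAST_EXTRACT_DATE, so incremental extracts
# reach back far enough to pick up late-arriving source records.

def last_extract_date(last_refresh_date, prune_days):
    return last_refresh_date - timedelta(days=prune_days)

refresh = datetime(2008, 10, 31)
print(last_extract_date(refresh, 30))  # 2008-10-01 00:00:00
```

With the execution plan's 30 prune days, an incremental run therefore re-extracts the last 30 days of source changes rather than only those since the previous refresh.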

Solutions 9-1: Exploring the DAC


Answers
3.b. Notice there are three logical folders, Extract, Load, and Post Load, listed for the Oracle 11.5.10 container and that each logical folder points to a physical folder. Where are the physical folders located? In the Oracle Informatica repository (Oracle_BI_DW_Base) How many of the tasks associated with the Financials - Revenue subject area are a task group? Hint: Use the query feature. 16 In which phase of the ETL process does this task occur? Extract Is this task executed via an external program, SQL file, Informatica, or a stored procedure? Informatica Are there any pre- or post-SQL commands executed with this task? No If this task fails, will dependent tasks be executed? No. The Continue on Error property is not selected. When this check box is selected, if the command fails, the dependent tasks are not stopped. Are the source tables for this task located in the transactional database or the data warehouse? Transactional Which of the following is not a source table for this task? W_GL_REVN_F What is the target table for this task? W_GL_REVN_FS What type of table is the target table? Fact staging table In which phase of the ETL process does this task occur? Load fact Is this task executed via an external program, SQL file, Informatica, or a stored procedure? Informatica

4.h

7.f. 7.g. 7.h. 7.i.

7.j.

7.k. 7.l. 7.m. 7.o. 7.p.

Oracle BI Applications 7.9: Implementation for Oracle EBS

73

Lesson 9: Working with the Data Warehouse Administration Console

7.q. Are the source tables for this task located in the transactional database or the data warehouse? Data warehouse
7.r. Which source table for this task is the target table for the SDE_ORA_GLRevenueFact task? W_GL_REVN_FS
7.s. What is the primary source table for this task and what kind of table is it? W_GL_REVN_FS, a fact staging table
7.t. What is the target table for this task? W_GL_REVN_F
7.u. What type of table is the target table? Fact table
7.v. True or false? The target table is truncated regardless of whether a full or incremental load is occurring. False. Only Truncate for Full Load is selected, which indicates that the target table is truncated only when a full load is occurring.
7.w. What happens to indices when the target table is truncated? Any indices registered for this table are dropped before the command is executed and then re-created after the command completes successfully. When indices are dropped and created, the table is analyzed so that the index statistics are up-to-date.
12.d. Does the Financials_Oracle 11.5.10 execution plan always execute a full load? No. The Full Load Always property is not selected.
12.e. Is the Financials_Oracle 11.5.10 execution plan active? Yes. The Inactive property is not selected.
12.f. What is the number of prune days assigned to this execution plan? 30
12.g. Will the tables associated with this execution plan be analyzed? Yes. The Analyze property is selected.
12.h. Is the Financials - Revenue subject area associated with this execution plan? Yes. Click the Subject Areas subtab to see the subject areas associated with this execution plan.
12.i. In which Informatica folder are the SDE mappings located? SDE_ORA11510_Adaptor. Click the Parameters subtab to see the selected execution plan's parameters for database connections and Informatica folders.
12.j. Which tasks must be completed before this execution plan is run? None. There are no tasks listed in the Preceding Tasks subtab.
12.k. Which tasks must be completed after this execution plan is run? None. There are no tasks listed in the Following Tasks subtab.
12.m. Does the SIL_EmployeeDimension task execute before or after the SIL_GLRevenueFact task? Before
12.n. How do you determine this? One way is to examine the Depth property in the Ordered Tasks tab. The Depth property determines the level of a task's dependency. Tasks that have no dependency have a depth of 0, tasks that depend on tasks of depth 0 have a depth of 1, and so on. SIL_EmployeeDimension has a depth of 26, so it executes before the SIL_GLRevenueFact task, which has a depth of 29.
12.o. What is the primary source for the SIL_GLRevenueFact task? Data Warehouse
12.p. What is the primary target for the SIL_GLRevenueFact task? Data Warehouse
12.q. In which task phase of the ETL process is SIL_GLRevenueFact executed? Load Fact
12.r. Which ETL mappings (commands) are used to run the SIL_GLRevenueFact task? SIL_GLRevenueFact_Full for full loads; SIL_GLRevenueFact for incremental loads
12.u. How many tasks are immediate successors to the SIL_GLRevenueFact task? Two: SDE_ORA_Stage_GLRevenueFact_AGGRDerive and SDE_ORA_Stage_GLRevenueFact_GRFDerive
12.v. What is the target table for the SIL_GLRevenueFact task? W_GL_REVN_F
12.w. Is W_GL_REVN_FS a source table for the SIL_GLRevenueFact task? Yes, it is the primary source.
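The depth-based ordering described for the Ordered Tasks tab can be sketched in Python. This is an illustrative model only (not DAC code), and the dependency graph below uses hypothetical task relationships chosen for the example.

```python
# Sketch of depth-based task ordering: tasks with no predecessors get
# depth 0; every other task's depth is 1 + the maximum depth of the
# tasks it depends on, so lower-depth tasks always execute first.

def task_depths(dependencies):
    """dependencies maps task name -> list of predecessor task names."""
    depths = {}

    def depth(task):
        if task not in depths:
            preds = dependencies.get(task, [])
            depths[task] = 0 if not preds else 1 + max(depth(p) for p in preds)
        return depths[task]

    for task in dependencies:
        depth(task)
    return depths

# Hypothetical dependency graph for illustration only.
deps = {
    "SDE_ORA_EmployeeDimension": [],
    "SIL_EmployeeDimension": ["SDE_ORA_EmployeeDimension"],
    "SDE_ORA_GLRevenueFact": [],
    "SIL_GLRevenueFact": ["SIL_EmployeeDimension", "SDE_ORA_GLRevenueFact"],
}
depths = task_depths(deps)
order = sorted(deps, key=lambda t: depths[t])
print(order)  # tasks sorted so that dependencies run first
```

Because SIL_EmployeeDimension has a smaller depth than SIL_GLRevenueFact, it is scheduled earlier, which is exactly the behavior observed in the answer to 12.n.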


Lesson 10: Configuring Analytical Applications

Practice 10-1: Configuring Common Areas and Dimensions Before Running a Full Load
Goal
To explore pre-load configuration steps that apply to Oracle Business Intelligence Applications deployed with any source system

Scenario
This practice explores configuration settings for Oracle Business Intelligence that you need to apply for any applications you deploy (for example, Oracle Financial Analytics, Oracle Human Resources). This includes steps required before a full data load and steps for controlling your data set. You explore the configuration settings in the prebuilt Oracle 11.5.10 source container.

Time
10-15 minutes

Instructions:
1. Configure the initial extract date. The initial extract date is required when you extract data for a full load. It reduces the volume of data in the initial load. The specified initial extract date is used as a filter on the creation date of OLTP data in the selected full extract mapping. When you set the Initial Extract Date parameter, make sure that you set it to the beginning of an accounting period, not a date in the middle of an accounting period. In this example, you want to extract data from the first fiscal month of 2004, which begins December 1, 2003.
a. If necessary, select the Design view in the DAC client.
b. If necessary, select the Oracle 11.5.10 container from the list to the right of the Execute button.
c. Click the Source System Parameters tab.
d. Scroll or query to locate the $$INITIAL_EXTRACT_DATE parameter.
e. Display the Edit subtab.
f. If you were working in an editable custom container, you would enter an initial extract date in the Value field. You learn how to do this in the next set of practices.
2. Configure global currencies. Currency conversions are required because your business might have transactions involving multiple currencies. To create a meaningful report, you have to use a common currency. Out of the box, Oracle Business Intelligence Applications provides three global currencies, which are the common currencies used by the data warehouse. For example, if your organization is a multinational enterprise that has its headquarters in the United States, you probably want to choose US dollars (USD) as one of the three global currencies. The global currency is useful when creating enterprise-wide reports.
a. In the Source System Parameters tab, scroll or query to locate the parameters used to set the global currencies and verify that the values are set according to the following table, where USD = US Dollars:

Parameter              Value
$$GLOBAL1_CURR_CODE    USD
$$GLOBAL2_CURR_CODE    USD
$$GLOBAL3_CURR_CODE    USD

b. Note that you must spell the currencies as they are spelled in your OLTP source system.
3. Configure exchange rate types. When Oracle Business Intelligence Applications converts your transaction amounts from document currency to the global currencies, it needs to know which exchange rate type to use for each conversion. For each of the three global currencies, Oracle Business Intelligence Applications provides a global exchange rate type parameter for you to configure.
a. In the Source System Parameters tab, scroll or query to locate the three parameters used to configure the global currency exchange rate types and verify that the values are set according to the following table:
Parameter              Value
$$GLOBAL1_RATE_TYPE    Corporate
$$GLOBAL2_RATE_TYPE    Corporate
$$GLOBAL3_RATE_TYPE    Corporate

b. Scroll or query to locate the $$DEFAULT_LOC_RATE_TYPE parameter, which is used to configure the exchange rate type for document currency to local currency conversion, and verify that it is set to Corporate.
c. Note that you must spell the exchange rate type values as they are spelled in your OLTP source system.
4. Configure fiscal calendars. The default installation of Oracle Business Intelligence Applications supports one fiscal calendar. Fiscal data is first loaded into the W_DAY_D table; the SIL mappings then read data from W_DAY_D and load the aggregate fiscal time dimension tables such as Fiscal Week, Fiscal Month, Fiscal Quarter, and Fiscal Year. You may choose to provide fiscal calendar information in terms of the fiscal weeks of your organization or in terms of its fiscal months. In either case, the SIL mappings are designed to derive the fiscal week from the start date and end date of a fiscal month by grouping days into periods of seven days. In this example, you explore setting up fiscal calendars by fiscal month.
a. Navigate to C:\OracleBI\dwrep\Informatica\SrcFiles and open fiscal_month.csv. Notice that the file contains the fiscal year, fiscal month, and the start date of the fiscal month in YYYYMMDD format. In your production environment, you would need to enter accurate fiscal data, because no validation is done within the Informatica mappings.
Important: Note that you are exploring this file in the C:\OracleBI\dwrep\Informatica\SrcFiles directory. Recall that in Practice 4-1 you copied this file and other .csv files from this directory to the \Informatica\PowerCenter8.1.1\server\infa_shared\SrcFiles directory. In your development and production environments, you would modify the .csv files in the \Informatica\PowerCenter8.1.1\server\infa_shared\SrcFiles directory so that the files can be read by the Informatica mappings. This is true for this file and all other .csv files discussed in this set of practices.

b. Close fiscal_month.csv without making any changes.
c. Return to the DAC and click the Tasks tab.
d. Query for the SIL_DayDimension_FiscalMonth_Extract task and verify that it is active (that the Inactive box is not checked). This task is active because in this example you have chosen to provide fiscal information in terms of fiscal months.
e. Query for the SIL_DayDimension_FiscalWeek_Extract task and verify that it is inactive (that the Inactive box is checked). This task is not active because in this example you have chosen to provide fiscal information in terms of fiscal months, not fiscal weeks.
5. Configure DATASOURCE_NUM_ID, which is a system column in the data warehouse that uniquely identifies a data source category and indicates which source systems the data comes from. Data sources that are supported by Oracle BI have predefined DATASOURCE_NUM_ID values. For example, the value 4 indicates an Oracle 11.5.10 data source and the value 1 indicates a Siebel source.
a. In the DAC client, click the Setup view.
b. Click the Physical Data Sources tab.
c. Select the ORA_11_5_10 data source.
d. In the Edit subtab, verify that the Data Source Number value is set to 4 for the Oracle 11.5.10 data source.
6. Leave the DAC open.
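The fiscal week derivation described in step 4 (grouping the days of a fiscal month into seven-day periods) can be sketched in Python. This is an illustrative model under the stated assumption that any leftover days form a final, shorter week; it is not the actual SIL mapping logic.

```python
from datetime import date, timedelta

# Sketch: derive fiscal weeks from a fiscal month's start and end dates
# by grouping days into periods of seven days.
def fiscal_weeks(month_start, month_end):
    weeks, start = [], month_start
    while start <= month_end:
        end = min(start + timedelta(days=6), month_end)  # 7-day period
        weeks.append((start, end))
        start = end + timedelta(days=1)
    return weeks

# A fiscal month beginning December 1, 2003 (the initial extract date
# used in this practice) and ending December 28, 2003 (assumed 4-week
# fiscal month for illustration).
weeks = fiscal_weeks(date(2003, 12, 1), date(2003, 12, 28))
print(len(weeks))  # 4
```

Running this yields four full weeks: December 1-7, 8-14, 15-21, and 22-28.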


Practice 10-2: Configuring General Ledger Account Hierarchies


Goal
To configure general ledger account hierarchies by using general ledger accounting flexfield value set definitions

Scenario
There are two ways to set up hierarchies in Oracle Financial Analytics: by using flexfield value set definitions for general ledger accounting or by using the Financial Statement Generator (FSG) report definition. Whichever method you choose to set up general ledger account hierarchies, you store the hierarchy information in the W_HIERARCHY_D table. In this example, you explore the steps for setting up general ledger account hierarchies by using general ledger accounting flexfield value set definitions. For more information about using the FSG report definition, refer to the Oracle Business Intelligence Applications Fusion Edition Installation and Configuration Guide. Setting up hierarchies by using flexfield value set definitions requires establishing relationships between three data warehouse tables: W_HIERARCHY_D, which stores the hierarchies; W_GL_ACCOUNT_D, which stores general ledger accounts and general ledger code combinations; and W_GL_BALANCE_A, which stores general ledger account balances aggregated by general ledger account segment codes and segment attributes.

Time
40-50 minutes

Instructions:
1. Explore general ledger account hierarchies in W_HIERARCHY_D, which is a generic hierarchy table that stores hierarchies of up to 20 levels in a flattened structure. Columns HIER_CODE to HIER20_CODE and HIER_NAME to HIER20_NAME store the code/name values for each level within a hierarchy, including the leaf nodes. Each record stores the top node to leaf node path in the HIER_CODE to HIER20_CODE columns. The following screenshot shows a partial view of W_HIERARCHY_D:
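The flattened storage scheme just described can be sketched in Python. This is an illustrative model only: the column naming (HIER1_CODE through HIER20_CODE) and the padding rule (repeating the leaf code down to the last column) are simplifying assumptions for the example, not the exact W_HIERARCHY_D population logic.

```python
# Sketch: each flattened hierarchy record stores one full
# top-node-to-leaf path across a fixed number of level columns.
MAX_LEVELS = 20

def flatten_path(path_codes):
    """path_codes: codes from the top node down to the leaf node."""
    # Pad by repeating the leaf code so the last column always holds
    # the leaf (assumption for illustration).
    padded = list(path_codes) + [path_codes[-1]] * (MAX_LEVELS - len(path_codes))
    return {f"HIER{i + 1}_CODE": code for i, code in enumerate(padded)}

# Hypothetical three-level account hierarchy.
row = flatten_path(["Total Assets", "Current Assets", "Cash"])
print(row["HIER1_CODE"], row["HIER20_CODE"])  # Total Assets Cash
```

The key property to notice is the one the practice relies on later: the top node sits in the first level column and the leaf node is reachable from the last level column, which is what makes the leaf-level join to the account dimension possible.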


2. Explore the file_glacct_segment_config_ora.csv file.
a. Navigate to C:\OracleBI\dwrep\Informatica\SrcFiles.
b. Open the file file_glacct_segment_config_ora.csv. Before you run the ETL process for general ledger accounts, you need to specify the segments that you want to analyze by using this ETL configuration file. The objective of this configuration file is to make sure that when segment information is extracted into the warehouse table W_GL_ACCOUNT_D, segments of the same nature from different charts of accounts are stored in the same column in W_GL_ACCOUNT_D. Therefore, in this file, you need to specify segments of the same nature in the same column. In this example, SEG1 stores Department information, SEG2 stores Account information, and SEG3 stores Company information. The screenshot shows a partial view of the file:

This means that in E-Business Suite, chart of accounts 101 has department information stored in SEGMENT2 with value set ID = 1002471, and chart of accounts 50214 has department information stored in SEGMENT4 with value set ID = 1002725. In this file, segments from both charts of accounts (101 and 50214) are stored in the same SEG1/SEG1_VALUESETID paired values. Thus, using this file allows you to uniformly store segments of the same nature in the same column in the data warehouse, regardless of how they are stored in EBS. Because there are 30 paired segment columns in W_GL_ACCOUNT_D, you could add up to 30 paired values. However, you should add only as many as you need to analyze your facts by value set hierarchies.
c. Explore W_GL_ACCOUNT_D. When ETL is run, all SEG1 segments (Department) from all charts of accounts are stored in the ACCOUNT_SEG1* columns; all SEG2 segments (Account) are stored in the ACCOUNT_SEG2* columns; all SEG3 segments (Company) are stored in the ACCOUNT_SEG3* columns; and so forth if you add more paired values. The screenshot shows a populated W_GL_ACCOUNT_D based on the information provided in the file_glacct_segment_config_ora.csv file. As expected, segments with value set ID 1002471 are stored in ACCOUNT_SEG1* columns, segments with value set ID 1002472 are stored in ACCOUNT_SEG2* columns, and so forth. The screenshot shows only a partial view of W_GL_ACCOUNT_D:

d. Scroll to the bottom of the file_glacct_segment_config_ora.csv file.
e. Notice that aggregation is set to Y for all segment columns. This means that in W_GL_BALANCE_A (where GL account balances are stored at an aggregated level), you want to store GL account balances at the company, account, and department level instead of at the GL code combination level.
f. Close file_glacct_segment_config_ora.csv without making any changes.
g. In the DAC client, click the Subject Areas tab.
h. Scroll or query to locate the Financials - General Ledger subject area.
i. With the Financials - General Ledger subject area selected, click the Configuration Tags subtab.
j. Verify that the tag Oracle - Extract Value Set Hierarchies is active (that the Inactive box is not checked). This tag is active because in this example you have chosen to set up hierarchies in Oracle Financial Analytics by using the flexfield value set definitions for general ledger accounting.
k. Verify that the tag Oracle - Extract FSG Hierarchies is inactive (that the Inactive box is checked). This tag is inactive because in this example you have chosen to set up hierarchies by using the flexfield value set definitions, not the Financial Statement Generator (FSG) report definition. Refer to the Oracle Business Intelligence Applications Fusion Edition Installation and Configuration Guide for more information about using the FSG report definition to set up hierarchies.
3. Explore the hierarchy information in the Physical layer of the preconfigured repository.
a. Navigate to C:\OracleBI\server\Repository.
b. Double-click OracleBIAnalyticsApps.rpd to open it in the Administration Tool.
c. Log in as Administrator with password SADMIN.
d. In the Physical layer, expand Oracle Data Warehouse > Catalog > dbo.
e. Scroll to locate Dim_W_HIERARCHY_D_ValueSet1. This table is used to establish the repository hierarchies based on the flexfield value set definitions you explored earlier.
f. Double-click Dim_W_HIERARCHY_D_ValueSet1 to open the Alias Physical Table properties dialog box. Note that Dim_W_HIERARCHY_D_ValueSet1 is an alias table that points to the W_HIERARCHY_D table. (Recall that W_HIERARCHY_D is a generic hierarchy table that stores hierarchies in a flattened structure. Columns HIER_CODE to HIER20_CODE and HIER_NAME to HIER20_NAME store the code/name values for each level within a hierarchy, including the leaf nodes. Each record stores the top node to leaf node path in the HIER_CODE to HIER20_CODE columns.)
g. Click Cancel to close the properties dialog box.
h. Right-click Dim_W_HIERARCHY_D_ValueSet1 and select Physical Diagram > Object(s) and Direct Joins.
i. Notice that Dim_W_HIERARCHY_D_ValueSet1 joins to Fact_Agg_W_GL_BALANCE_A, which is an alias that points to the W_GL_BALANCE_A table, and to Dim_W_GL_ACCOUNT_D, which is an alias that points to the W_GL_ACCOUNT_D table. Recall that W_GL_BALANCE_A stores general ledger account balances aggregated by general ledger account segment codes and segment attributes, and W_GL_ACCOUNT_D stores general ledger accounts and general ledger code combinations.
j. Double-click the connector between Dim_W_HIERARCHY_D_ValueSet1 and Dim_W_GL_ACCOUNT_D to open the Physical Foreign Key dialog box.
k. Notice the join in the Expression field:

Dim_W_HIERARCHY_D_ValueSet1.HIER20_CODE = Dim_W_GL_ACCOUNT_D.ACCOUNT_SEG1_CODE AND Dim_W_HIERARCHY_D_ValueSet1.HIER_CODE = Dim_W_GL_ACCOUNT_D.ACCOUNT_SEG1_ATTRIB

The code for the last level (the detail leaf level) in the hierarchy (HIER20_CODE) joins to the column in W_GL_ACCOUNT_D that stores the segment one code. The code for the highest level in the hierarchy (HIER_CODE) joins to the column in W_GL_ACCOUNT_D that stores the segment one attribute.
l. Click Cancel to close the Physical Foreign Key dialog box.
m. Double-click the connector between Dim_W_HIERARCHY_D_ValueSet1 and Fact_Agg_W_GL_BALANCE_A to open the Physical Foreign Key dialog box.
n. Notice the join in the Expression field:

Dim_W_HIERARCHY_D_ValueSet1.HIER20_CODE = Fact_Agg_W_GL_BALANCE_A.ACCOUNT_SEG1_CODE AND Dim_W_HIERARCHY_D_ValueSet1.HIER_CODE = Fact_Agg_W_GL_BALANCE_A.ACCOUNT_SEG1_ATTRIB

The code for the last level in the hierarchy (HIER20_CODE) joins to the column in W_GL_BALANCE_A that stores the segment one code. The code for the highest level in the hierarchy (HIER_CODE) joins to the column in W_GL_BALANCE_A that stores the segment one attribute. W_GL_BALANCE_A has only six segment columns, so if you have more than six hierarchies, join only the first six to W_GL_BALANCE_A but join all hierarchies to W_GL_ACCOUNT_D as in the previous step.
o. Click Cancel to close the Physical Foreign Key dialog box.
p. Close the Physical Diagram.
4. Explore the hierarchy information in the Business Model and Mapping layer of the preconfigured repository.

a. In the Business Model and Mapping layer, expand Core and scroll to locate Dim - GL ValueSetHierarchy1.
b. Expand Dim - GL ValueSetHierarchy1 > Sources and double-click Dim_W_HIERARCHY_D_ValueSet1 to open the Logical Table Source dialog box.
c. Click the Column Mapping tab and notice that the logical code/name columns map to the physical code/name columns in Dim_W_HIERARCHY_D_ValueSet1.
d. Click the Content tab.
e. In the WHERE clause filter, notice that a HIER_CODE filter is specified to restrict the output of the logical table to one hierarchy only:
"Oracle Data Warehouse".Catalog.dbo.Dim_W_HIERARCHY_D_ValueSet1.HIER_CODE = 1002470

Here, 1002470 is the value set hierarchy ID of the segment for which you are creating the hierarchy. In this example 1002470 corresponds to the highest level hierarchy in W_HIERARCHY_D.

f. Click Cancel to close the Logical Table Source dialog box.
g. Right-click the Dim - GL ValueSetHierarchy1 logical table (not the logical table source) and select Business Diagram > Selected Tables and Direct Joins to open the Logical Table Diagram, which displays all the logical fact tables that have a logical join to the logical hierarchy table Dim - GL ValueSetHierarchy1.
h. Close the Logical Table Diagram.
i. Close OracleBIAnalyticsApps.rpd without saving.
j. Close the Oracle BI Administration Tool.
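The physical join semantics explored in this practice can be sketched in Python. This is an illustrative model only: the sample hierarchy and account rows are assumptions, and the real join is evaluated by the Oracle BI Server against the warehouse, not in application code.

```python
# Sketch: a hierarchy row matches a GL account row when the hierarchy
# leaf code (HIER20_CODE) equals the account's segment code and the
# hierarchy's top-level code (HIER_CODE) equals the account's segment
# attribute (the value set).
def join_hierarchy_to_accounts(hier_rows, account_rows):
    return [
        (h, a)
        for h in hier_rows
        for a in account_rows
        if h["HIER20_CODE"] == a["ACCOUNT_SEG1_CODE"]
        and h["HIER_CODE"] == a["ACCOUNT_SEG1_ATTRIB"]
    ]

# Hypothetical rows for illustration only.
hier = [{"HIER_CODE": 1002470, "HIER20_CODE": "4110"}]
accounts = [
    {"ACCOUNT_SEG1_CODE": "4110", "ACCOUNT_SEG1_ATTRIB": 1002470},
    {"ACCOUNT_SEG1_CODE": "5110", "ACCOUNT_SEG1_ATTRIB": 1002470},
]
matches = join_hierarchy_to_accounts(hier, accounts)
print(len(matches))  # 1
```

Only the account whose segment code equals the hierarchy's leaf code survives the join, which is how balances roll up the value set hierarchy.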


Practice 10-3: Mapping Oracle GL Natural Accounts to Group Account Numbers


Goal
To map Oracle general ledger natural accounts to group account numbers

Scenario
Oracle EBS General Ledger (GL) does not contain business attributes that represent real-world entities such as supplier, customer, and employee. This information resides in the subledgers: for example, the supplier dimension in accounts payable (AP) and the customer dimension in accounts receivable (AR). In Oracle GL, transactions are tracked at an account level and are used more for bookkeeping purposes. Therefore, to facilitate reporting on GL transactions in a data warehouse environment, OBIEE Financial Applications uses group account numbers to categorize the accounting entries. It is critical that general ledger account numbers are mapped to group account numbers (or domain values) because the group account number is used during data extraction as well as in front-end reporting.

Time
10-15 minutes

Instructions:
1. Explore W_GL_ACCOUNT_D, which is the data warehouse dimension table that stores all the general ledger accounts associated with any organization. The GROUP_ACCOUNT_NUM field in W_GL_ACCOUNT_D denotes the nature of the general ledger accounts (for example, cash account or payroll account). The screenshot provides a partial snapshot of W_GL_ACCOUNT_D:

2. Explore the file_group_acct_names.csv file.
a. Navigate to C:\OracleBI\dwrep\Informatica\SrcFiles and open file_group_acct_names.csv. This file provides a list of group account numbers that you can use. For example, AP equals ACCOUNTS PAYABLES and AR equals ACCOUNTS RECEIVABLE. The screenshot provides only a partial view of file_group_acct_names.csv:

b. Close file_group_acct_names.csv without making any changes. 3. Explore the data model reference. A list of domain values for general ledger account numbers is also provided in the Oracle Business Analytics Warehouse Data Model Reference. It includes the financial statement item code, which determines the nature of the account. The screenshot shows only a partial view of the table:

4. Explore the file_group_acct_codes_ora.csv file.
a. Navigate to C:\OracleBI\dwrep\Informatica\SrcFiles.
b. Open file_group_acct_codes_ora.csv. This file provides the logic for assigning accounts. Each row maps all accounts within the specified account number range and within the given chart of accounts ID to a group account number. For example, in the partial view of the file in the screenshot below, all accounts within the account number range from 4110 to 4110 that have a chart of accounts ID equal to 101 are assigned to the Revenue group account number. All accounts within the account number range from 5110 to 5110 that have a chart of accounts ID equal to 101 are assigned to the Cost of Goods Sold (COGS) group account number, and so forth.
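The range-based assignment just described can be sketched in Python. This is an illustrative model only: the rule tuples below mirror the example rows cited above, and the numeric comparison is an assumption (the actual file stores the values as text).

```python
# Sketch: each rule mirrors a row in file_group_acct_codes_ora.csv:
# (chart_of_accounts_id, from_account, to_account, group_acct_num).
RULES = [
    (101, 4110, 4110, "REVENUE"),
    (101, 5110, 5110, "COGS"),
]

def group_account(chart_id, account):
    """Assign a natural account to a group account number by range."""
    for coa, lo, hi, group in RULES:
        if coa == chart_id and lo <= account <= hi:
            return group
    return None  # unmapped accounts need a rule before ETL runs

print(group_account(101, 4110))  # REVENUE
```

Note the lookup is per chart of accounts: the same account number under a different chart of accounts ID would fall through to None until a rule is added for it.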

If you need to create a new group of account numbers, you can create new rows in the file_group_acct_names.csv file. You can then assign GL accounts to the new group of account numbers in the file_group_acct_codes_ora.csv file.
c. Close file_group_acct_codes_ora.csv without making any changes.
5. Explore the file_grpact_fstmt.csv file.
a. Open the C:\OracleBI\dwrep\Informatica\SrcFiles\file_grpact_fstmt.csv file. If you create a new group account number, add a new row to this file. This file specifies the relationship between a group account number and a financial statement item code. You must map the new group account number to one of the following financial statement item codes: AP, AR, COGS, REVENUE, TAX, OTHERS. You must also provide the general ledger account category code (GL_ACCOUNT_CAT_CODE), which determines whether the account is a balance sheet account (BS) or a profit and loss account (PL). By mapping your GL accounts to group account numbers and then associating each group account number with a financial statement item code, you indirectly associate the GL account numbers with the financial statement item codes as well. The screenshot shows a partial view of the file_grpact_fstmt.csv file:

These financial statement item codes correspond to the following six base fact tables in the Financial Analytics product:
- AP base fact (W_AP_XACT_F)
- AR base fact (W_AR_XACT_F)
- Revenue base fact (W_GL_REVN_F)
- Cost of Goods Sold base fact (W_GL_COGS_F)
- Tax base fact (W_TAX_XACT_F)
- GL Journal base fact (W_GL_OTHER_F)

b. Close file_grpact_fstmt.csv without making any changes.
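The indirect association described in this practice, from group account number to financial statement item code to base fact table, can be sketched in Python. The mapping of financial statement item codes to base fact tables comes from the list above; the sample group-account entries are assumptions for illustration, not the shipped file contents.

```python
# Sketch: two lookup tables mirror the two configuration files.
# file_grpact_fstmt.csv: group account number -> financial statement item code
GROUP_TO_FSTMT = {"REVENUE": "REVENUE", "COGS": "COGS", "AP": "AP"}

# Financial statement item code -> base fact table (from the practice text).
FSTMT_TO_FACT = {
    "AP": "W_AP_XACT_F",
    "AR": "W_AR_XACT_F",
    "REVENUE": "W_GL_REVN_F",
    "COGS": "W_GL_COGS_F",
    "TAX": "W_TAX_XACT_F",
    "OTHERS": "W_GL_OTHER_F",
}

def base_fact_for_group(group_acct_num):
    """Follow the chain: group account -> item code -> base fact table."""
    return FSTMT_TO_FACT[GROUP_TO_FSTMT[group_acct_num]]

print(base_fact_for_group("REVENUE"))  # W_GL_REVN_F
```

This is why the mapping matters during extraction, not just reporting: the financial statement item code decides which base fact table a GL entry ultimately feeds.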


Practice 10-4: Creating a New Metric Based on a New Group Account Number
Goal
To create a new metric in the repository after adding a new group account number

Scenario
When you add a new group account number in file_group_acct_codes_ora.csv, it does not automatically show up in your reports. In the Oracle BI repository, you need to create a new metric that maps to the new group account number.

Time
10-15 minutes

Instructions:
1. Explore how measures are mapped to group account numbers in the Oracle BI repository.
a. Navigate to C:\Informatica\PowerCenter8.1.1\server\infa_shared\SrcFiles.
b. Open file_group_acct_codes_ora.csv.
c. Scroll to line 11 and notice that there is a group account number named PPAID EXP, which stands for prepaid expenses.

d. Leave the file open.
e. Select Start > Programs > Oracle Business Intelligence > Administration to open the Administration Tool.
f. Select File > Open > Offline.
g. Select OracleBIAnalyticsApps.rpd and click Open.
h. Log in as Administrator with password SADMIN. The repository is large, so it may take a moment to open.
i. In the Presentation layer, expand Financials - GL Balance Sheet > Facts - Balance Sheet Statement.
j. Right-click Prepaid Expenses and select Display Related > Logical Column to open the Query Repository dialog box.
k. Select the Prepaid Expenses logical column in the list and click Go To. The Prepaid Expenses logical column is displayed in the Business Model and Mapping layer.
l. Double-click the Prepaid Expenses logical column in the Business Model and Mapping layer to open the Logical Column dialog box.

m. If necessary, click the General tab and notice that the corresponding fact table (GL Balance fact) in the BMM layer joins with the GL Account dimension and filters for those GL accounts that belong to the group account number PPAID EXP. Thus, the Prepaid Expenses metric is the total amount coming from accounts that have a group account number equal to PPAID EXP.
n. Click Cancel to close the Logical Column dialog box.
2. Create a new group account number in file_group_acct_codes_ora.csv.
a. Return to file_group_acct_codes_ora.csv and insert a new row below the first row with the following data:

b. Save and close file_group_acct_codes_ora.csv. Click Yes if prompted to keep the CSV format.
c. In the Business Model and Mapping layer, right-click the Prepaid Expenses measure and select Duplicate.
d. Scroll down, locate the duplicate measure Prepaid Expenses#1, and rename it to TEST.
e. Double-click TEST to open the Logical Column dialog box.


f. In the filter, replace PPAID EXP with TEST. The metric TEST is now the total amount coming from accounts that have a group account number equal to TEST.

g. Click OK to close the Logical Column dialog box.
h. Drag the new TEST measure to Financials - GL Balance Sheet > Facts - Balance Sheet Statement in the Presentation layer. The new measure is now available to be viewed in Oracle BI Presentation Services requests and dashboards.
i. Save the repository. Do not check global consistency.
j. Close the repository and the Administration Tool.
3. Run a full ETL. Typically at this point, you would use the DAC to run the Financials_Oracle 11.5.10 execution plan, which would extract data from the Oracle E-Business Suite source system, transform the data via Informatica mappings, and load the data into the appropriate tables in the Oracle Business Analytics Warehouse. However, running an initial extract for Financials_Oracle 11.5.10 takes 4-5 hours, which is prohibitive given the time constraints of this training. Therefore, you run ETL for a custom execution plan in the next set of practices. That execution plan completes in approximately 90 minutes.
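The group-account-filtered metric pattern used by Prepaid Expenses (and by the new TEST metric) can be sketched in Python. This is an illustrative model only, not repository code; the sample account and balance values are assumptions for the example.

```python
# Sketch: a metric is the total balance over only those GL accounts
# whose group account number matches the metric's filter, mirroring
# the BMM-layer filter on the GL Account dimension.
def metric_total(balances, accounts, group_acct_num):
    """balances: GL key -> amount; accounts: GL key -> group account."""
    return sum(
        amount
        for key, amount in balances.items()
        if accounts.get(key) == group_acct_num
    )

# Hypothetical data for illustration only.
accounts = {"1110": "PPAID EXP", "4110": "REVENUE", "1120": "PPAID EXP"}
balances = {"1110": 500.0, "4110": 900.0, "1120": 250.0}
print(metric_total(balances, accounts, "PPAID EXP"))  # 750.0
```

Duplicating the measure and changing only the filter value, as you did for TEST, amounts to calling the same aggregation with a different group account number.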


Lesson 11: Customizing DAC Metadata and Running an Execution Plan

Practice 11-1: Customizing DAC Metadata


Goal
To customize the DAC by creating a custom container, a new subject area, and an execution plan

Scenario
Oracle Business Intelligence Applications provides preconfigured subject areas and execution plans. You can change these preconfigured objects or create new objects to correspond to your particular business requirements. After examining your business requirements, you determine that the only subject area you want to analyze is Financials - Revenue. Notice that a preconfigured Financials - Revenue subject area already exists, but for training purposes you create and run a custom subject area and execution plan limited to the Financials - Revenue fact table.

Time
70-90 minutes (depending on how much time the ETL takes to run)

Instructions:
1. Create a copy of a preconfigured source system container. The DAC metadata for a source system is held in a container. You cannot change the metadata for preconfigured containers. If you want to customize the metadata in a preconfigured container, you must first make a copy of the container. The DAC keeps track of all customizations in the copied container, so that at any time you can find the newly created objects and modified objects, as well as the original objects.
a. Return to the DAC client, which should still be open. If it is not, select Start > Programs > Oracle Business Intelligence > Oracle DAC > DAC Client and log in to the DAC connection using dac as the table owner name and password.
b. Select File > New Source System Container.
c. In the ID field, enter 01.
d. In the Name field, enter Custom.
e. Select Create as a Copy of Existing Container.
f. Select the Oracle 11.5.10 container from the list and click OK. The copy takes about five minutes to complete. When you see the success message, click OK.
2. Configure the initial extract date.
a. If necessary, select the Design view in the DAC client.
b. If necessary, select the Custom container from the list to the right of the Execute button.
c. Click the Source System Parameters tab.
d. Scroll or query to locate the $$INITIAL_EXTRACT_DATE parameter.
e. Display the Edit subtab.
f. Click the check icon in the Value field to open the Enter Parameter Value dialog box.
g. In the Date field, click the calendar icon to open the Date dialog box.
h. Enter December 1, 2003, Hour: 12 AM, Minute: 0, Second: 0.

i. Click OK to close the Date dialog box.
j. Click OK to close the Enter Parameter Value dialog box.
k. Click Save. The Updating dialog box opens.
l. Click Yes to confirm that you want to proceed.

3. Create a custom subject area. a. Select the Subject Areas tab. b. Click New. c. In the Edit subtab, enter Custom - Revenue as the name of the subject area and click Save. d. Select the Tables subtab. e. Click Add/Remove. f. In the Choose Tables dialog box, query for W_GL_REVN_F. g. Click Add to add W_GL_REVN_F to the right-hand list of tables that belong to the custom subject area. h. Click OK in the Adding window to acknowledge that W_GL_REVN_F is added to the subject area. i. Notice that in the left-hand list of the Choose Tables dialog box, the W_GL_REVN_F table appears in a green italic font to indicate that the DAC object is a referenced object, in this case owned by the Oracle 11.5.10 container. In the right-hand list, the W_GL_REVN_F table appears in standard font. This indicates that the object is not a reference or a cloned object but is unique to the current container. A cloned object is a referenced object that has been modified in the referencing container. j. Click OK to close the Choose Tables dialog box. k. Click Save. 4. Assemble the subject area. The DAC assembles a subject area by determining what dimensions and other related tables are required and what tasks are needed to load these tables. a. Click the Tasks subtab and notice that, because you have never assembled the Custom Revenue subject area, the subject area currently has no tasks associated with it. Also notice that the Last Designed attribute of the Custom Revenue subject area is null. This attribute stores the time stamp for the most recent assembly of a subject area. b. Verify that the Custom Revenue subject area is selected and click Assemble. c. In the Assembling window, select the Selected record only option and click OK. The assembly of subject area takes several minutes. d. Click OK to acknowledge that the subject area has been successfully assembled. e. Click Refresh in the Tasks subtab to refresh the list of tasks associated with the custom subject area. 
      In the upper-right corner of the Tasks subtab, verify that 1 of 214 is displayed, indicating that there are 214 tasks associated with this subject area.
   f. Scroll through the list and notice that the DAC has assembled a list of all the tasks that prepare, extract, and load the W_GL_REVN_F fact table and all of its related tables. Notice that all tasks generated by the assembly of the subject area are marked as Autogenerated, which identifies tasks that are added by the DAC.
   g. Notice also that the Last Designed attribute for the Custom Revenue subject area is now populated.
94 Oracle BI Applications 7.9: Implementation for Oracle EBS

Lesson 11: Customizing DAC Metadata and Running an Execution Plan

5. Create a custom execution plan and add the Custom Revenue subject area to it.
   a. Click the Execute button to open the Execute view.
   b. Click New.
   c. In the Edit subtab, enter Custom Revenue as the name of the execution plan and click Save.
   d. Select the Subject Areas child tab.
   e. Click Add/Remove.
   f. In the table list on the left, select the Custom Revenue subject area and click Add to add it to the list of subject areas on the right belonging to the custom execution plan.
   g. Click OK in the Adding window to acknowledge that the Custom Revenue subject area is added to the execution plan.
   h. Click OK to close the Choose Subject Areas dialog box.
6. Generate execution plan parameters for the Custom Revenue execution plan that you have created. At run time, the DAC Server compiles a file for each Informatica DAC task that contains Informatica Server parameters and writes it to the C:\Informatica\PowerCenter8.1.1\server\infa_shared\SessLogs directory. This file includes generic as well as task-level parameters set at the task object level, and the execution plan-level parameters that specify connection details for the transactional database sourced by the execution plan as well as the warehouse that it targets. Once generated, the task's parameter file is referenced in the PMCMD command issued for the task during the execution of an execution plan.
   a. In the Execution Plan list, select the Custom Revenue execution plan.
   b. Select the Parameters child tab and click Generate.
   c. Click Yes to proceed. A list of seed parameters, which are to be updated, is generated.
   d. Click OK to acknowledge that new values have been set for the execution plan parameters. Execution plan parameters include the Informatica repository folders related to the container that the execution plan belongs to, as well as data source connections. You need only update the pertinent connection parameters.
      The parameter names are generated as they will appear in the parameter file created at run time. Recall that, as previously noted, the connection details specified in the execution plan parameters must match the physical data sources specified in the Setup view, which, in turn, must match the source and target relational connections configured in the Informatica repository.
   e. Enter the following values to update the DATASOURCE parameters. If you need to verify that the values are correct, you can navigate to the Setup view and check the names of the Source and Warehouse type connections.
      Parameter            Value
      DBConnection_OLAP    DataWarehouse
      DBConnection_OLTP    ORA_11_5_10
      FlatFileConnection   ORA_11_5_10_Flatfile

   f. Click Save.
7. Build the execution plan. When you build an execution plan, the DAC compiles and sets precedence for the tasks required to load the subject areas included in the plan.
   a. Click the Ordered Tasks child tab for the Custom Revenue execution plan and notice that, because you have never built the execution plan, it currently has no tasks associated with it. Also, notice that the Last Designed attribute of the execution plan is null. As with subject areas, this attribute stores the time stamp indicating when the execution plan was last built.
   b. Verify that the Custom Revenue execution plan is selected and click Build.
   c. Accept the default and click OK in the Building window.
   d. In the second Building window, again accept the default to perform the operation for the selected record only and click OK. As with assembling a subject area, building the execution plan takes several minutes.
   e. Click OK to acknowledge that the Custom Revenue execution plan has been successfully built.
   f. Click Refresh in the Ordered Tasks subtab to refresh the list of tasks associated with the execution plan. In the upper-right corner of the Ordered Tasks subtab, verify that 1 of 214 is displayed, indicating that there are 214 tasks associated with this execution plan.
   g. Scroll through the list and notice that the DAC has assembled a list of all the tasks that prepare, extract, and load the tables belonging to the subject areas included in the execution plan. Also, notice that each task is assigned a task depth, indicating its execution precedence at plan run time. Tasks that have no dependencies have a depth of 0. Tasks that depend on other tasks having a depth of 0 have a depth of 1, and so on.
   h. Notice that a time stamp now appears in the Last Designed column for the execution plan.
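To make the depth idea concrete, the following SQL*Plus query illustrates the same calculation over a tiny, invented dependency list. The task names and the task_deps inline view are hypothetical (the DAC computes depths internally from its repository metadata, not with a query like this):

```sql
-- Hypothetical illustration only: task_deps is not a real DAC table.
-- A task with no predecessor gets depth 0; each dependent task gets a
-- depth one greater than the task it waits on.
WITH task_deps AS (
  SELECT 'Extract_Customers' AS task_name,
         CAST(NULL AS VARCHAR2(30)) AS depends_on FROM dual
  UNION ALL
  SELECT 'Load_W_ORG_DS', 'Extract_Customers' FROM dual
  UNION ALL
  SELECT 'Load_W_ORG_D', 'Load_W_ORG_DS' FROM dual
)
SELECT task_name, LEVEL - 1 AS task_depth
FROM   task_deps
START WITH depends_on IS NULL
CONNECT BY PRIOR task_name = depends_on;
```

Run against any Oracle database, this returns depths 0, 1, and 2 for the three sample tasks, mirroring the precedence you see in the Ordered Tasks subtab.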

8. Verify that the DAC Server is started.
   a. Verify that the DAC Server Monitor icon in the upper-right corner of the DAC client resembles an orange electrical plug in a socket, which means that the client is connected to the server and that the server is idle. When you mouse over the orange icon, it should say DAC Server is idle. When the DAC client cannot establish a connection to the DAC Server, the Server Monitor icon resembles a red electrical plug. If the client is connected to a server that is running an ETL process, the icon resembles a green electrical plug with a lightning sign superimposed on it.
   b. If the DAC Server is not started, select Start > Programs > Oracle Business Intelligence > Oracle DAC > Start DAC Server and verify that the DAC Server Monitor icon resembles an orange electrical plug in a socket before continuing.
9. Run the custom execution plan.
   a. In the Execution Plans tab, select the Custom Revenue execution plan.
   b. Click the Run Now button in the Top Pane toolbar.
   c. In the Starting ETL dialog box, click Yes to confirm that you want to start the execution plan.
   d. Click OK to acknowledge that the request has been successfully submitted to the Informatica Server.
10. Monitor the ETL plan execution.
   a. Select the Current Run tab.
   b. Select the Custom Revenue run and confirm that it has a run status of Running. Notice that the DAC Server Monitor icon has changed from orange to green, indicating that a plan is being executed.
   c. Click Auto Refresh and set the automatic refresh frequency to 30 seconds.
96 Oracle BI Applications 7.9: Implementation for Oracle EBS

Lesson 11: Customizing DAC Metadata and Running an Execution Plan

   d. Select the Tasks subtab to view task statuses within the execution plan.
   e. Use the list to view different task statuses: All, Running, Queued, and so on.
11. View the run history for the execution plan.
   a. When all tasks have completed (after approximately 60 minutes), select the Run History tab.
   b. Select the Custom Revenue execution plan that you just ran.
   c. Verify that Run Status = Completed, Status Description = Finished, Number of Failed Tasks = 0, and Number of Successful Tasks = 214.
      If the execution plan fails: When an execution plan is executed and a task fails, the status of the tasks that are dependent on the failed task is changed to Stopped. While tasks are still running, the execution plan's status is Running. When all the tasks have been run, and if one or more tasks have failed, the execution plan's status is changed to Failed. You can check the tasks that have failed in the Current Run tab of the Execute view, fix the problems, and then requeue the failed tasks by changing their status to Queued. You can then restart the ETL; all the tasks will then be rerun. You can also manually run a task, change its status to Completed, and then restart the ETL; tasks with a Completed status are skipped. If you need assistance, ask your instructor.
   d. Right-click the Custom Revenue run, and select Get Run Information > Get log file.
   e. In the Input dialog box, either accept the file name or modify it and click OK.
   f. In the Fetching log file dialog box, notice the path that the log file has been saved to and click OK.
   g. In Windows Explorer, navigate to C:\OracleBI\DAC\ServerLog and open the Custom Revenue.x.log file, where x is the process ID number.
   h. Scroll through the file and confirm that there are no steps listed under Failed Sessions or Queued Sessions.
   i. Close the log file.
12. Query the data warehouse to verify that tables contain data.
   a. Select Start > Programs > Oracle OraDb11g_home1 > Application Development > SQL Plus.
   b. Enter obaw as the user name.
   c. Enter obaw as the password.
   d. Enter the following SQL query to verify that the W_GL_REVN_F table contains data:
SELECT COUNT (*) from W_GL_REVN_F;

   e. Verify that the query returns a count of 105882.
   f. Close SQL*Plus.


Lesson 13: Adding Columns to an Existing Dimension Table

Practice 13-1: Creating a Custom SDE Mapping


Goals: To create a source dependent extract (SDE) mapping that can be used to move data from a column in a source system into the W_ORG_DS staging extension table, in preparation for loading it into the W_ORG_D table in the data warehouse

Scenario: In this type 1 customization, you extract data from a column in a source system and load the data into an existing data warehouse table. In this example, your company has used the ATTRIBUTE5 column in the HZ_CUST_ACCOUNTS table in the EBS source system to capture data related to accounts. You want to extract the data from this column and ultimately load it into a custom column in the organization dimension table, W_ORG_D, in the data warehouse. The first step is to build a custom SDE mapping. To build the custom SDE mapping, you copy an existing mapping and workflow into a custom folder in Informatica and then modify the mapping and workflow.

Time: 15-20 minutes

Instructions:
1. Use SQL*Plus to examine data in the source table.
   a. Select Start > Programs > Oracle OraDb11g_home > Application Development > SQL Plus to start SQL*Plus.
   b. Log in as biapps with password biapps.
   c. Run the following select statement:
select count(*) from HZ_CUST_ACCOUNTS where ATTRIBUTE5 is NOT NULL;

   d. Verify that the query returns a count of 27. After you run ETL, you verify that the data warehouse target table, W_ORG_D, is populated with these rows.
   e. Close SQL*Plus.
2. Copy an existing mapping and workflow to a custom folder.
   a. If necessary, open Informatica Repository Manager.
   b. Connect to the Oracle_BI_DW_Base repository.
   c. Navigate to SDE_ORA11510_Adaptor > Mappings.
   d. Select the SDE_ORA_OrganizationDimension_Customer mapping.
   e. Copy the SDE_ORA_OrganizationDimension_Customer mapping and paste it into the CUSTOM_SDE folder.
   f. Click Yes when asked if you want to copy SDE_ORA_OrganizationDimension_Customer.
   g. Verify that the SDE_ORA_OrganizationDimension_Customer mapping is visible in CUSTOM_SDE > Mappings.
   h. Navigate to SDE_ORA11510_Adaptor > Workflows.
   i. Copy the SDE_ORA_OrganizationDimension_Customer workflow and paste it into the CUSTOM_SDE folder. Typically, you would create two workflows: one to be used for a full load and the other to be used for an incremental load. Both workflows are based on the same mapping, which is executed during both full and incremental loads. This provides an opportunity to tune each of these load scenarios. For this training, you copy and create only one workflow for all the mappings in this set of practices.
   j. Click Yes to confirm the copy.
   k. In the Copy Wizard, select Reuse and Apply this resolution to > All conflicts.
   l. Click Next.
   m. In the Copy Summary window, click Finish.
   n. Verify that the SDE_ORA_OrganizationDimension_Customer workflow is visible in CUSTOM_SDE > Workflows.

3. Edit the target definition to include the required column.
   a. In Repository Manager, select Tools > Designer to open Informatica Designer.
   b. Verify that the CUSTOM_SDE folder is open in the repository navigator in Informatica Designer.
   c. Select Tools > Target Designer.
   d. Navigate to CUSTOM_SDE > Targets > W_ORG_DS.
   e. Drag W_ORG_DS into the Target Designer window.
   f. Double-click W_ORG_DS in the Target Designer window to open the Edit Tables dialog box.
   g. Click the Columns tab.
   h. Scroll to the bottom and select the last column in the list, X_CUSTOM.
   i. Click the Add a new column to this table button, which creates a NEWFIELD column.
   j. Change the name of the NEWFIELD column to X_ACCOUNT_LOG and verify that the data type is set to varchar2(10).
   k. Click Apply.
   l. Click OK to close the Edit Tables dialog box.
4. Map the column in the mapplet.
   a. Select Tools > Mapping Designer.
   b. Navigate to CUSTOM_SDE > Mappings.
   c. Drag the SDE_ORA_OrganizationDimension_Customer mapping into the Mapping Designer.
   d. Right-click the mplt_BC_ORA_OrganizationDimension_Customer mapplet and select Open Mapplet.
   e. In the Mapplet Designer, locate the HZ_CUST_ACCOUNTS source definition.
   f. Drag the ATTRIBUTE5 column from the HZ_CUST_ACCOUNTS source definition to a blank port in the SQ_BCI_CUSTOMERS source qualifier.
   g. Drag the ATTRIBUTE5 column from the SQ_BCI_CUSTOMERS source qualifier to a blank port in the EXP_CUSTOMERS expression.
   h. Drag the ATTRIBUTE5 column from the EXP_CUSTOMERS expression to a blank port in the MAPO_CUSTOMERS output transformation.
5. Edit the SQL override in the source qualifier.
   a. Double-click the SQ_BCI_CUSTOMERS source qualifier.

   b. Click the Ports tab.
   c. Scroll to the bottom and verify that the ATTRIBUTE5 port has been added.
   d. Click the Properties tab.
   e. Click the down arrow in the Value field for the SQL Query transformation attribute to open the SQL Editor.
   f. Scroll down and add the HZ_CUST_ACCOUNTS.ATTRIBUTE5 column immediately after the last column in the SELECT clause. Hint: Use the Ports tab on the left to add the column. Be sure to add a comma before HZ_CUST_ACCOUNTS.ATTRIBUTE5.

   g. Click OK to close the SQL Editor.
   h. Click Apply and OK in the Edit Transformations dialog box.
6. Validate the mapplet.
   a. If necessary, select View > Output to view the Output window.
   b. Select Mapplets > Validate to verify that there are no inconsistencies in the mapplet. You should get the message Mapplet mplt_BC_ORA_OrganizationDimension_Customer is VALID.
   c. If your mapplet is valid, select Repository > Save to update the repository.
7. Create a new custom expression transformation in the mapping.
   a. Return to the mapping by selecting Tools > Mapping Designer or by clicking the Mapping Designer icon on the toolbar.
   b. Select Transformation > Create.
   c. Select Expression in the list.
   d. Enter X_CUSTOM as the name.
   e. Click Create.
   f. Click Done.
8. Map the column in the mapping.
   a. Drag the ATTRIBUTE5 column from the mplt_BC_ORA_OrganizationDimension_Customer mapplet to the X_CUSTOM expression.
   b. Double-click the X_CUSTOM expression to open the Edit Transformations dialog box.
   c. Click the Ports tab.
   d. As a best practice, rename the port to indicate both the table and column it comes from: HZ_CUST_ACCOUNTS_ATTRIBUTE5. If the mapping is changed and the related exposed objects are replaced, this will make it easier to reconnect, because the custom expression will not be replaced.
   e. Click OK to close the Edit Transformations dialog box.
   f. Drag HZ_CUST_ACCOUNTS_ATTRIBUTE5 from the X_CUSTOM expression to the X_ACCOUNT_LOG port in the W_ORG_DS target definition.
9. Validate your work and update the repository.

   a. If necessary, select View > Output to view the Output window.
   b. Click the Validate tab of the Output window and select Mappings > Validate to verify that there are no inconsistencies in the mapping. You should get the message Mapping SDE_ORA_OrganizationDimension_Customer is VALID.
   c. If your mapping is valid, select Repository > Save to update the repository.
10. Edit the workflow.
   a. Select Tools > Workflow Manager.
   b. Verify that the CUSTOM_SDE folder is open.
   c. If necessary, in Workflow Manager, select Tools > Workflow Designer to open Workflow Designer.
   d. Expand CUSTOM_SDE > Workflows.
   e. Drag the SDE_ORA_OrganizationDimension_Customer workflow into the Workflow Designer window.
   f. In the Workflow Designer window, double-click the SDE_ORA_OrganizationDimension_Customer task to open the Edit Tasks dialog box.
   g. In the General tab, verify that the Fail parent if this task fails and Fail parent if this task does not run options are selected.
   h. Click the Properties tab.
   i. Change the session log file name to CUSTOM_SDE.SDE_ORA_OrganizationDimension_Customer.log.
   j. Change the parameter file name to CUSTOM_SDE.SDE_ORA_OrganizationDimension_Customer.txt.
   k. Verify that the $Source connection value is set to $DBConnection_OLTP.
   l. Verify that the $Target connection value is set to $DBConnection_OLAP.
   m. Click the Config Object tab.
   n. For the Stop on errors attribute, enter a value of 1.
   o. Click the Mapping tab.
   p. In the left pane, select Sources > mplt_BC_ORA_OrganizationDimension_Customer.SQ_BCI_CUSTOMERS.
   q. Verify that the connection value is set to $DBConnection_OLTP.
   r. For Targets > W_ORG_DS, verify that the connection value is set to $DBConnection_OLAP.
   s. Change the Target load type attribute from Bulk to Normal.
   t. Click Apply.
   u. Click OK.
   v. Click the Validate tab in the Output window.
   w. Select Workflows > Validate.
   x. You should get the message Workflow SDE_ORA_OrganizationDimension_Customer is VALID.
   y. If your workflow is valid, select Repository > Save to update the repository.
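For reference, the edit you made to the SQ_BCI_CUSTOMERS SQL override in step 5 amounts to appending one column to the SELECT list. The sketch below is hypothetical and heavily abridged: the generated override is far longer, and every column shown except ATTRIBUTE5 is only a placeholder for that list.

```sql
-- Abridged sketch of the edited SQL override (not the actual generated query).
SELECT
    HZ_CUST_ACCOUNTS.CUST_ACCOUNT_ID,     -- placeholder for the many
    HZ_CUST_ACCOUNTS.ACCOUNT_NAME,        -- columns already selected
    HZ_CUST_ACCOUNTS.LAST_UPDATE_DATE,    -- formerly the last column
    HZ_CUST_ACCOUNTS.ATTRIBUTE5           -- new column; note the comma added before it
FROM HZ_CUST_ACCOUNTS
```

Forgetting the comma before the appended column is the most common cause of a mapplet validation failure at step 6.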


Practice 13-2: Creating a Custom SIL Mapping


Goals: To create a source independent load (SIL) mapping that can be used to move data into the W_ORG_D table from the W_ORG_DS staging table

Scenario: You have built an SDE mapping to extract data from the source system and load it into the W_ORG_DS staging table in the data warehouse. You now create an SIL mapping to move the data from the W_ORG_DS staging table into the W_ORG_D dimension table. To build the SIL mapping, you copy an existing mapping and workflow into a custom folder in Informatica and then modify the mapping and workflow.

Time: 15-20 minutes

Instructions:
1. Create a custom Informatica repository folder and copy the SIL mapping and workflow needed for customization.
   a. Return to Informatica Repository Manager, which should still be open.
   b. Select Oracle_BI_DW_Base in the Repository Navigator.
   c. Select Folder > Create.
   d. In the Create Folder dialog box, name the folder CUSTOM_SILOS.
   e. Click OK.
   f. Click OK to confirm that the folder was successfully created.
   g. Navigate to SILOS > Mappings > SIL_OrganizationDimension.
   h. Copy the SIL_OrganizationDimension mapping and paste it into the CUSTOM_SILOS folder.
   i. Click Yes when asked if you want to copy.
   j. Navigate to SILOS > Workflows > SIL_OrganizationDimension.
   k. Copy the SIL_OrganizationDimension workflow and paste it into the CUSTOM_SILOS folder.
   l. Click Yes when asked if you want to copy.
   m. In the Copy Wizard, select Reuse and Apply this resolution to > All Conflicts.
   n. Click Next.
   o. In the Copy Summary window, click Finish.
2. Edit the source definition to include the required columns.
   a. Select Tools > Designer to return to Informatica Designer.
   b. If you do not see the new CUSTOM_SILOS folder, right-click Oracle_BI_DW_Base, select Disconnect, and then reconnect as Administrator with password Administrator.
   c. Verify that the CUSTOM_SILOS folder is visible in the repository navigator.
   d. Open CUSTOM_SILOS.
   e. Select Tools > Source Analyzer.

   f. Navigate to CUSTOM_SILOS > Sources > OLAP > W_ORG_DS.
   g. Drag W_ORG_DS into the Source Analyzer window.
   h. Double-click W_ORG_DS in the Source Analyzer window to open the Edit Tables dialog box.
   i. Click the Columns tab.
   j. Scroll to the bottom and select the last column in the list, X_CUSTOM.
   k. Click the Add a new column to this table button, which creates a NEWFIELD column.
   l. Rename the NEWFIELD column to X_ACCOUNT_LOG and verify that the data type is set to varchar2(10).
   m. Click Apply.
   n. Click OK.
3. Edit the target definition to include the required columns.
   a. Select Tools > Target Designer.
   b. Navigate to CUSTOM_SILOS > Targets > W_ORG_D.
   c. Drag W_ORG_D into the Target Designer window.
   d. Double-click W_ORG_D in the Target Designer window to open the Edit Tables dialog box.
   e. Click the Columns tab.
   f. Scroll to the bottom and select the last column in the list, X_CUSTOM.
   g. Click the Add a new column to this table button, which creates a NEWFIELD column.
   h. Rename the NEWFIELD column to X_ACCOUNT_LOG and verify that the data type is set to VARCHAR2(10).
   i. Click Apply.
   j. Click OK.
4. Edit the SIL mapping.
   a. Select Tools > Mapping Designer.
   b. Navigate to CUSTOM_SILOS > Mappings > SIL_OrganizationDimension.
   c. Drag the SIL_OrganizationDimension mapping into the Mapping Designer window.
   d. Drag the X_ACCOUNT_LOG column from the W_ORG_DS source definition to the blank port below the X_CUSTOM port in the Sq_W_ORG_DS source qualifier.
   e. Locate and double-click the Fil_W_ORG_D filter.
   f. Click the Ports tab.
   g. Scroll to the bottom and select the X_CUSTOM port.
   h. Click the button to add a new port.
   i. Name the port X_ACCOUNT_LOG and verify that Prec is set to 10.
   j. Click Apply and OK.
   k. Drag the X_ACCOUNT_LOG column from the Sq_W_ORG_DS source qualifier to the corresponding port in the Fil_W_ORG_D filter.
   l. Drag the X_ACCOUNT_LOG column from the Fil_W_ORG_D filter to a blank port in the EXP_Custom expression.
   m. Drag the X_ACCOUNT_LOG column from the EXP_Custom expression to a blank port in the Upd_W_ORG_D_Ins_Upd update strategy.

   n. Drag the X_ACCOUNT_LOG column from the Upd_W_ORG_D_Ins_Upd update strategy to the corresponding column in the W_ORG_D target definition.
5. Edit the SQL override in the source qualifier.
   a. Double-click the Sq_W_ORG_DS source qualifier.
   b. Click the Properties tab.
   c. Click the down arrow in the Value field for the SQL Query transformation attribute to open the SQL Editor.
   d. Add the W_ORG_DS.X_ACCOUNT_LOG column immediately after W_ORG_DS.X_CUSTOM in the SELECT clause. Hint: Use the Ports tab in the left pane to add the column. Be sure to add a comma after W_ORG_DS.X_CUSTOM.
   e. Click OK to close the SQL Editor.
   f. Click Apply and OK in the Edit Transformations dialog box.
6. Validate your work and update the repository.
   a. If necessary, select View > Output to view the Output window.
   b. Click the Validate tab of the Output window and select Mappings > Validate to verify that there are no inconsistencies in the mapping. You should get the message Mapping SIL_OrganizationDimension is VALID.
   c. If your mapping is valid, select Repository > Save to update the repository.
7. Edit the workflow.
   a. Select Tools > Workflow Manager.
   b. In order to see the new CUSTOM_SILOS folder, right-click Oracle_BI_DW_Base, select Disconnect, and then reconnect as Administrator with password Administrator.
   c. Verify that the CUSTOM_SILOS folder is visible.
   d. Open the CUSTOM_SILOS folder.
   e. Select Tools > Workflow Designer.
   f. Navigate to CUSTOM_SILOS > Workflows > SIL_OrganizationDimension.
   g. Drag the SIL_OrganizationDimension workflow into the Workflow Designer window.
   h. Double-click the SIL_OrganizationDimension task.
   i. In the General tab, verify that the Fail parent if this task fails and Fail parent if this task does not run options are selected.
   j. Click the Properties tab.
   k. Change the session log file name to CUSTOM_SILOS.SIL_OrganizationDimension.log.
   l. Change the parameter file name to CUSTOM_SILOS.SIL_OrganizationDimension.txt.
   m. Verify that the $Source connection value is set to $DBConnection_OLAP.
   n. Verify that the $Target connection value is set to $DBConnection_OLAP.
   o. Click the Config Object tab.
   p. For the Stop on errors attribute, enter a value of 1.
   q. Click the Mapping tab.
   r. For Sources > Sq_W_ORG_DS, verify that the connection value is set to $DBConnection_OLAP.

   s. For Targets > W_ORG_D, verify that the connection value is set to $DBConnection_OLAP.
   t. Verify that the Target load type is set to Normal.
   u. Click Apply.
   v. Click OK.
   w. Select Workflows > Validate.
   x. You should get the message Workflow SIL_OrganizationDimension is VALID.
   y. If your workflow is valid, select Repository > Save to update the repository.
   z. Close all open Informatica applications.
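As in the SDE practice, the step 5 override edit here simply extends the SELECT list past the formerly last column. The following is a hypothetical, abridged sketch; only X_CUSTOM and X_ACCOUNT_LOG are real to this practice, and the other columns stand in for the much longer generated list:

```sql
-- Abridged sketch of the edited Sq_W_ORG_DS override (not the actual query).
SELECT
    W_ORG_DS.DATASOURCE_NUM_ID,   -- placeholder for the preceding columns
    W_ORG_DS.INTEGRATION_ID,
    W_ORG_DS.X_CUSTOM,            -- comma added after the formerly last column
    W_ORG_DS.X_ACCOUNT_LOG        -- new column appended in this practice
FROM W_ORG_DS
```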


Practice 13-3: Adding DAC Tasks and Running Customized ETL


Goals: To configure the DAC to extract data from the source and load it into the custom column in the data warehouse

Scenario: You have built the sessions and workflows in Informatica Workflow Manager to run the SDE and SIL mappings. Now you must modify DAC tasks, add them to a custom subject area and execution plan, and run the ETL to load the data into the custom column in the W_ORG_D dimension table in the data warehouse.

Time: 40-45 minutes

Instructions:
1. If necessary, log in to the DAC client.
   a. Select Start > Programs > Oracle Business Intelligence > Oracle DAC > DAC Client.
   b. Log in as dac with password dac.
2. Add the new column object for the W_ORG_DS and W_ORG_D tables to the DAC. Recall that when you built the SDE and SIL mappings in the two previous practices, you added the X_ACCOUNT_LOG column to the W_ORG_DS and W_ORG_D tables. Now you must add this new column object to the data warehouse. There are two methods for adding a new object to the data warehouse. One method is to use the DAC's Data Warehouse Configurator to create the physical columns for the tables in the data warehouse database. The other method is to add the column definitions directly in the data warehouse database and then use the DAC's Import from Database command to add the new columns in the DAC. In this step and the steps that follow, you use the first method. You use the second method in the next lesson.
   a. Click Design to open the DAC Design view.
   b. If necessary, select the Custom container in the list.
   c. Select Tables in the top pane.
   d. Query for W_ORG_DS.
   e. Click the Columns child tab.
   f. Query to confirm that the X_ACCOUNT_LOG column does not yet exist in W_ORG_DS.
   g. Click New.
   h. Enter the following values:
      Name        X_ACCOUNT_LOG
      Position    226
      Data Type   VARCHAR
      Length      150
      Precision   0
      Nullable    Selected

   i. Click Save.
   j. Return to the top pane Tables tab, query for W_ORG_D, and repeat the steps to add the same column to W_ORG_D with the following values (position is the only value that is different):

      Name        X_ACCOUNT_LOG
      Position    255
      Data Type   VARCHAR
      Length      150
      Precision   0
      Nullable    Selected

   k. Click Save.
3. Use SQL*Plus to verify that the column does not yet exist in the tables in the data warehouse.
   a. Select Start > Programs > Oracle OraDb11g_home > Application Development > SQL Plus to open SQL*Plus.
   b. Log in as obaw with password obaw.
   c. At the SQL> prompt, enter:
select x_account_log from W_ORG_DS;

   d. You should receive the error message: X_ACCOUNT_LOG invalid identifier.
   e. Repeat for W_ORG_D.
   f. Leave SQL*Plus open.
4. Create the new column in the data warehouse database.
   a. Select Tools > ETL Management > Configure.
   b. If necessary, select Oracle as the source and target databases and click OK.
   c. In the Data Warehouse Configuration Wizard, select Create Data Warehouse Tables and click Next.
   d. Enter or verify the data warehouse information:

      Container          Custom
      Table Owner        OBAW
      Password           OBAW
      ODBC Data Source   OBAW
      Data Area          OBAW_DATA
      Index Area         OBAW_INDEX

   e. Click Start. This may take a few minutes.
   f. You should receive a message that all tasks successfully finished. If you receive an error message, check the log files.
   g. Click Finish.
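For this change, the wizard's effect is roughly equivalent to the DDL below. This is a sketch only: the configurator derives its statements from the DAC metadata, and what it actually generates may differ in detail.

```sql
-- Approximation of the wizard's work for the new column; the DAC
-- definitions above (VARCHAR, length 150, nullable) map to VARCHAR2(150).
ALTER TABLE W_ORG_DS ADD (X_ACCOUNT_LOG VARCHAR2(150));
ALTER TABLE W_ORG_D  ADD (X_ACCOUNT_LOG VARCHAR2(150));
```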


   h. Return to SQL*Plus and run the select statements again to verify that the X_ACCOUNT_LOG column now exists in both W_ORG_DS and W_ORG_D. You should receive messages that 888 rows exist in W_ORG_DS and 889 rows exist in W_ORG_D.
   i. Leave SQL*Plus open.
5. Create custom logical and physical task folders for the custom folders you created in the Informatica repository.
   a. Navigate to Tools > Seed Data > Task Folders.
   b. Use the New button to create four new custom logical and physical task folders for the custom folders you created in the Informatica repository:
      Name             Type
      Custom_Extract   Logical
      Custom_Load      Logical
      CUSTOM_SDE       Physical
      CUSTOM_SILOS     Physical

   c. Click Save.
   d. Click Close.
6. Register the new custom folders.
   a. Navigate to Design > Source System Folders.
   b. Use the New button and the Edit tab to register the four new custom folders:

      Logical Folder   Physical Folder
      Custom_Extract   CUSTOM_SDE
      Custom_Load      CUSTOM_SILOS

7. Modify existing tasks to use the custom mappings and workflows.
   a. Navigate to Design > Tasks.
   b. Query for SDE_ORA_OrganizationDimension_Customer.
   c. In the Edit child tab, change the folder name to Custom_Extract.
   d. For this exercise, change Command for Full Load from SDE_ORA_OrganizationDimension_Customer_Full to SDE_ORA_OrganizationDimension_Customer. This is because you copied and modified only the SDE_ORA_OrganizationDimension_Customer workflow in the CUSTOM_SDE folder. This is for training purposes only and is not the recommended procedure.
   e. Click Save and click Yes to confirm the changes in the Updating message.
   f. Right-click SDE_ORA_OrganizationDimension_Customer and select Synchronize Tasks.
   g. Click Yes when prompted to proceed in the Synchronizing Tasks message. This may take a few moments.
   h. When synchronization completes, click OK in the Synchronizing task(s) message box.


   i. Click the Target Tables child tab and verify that Truncate Always and Truncate for Full Load are both selected for the W_ORG_DS target table.
   j. Return to the Tasks tab in the top pane and query for Load into Organization Dimension.
   k. In the Edit child tab, change the folder name to Custom_Load.
   l. For this exercise, change Command for Full Load from SIL_OrganizationDimension_Full to SIL_OrganizationDimension. This is because you copied and modified only the SIL_OrganizationDimension workflow in the CUSTOM_SILOS folder. This is for training purposes only and is not the recommended procedure.
   m. Click Save and click Yes to confirm the changes.
   n. Right-click Load into Organization Dimension and select Synchronize Tasks.
   o. Click Yes in the Synchronizing task message box.
   p. When synchronization completes, click OK in the Synchronizing task(s) message box.
8. Create a custom subject area.
   a. Click the Subject Areas tab.
   b. Click New.
   c. Enter Custom Organization Dimension as the name of the subject area and click Save.
   d. Select the Tables child tab.
   e. Click Add/Remove.
   f. In the Choose Tables dialog box, query for W_ORG_D.
   g. Click Add to add W_ORG_D to the custom subject area.
   h. Click OK in the Adding window to acknowledge that W_ORG_D is added to the subject area.
   i. Click OK to close the Choose Tables dialog box.
   j. Click Save.
9. Add the tasks to the subject area.
   a. Verify that the Custom Organization Dimension subject area is selected.
   b. Click the Tasks child tab.
   c. Use the Add/Remove button to add the two tasks to the subject area: SDE_ORA_OrganizationDimension_Customer and Load into Organization Dimension. For this exercise and demonstration only, it is acceptable to manually add just these two tasks to the subject area instead of assembling the subject area.
   d. Click Save.
10. Create a custom execution plan and add the custom subject area to it.
   a. Click the Execute button to open the Execute view.
   b. Display the Execution Plans tab and click New.
   c. Enter Custom Organization Dimension as the name of the execution plan and click Save.
   d. Select the Subject Areas child tab.
   e. Click Add/Remove.

Oracle BI Applications 7.9: Implementation for Oracle EBS

f. In the left table list, select Custom Organization Dimension and click Add to add it to the right list of subject areas belonging to the custom execution plan.
g. Click OK in the Adding window to acknowledge that the Custom Organization Dimension subject area was added to the execution plan.
h. Click OK.
i. Click Save.
11. Generate execution plan parameters for the custom execution plan.
a. In the Execution Plan list, verify that Custom Organization Dimension is selected.
b. Select the Parameters child tab.
c. Click Generate.
d. Click Yes to proceed. A list of parameters is generated.
e. Click OK to acknowledge successful parameter generation.
f. Set the DATASOURCE parameters:
   DBConnection_OLTP: ORA_11_5_10
   DBConnection_OLAP: DataWarehouse

g. If you need to verify that the values are correct, you can navigate to Setup > Physical Data Sources and check the names of the Source and Warehouse type connections.
h. Leave the FOLDER parameters as they are.
i. Click Save.
12. Build the execution plan.
a. Verify that the Custom Organization Dimension execution plan is selected and click Build.
b. Accept the default and click OK in the Building window.
c. In the second Building window, again accept the default to perform the operation for the selected record only and click OK. Building the execution plan may take several minutes.
d. Click OK to acknowledge that the Custom Organization Dimension execution plan has been successfully built.
e. Click Refresh in the top pane. Notice that a time stamp now appears in the Last Designed column for the execution plan.
f. Click the Ordered Tasks child tab for the Custom Organization Dimension execution plan and verify that the SDE_ORA_OrganizationDimension_Customer and Load into Organization Dimension tasks are listed. The QUERY_INDEX_CREATION task is also listed.
13. Reset the refresh dates for the data warehouse. Because the full load you ran in the previous practice is recorded in the DAC repository, the refresh dates for the transactional source tables are stored for your last run. This would cause an incremental update to be initiated based on the refresh dates. This step clears the refresh dates from the source tables in the OLTP and allows a full load for your custom execution plan.
a. Click Setup and select the Physical Data Sources tab.
b. Select the ORA_11_5_10 connection, select the Refresh Dates subtab, and verify that there are refresh dates corresponding to your initial load of the OBAW.
c. Repeat for the DataWarehouse connection.
d. Select Tools > ETL Management > Reset Data Warehouse.
e. In the Reset Data Warehouse dialog box, confirm that you want to reset by entering the text and clicking Yes.
f. Click OK to confirm that the reset was successful.
g. Click Refresh in the lower pane and verify that the refresh dates for the transactional and data warehouse tables are cleared.
14. Before running the execution plan, query W_ORG_D and verify that there is no data in the X_ACCOUNT_LOG column.
a. Return to SQL*Plus.
b. If necessary, log in as obaw with password obaw.
c. Run the following SQL statement:
select count(*) from W_ORG_D where X_ACCOUNT_LOG is NOT NULL;

d. Verify that no records are returned.
15. Verify that the DAC Server is started.
a. Verify that the DAC Server Monitor icon in the upper-right corner of the DAC client resembles an orange electrical plug in a socket, which means that the client is connected to the server and the server is idle. When you mouse over the orange icon, it should say DAC Server is idle. When the DAC client cannot establish a connection to the DAC Server, the Server Monitor icon resembles a red electrical plug. If the client is connected to a server that is running an ETL process, the icon resembles a green electrical plug with a lightning bolt superimposed on it.
b. If the DAC Server is not started, select Start > Programs > Oracle Business Intelligence > Oracle DAC > Start DAC Server and verify that the DAC Server Monitor icon resembles an orange electrical plug in a socket before continuing.
16. Run the custom execution plan.
a. Return to the Execution Plans tab and select the Custom Organization Dimension execution plan.
b. Click the Run Now button in the top pane toolbar.
c. In the Starting ETL dialog box, click Yes to confirm that you want to start the execution plan.
d. Click OK to acknowledge that the request has been successfully submitted to the Informatica Server.
17. Monitor the ETL plan execution.
a. Select the Current Run tab.
b. Select the Custom Organization Dimension run and confirm that it has a Run Status of Running. Note that the DAC Server Monitor icon has changed from orange to green, indicating that a plan is being executed.
c. Write down the process ID of the current run.
d. Click Refresh to refresh the status.
e. Click Auto Refresh to verify the refresh frequency of 30 seconds.
f. Select the Tasks tab in the bottom pane to view task status within the execution plan.
g. Use the list to view different task statuses.
18. View the run history. When all tasks have completed (about 2 minutes), select the Run History tab in the top pane to view the log file for the execution plan you just ran.
a. Select the Custom Organization Dimension execution plan you just ran.
b. Verify that the run status is Completed, the number of failed tasks is 0, and the number of successful tasks is 3.
c. Right-click Custom Organization Dimension and select Get Run Information > Get log file.
d. In the Input dialog box, click OK to accept the default log file name.
e. In the Fetching log file dialog box, note the path to which the log file has been saved and click OK.
f. In Windows Explorer, navigate to C:\OracleBI\DAC\ServerLog and open Custom_Organization_Dimension.#.log. Note that the naming convention of the log files includes the ETL process ID that you recorded in a previous step.
g. Scroll down to the bottom of the file to the list of step statuses and confirm that there are no steps listed under Failed Sessions or Queued Sessions.
19. After running the execution plan, query W_ORG_D and verify that there is now data in the X_ACCOUNT_LOG column.
a. Return to SQL*Plus, which should still be open. If not, open SQL*Plus and log in as obaw with password obaw.
b. Run the following SQL statement:
select count(*) from W_ORG_D where X_ACCOUNT_LOG is NOT NULL;

c. Verify that 27 records are returned.
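To spot-check the loaded values rather than just the count, you can run a quick follow-up query in the same SQL*Plus session. The column list below is only an illustration; any other W_ORG_D columns can be added:

```sql
-- Peek at a handful of the rows that now carry the custom column.
SELECT ROW_WID, X_ACCOUNT_LOG
FROM   W_ORG_D
WHERE  X_ACCOUNT_LOG IS NOT NULL
AND    ROWNUM <= 5;  -- limit the output to five rows (Oracle syntax)
```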


Lesson 14: Adding a New Dimension in the OBAW

Practice 14-1: Adding a New Dimension in the OBAW


Goals: To create a new dimension table and a new dimension staging table in the data warehouse

Scenario: In this practice, you run a DDL script to create a new dimension table and a dimension staging table based on the standard data warehouse structure (with the appropriate system columns). You then register the new dimension table and its staging table in the DAC repository and associate them with the appropriate database connection.

Time: 15-20 minutes

Instructions:
1. Run a script to create a new dimension table and a new dimension staging table in the data warehouse.
a. If necessary, open SQL*Plus and log in as obaw with password obaw.
b. Navigate to C:\PracticeFiles.
c. Open the partner.sql file and examine the SQL. The script creates two new tables: a dimension table named WC_PARTNER_D and a dimension staging table named WC_PARTNER_DS. Note that the dimension staging table contains the required columns: DATASOURCE_NUM_ID and INTEGRATION_ID. The dimension table contains these two required columns as well as the required ETL_PROC_WID column.
d. Copy the SQL and paste it into SQL*Plus.
e. Run the script.
f. Close SQL*Plus.
g. Close the partner.sql file.
2. Import the tables into the DAC.
a. If necessary, open the DAC client.
b. Navigate to the Design view.
c. Verify that the Custom container is selected.
d. Click the Tables tab in the top pane.
e. Right-click anywhere in the list and select Import from database > Import Database Tables.
f. Choose the DataWarehouse data source.
g. In the Table Name Filter field, enter WC_PARTNER*.
h. Click Read Tables.
i. Click OK to confirm that reading tables is complete.
j. Select Import for the WC_PARTNER_DS and WC_PARTNER_D tables.
k. Click Import Tables.
l. Click OK to confirm that the import was successful.
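The authoritative DDL is the partner.sql file itself. As a rough sketch of what step 1 describes, the two tables follow the standard warehouse pattern below; the business attribute names and all column lengths are illustrative assumptions, not a copy of the shipped script:

```sql
-- Sketch only: see C:\PracticeFiles\partner.sql for the real definitions.
CREATE TABLE WC_PARTNER_DS (          -- dimension staging table
  ROW_ID            VARCHAR2(30),
  PARTNER_NAME      VARCHAR2(50),
  PARTNER_LOC       VARCHAR2(50),
  DATASOURCE_NUM_ID NUMBER(10) NOT NULL,   -- required: identifies the source system
  INTEGRATION_ID    VARCHAR2(30) NOT NULL  -- required: natural key from the source
);

CREATE TABLE WC_PARTNER_D (           -- dimension table
  ROW_WID           NUMBER(10),             -- surrogate key, populated by the SIL mapping
  PARTNER_NAME      VARCHAR2(50),
  PARTNER_LOC       VARCHAR2(50),
  DATASOURCE_NUM_ID NUMBER(10) NOT NULL,
  INTEGRATION_ID    VARCHAR2(30) NOT NULL,
  ETL_PROC_WID      NUMBER(10)              -- required: set from MPLT_GET_ETL_PROC_WID
);
```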
3. Set properties for the tables.
a. Query for the WC_PARTNER_DS and WC_PARTNER_D tables in the list, or select Original from the list.
b. Set the Dimension table type for WC_PARTNER_D and the Dimension Stage table type for WC_PARTNER_DS.
c. Set the Warehouse flag for both tables.
d. Save.
4. Import table columns.
a. Select the WC_PARTNER_D table.
b. Click the Columns subtab and notice that no columns are visible for this table.
c. Right-click WC_PARTNER_D and select Import From Database > Import Database Columns.
d. Accept the default, Selected record only, and click OK.
e. Pick the DataWarehouse data source.
f. Click Read Columns.
g. Click OK to confirm that reading the columns was successful. The columns appear in the column list.
h. Click Import Columns.
i. Click OK to confirm that importing the columns was successful.
j. Click Refresh in the Columns subtab to verify that the columns are now visible in the DAC and that all the column properties are accurate. No properties need to be modified.
k. Save.
l. Repeat the steps for WC_PARTNER_DS.
5. Add a foreign key column to W_GL_REVN_F.
a. Select All in the list.
b. Query for W_GL_REVN_F.
c. In the Columns child tab, click Refresh to view the columns.
d. Sort on Position. There should be 108 columns in W_GL_REVN_F.
e. Add a new column named PARTNER_WID with the following properties:
   Position: 109
   Data Type: Number
   Length: 10
   Precision: 0
   Foreign Key to Table: WC_PARTNER_D
   Foreign Key to Column: ROW_WID
   Nullable: Selected
   Default Value: 0

6. Add a foreign key column to W_GL_REVN_FS.
a. Select All in the list.
b. Query for W_GL_REVN_FS.
c. In the Columns child tab, sort on Position. There should be 102 columns in W_GL_REVN_FS.
d. Add a new column named PARTNER_ID with the following properties:
   Position: 103
   Data Type: VARCHAR
   Length: 30
   Precision: 0
   Nullable: Selected
   Default Value: 0

7. Create the new PARTNER_WID and PARTNER_ID columns in the W_GL_REVN_F and W_GL_REVN_FS tables in the data warehouse database.
a. Select Tools > ETL Management > Configure.
b. Select Oracle as the source and target database and click OK.
c. In the Data Warehouse Configuration Wizard, select Create Data Warehouse Tables and click Next.
d. Enter or verify the data warehouse information:
   Container: Custom
   Table Owner: obaw
   Password: obaw
   ODBC Data Source: obaw
   Data Area: obaw_data
   Index Area: obaw_index

e. Click Start. This may take a few minutes.
f. You should receive a message that all tasks finished successfully. If you receive an error message, check the log files.
g. Click Finish.
h. Navigate to C:\OracleBI\dac\conf\sqlgen\ctlfile to examine the SQL control file.
i. Open the oracle_bi_dw.ctl file with Notepad.
j. Search for W_GL_REVN_F.
k. Scroll to column number 109 and verify that the PARTNER_WID column was generated.
l. Search for W_GL_REVN_FS.
m. Scroll to column number 103 and verify that the PARTNER_ID column was generated.
n. Return to SQL*Plus.
o. Run desc W_GL_REVN_F; to verify that the PARTNER_WID column now exists in the W_GL_REVN_F table in the data warehouse database.
p. Run desc W_GL_REVN_FS; to verify that the PARTNER_ID column now exists in the W_GL_REVN_FS table in the data warehouse database.
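As an alternative to desc, the Oracle data dictionary can confirm both new columns in a single statement. This assumes you are connected as the obaw table owner:

```sql
-- Each row returned confirms one of the newly generated columns.
SELECT table_name, column_name, data_type, column_id
FROM   user_tab_columns
WHERE  (table_name = 'W_GL_REVN_F'  AND column_name = 'PARTNER_WID')
   OR  (table_name = 'W_GL_REVN_FS' AND column_name = 'PARTNER_ID');
```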


Practice 14-2: Creating an SDE Mapping to Load the Dimension Staging Table
Goals: To create a new SDE mapping to load the dimension staging table

Scenario: You have a spreadsheet (CSV file) with the dimension data that you want to import into the data warehouse. You use Informatica tools to create a new SDE mapping to load the data into the dimension staging table that you created in the previous practice.

Time: 15-20 minutes

Instructions:
1. Import the source file into Informatica.
a. Navigate to C:\PracticeFiles and open the Partner.csv file.
b. Examine the data. The file contains five rows of data with values for row ID, partner name, and partner location.
c. Close the file.
d. Copy the Partner.csv file and paste it into C:\Informatica\PowerCenter8.1.1\server\infa_shared\SrcFiles.
e. Open Informatica Designer.
f. Open the CUSTOM_SDE folder.
g. Select Tools > Source Analyzer.
h. Select Sources > Import from File.
i. In the Open Flat File dialog box, navigate to C:\Informatica\PowerCenter8.1.1\server\infa_shared\SrcFiles.
j. Select All Files in the Files of type list.
k. Select Partner.csv and click OK.
l. In the Flat File Import Wizard, verify that Delimited is selected.
m. Enter Partner as the name for this source.
n. Select Import field names from first line.
o. Click Next.
p. Accept all defaults in step 2 of the Flat File Import Wizard.
q. Click Next.
r. Change Length/Prec for PARTNER_NAME and PARTNER_LOC to 50.
s. Click Finish. The Partner source appears in the Source Analyzer and is added as a source in CUSTOM_SDE > Sources > FlatFile.
t. Save the repository.
2. Import the WC_PARTNER_DS target.
a. Select Tools > Target Designer.

b. Select Targets > Import from Database.
c. Select the obaw ODBC data source.
d. Enter obaw as Username, Owner Name, and Password.
e. Click Connect.
f. Expand obaw > Tables, select WC_PARTNER_DS, and click OK. WC_PARTNER_DS appears in the Target Designer window and is added as a target in CUSTOM_SDE > Targets.
g. Save the repository.
3. Create the SDE mapping.
a. Select Tools > Mapping Designer.
b. Select Mappings > Create.
c. Name the new mapping SDE_Custom_PartnerDimension and click OK.
4. Add the source to the SDE mapping. First, turn on the option to create a Source Qualifier transformation when adding a new source.
a. Select Tools > Options.
b. In the Options dialog box, select the Tables tab.
c. Select Mapping Designer in the Tools list and verify that the Create Source Qualifiers when opening Sources option is selected.
d. Click OK.
e. Navigate to CUSTOM_SDE > Sources > FlatFile.
f. Drag the Partner source into the Mapping Designer window and notice that a Source Qualifier transformation is created.
5. Add an Expression transformation to the mapping.
a. Select Transformation > Create.
b. Select Expression in the list.
c. Enter EXPTRANS as the name for the transformation.
d. Click Create.
e. Click Done.
f. Drag all three columns from the SQ_Partner Source Qualifier to the EXPTRANS expression.
g. Select Mappings > Parameters and Variables to open the Declare Parameters and Variables dialog box.
h. Click the Add a new variable to this table button.
i. Add a new parameter with the following values:
   Name: $$DATASOURCE_NUM_ID
   Type: Parameter
   Data type: decimal
   Prec: 10
   Scale: 0
   Initial Value: 15

j. Click OK to close the Declare Parameters and Variables dialog box.



k. Double-click the EXPTRANS expression.
l. Click the Ports tab.
m. Add a new output port with the following properties:
   Port Name: DATASOURCE_NUM_ID
   Data type: decimal
   Prec: 10
   Scale: 0
   I: Not selected
   O: Selected
   V: Not selected
n. Click the down arrow in the Expression field to open Expression Builder.
o. Delete the existing expression.
p. Click the Variables tab.
q. Expand Mapping parameters.
r. Double-click $$DATASOURCE_NUM_ID to add it to the expression.
s. Click OK.
t. Click Apply.
u. Click OK.

6. Add the target to the mapping.
a. Drag the WC_PARTNER_DS target to the right of the EXPTRANS Expression transformation.
b. Drag ROW_ID from EXPTRANS to INTEGRATION_ID in WC_PARTNER_DS.
c. Link the other three columns in EXPTRANS to their corresponding columns in WC_PARTNER_DS.
d. Select Mappings > Validate.
e. You should receive the message Mapping SDE_Custom_PartnerDimension is VALID.
f. Save the repository.
7. Create the SDE workflow. Typically, you would create two workflows: one to be used for a full load and the other to be used for an incremental load. Both workflows are based on the same mapping, which is executed during both full and incremental loads. This provides an opportunity to tune each of these load scenarios. For the purposes of this training, you create only one workflow for all the mappings in this set of practices. Please note that this is for training purposes only and is not the recommended practice.
a. Select Tools > Workflow Manager.
b. Verify that the CUSTOM_SDE folder is open.
c. Select Workflows > Create.
d. Name the workflow SDE_Custom_PartnerDimension and click OK.
e. Select Tasks > Create.
f. Leave the task type set to Session and name the task SDE_Custom_PartnerDimension.
g. Click Create.

h. Associate the SDE_Custom_PartnerDimension mapping with the session and click OK.
i. Click Done to close the Create Task dialog box.
j. Select Tasks > Link Task and link the Start task to the SDE_Custom_PartnerDimension session.
8. Edit the workflow session properties.
a. Double-click the SDE_Custom_PartnerDimension task.
b. On the General tab, select both Fail parent if this task fails and Fail parent if this task does not run.
c. Click the Properties tab.
d. Verify that the value for the $Source connection value attribute is empty.
e. For the $Target connection value attribute, click the down arrow in the Value field.
f. Select Use Connection Variable and set the variable to $DBConnection_OLAP.
g. Click OK.
h. Click the Config Object tab.
i. For the Stop on errors attribute, enter a value of 1.
j. Click the Mapping tab.
k. In the left pane, select the Sources node and then the SQ_Partner source.
l. Under Readers in the right pane, verify that the SQ_Partner reader is set to File Reader.
m. In the left pane, select Targets > WC_PARTNER_DS.
n. In the Connection settings in the right pane for the WC_PARTNER_DS instance, click the down-arrow button in the Value property to edit its target connection.
o. In the Relational Connection Browser, click the Use Connection Variable option button, enter $DBConnection_OLAP, and click OK.
p. Set the target load type to Normal.
q. Click Apply.
r. Click OK.
9. Validate the workflow.
a. Select Workflows > Validate.
b. You should receive the message Workflow SDE_Custom_PartnerDimension is VALID.
c. Save the repository.


Practice 14-3: Creating an SIL Mapping to Load the Dimension Table


Goals: To create a new SIL mapping to load the dimension table

Scenario: In the previous practice, you created an SDE mapping to load the dimension staging table. In this practice, you use Informatica tools to create a new SIL mapping to load the dimension table.

Time: 15-20 minutes

Instructions:
1. Import the SIL source.
a. In Informatica Designer, open the CUSTOM_SILOS folder.
b. Select Tools > Source Analyzer.
c. Select Sources > Import from Database.
d. Select the obaw ODBC data source.
e. Set the username, owner name, and password to obaw and click Connect.
f. Select obaw > Tables > WC_PARTNER_DS and click OK. WC_PARTNER_DS appears in the Source Analyzer window and is added as a source in CUSTOM_SILOS > Sources.
2. Import the SIL target.
a. Select Tools > Target Designer.
b. Select Targets > Import from Database.
c. Select the obaw ODBC data source.
d. Enter obaw as Username, Owner Name, and Password and click Connect.
e. Expand obaw > Tables, select WC_PARTNER_D, and click OK. WC_PARTNER_D appears in the Target Designer window and is added as a target in CUSTOM_SILOS > Targets.
f. Double-click WC_PARTNER_D in the Target Designer window.
g. Click the Columns tab.
h. Change the ROW_WID key type to PRIMARY KEY.
i. Click Apply and OK.
j. Save the repository.
3. Create the SIL mapping.
a. Select Tools > Mapping Designer.
b. Select Mappings > Create.
c. Name the mapping SIL_Custom_PartnerDimension and click OK.

4. Add the source and target to the mapping.
a. Drag the WC_PARTNER_DS source into the mapping.
b. Drag the WC_PARTNER_D target into the mapping and place it to the right of the SQ_WC_PARTNER_DS Source Qualifier.
c. Link the four columns in the SQ_WC_PARTNER_DS Source Qualifier to their corresponding columns in the WC_PARTNER_D target definition.
5. Add a mapplet to the mapping that retrieves the ETL process ID.
a. Navigate to CUSTOM_SILOS > Mapplets > MPLT_GET_ETL_PROC_WID. (This mapplet was copied to this folder when you copied the SIL_OrganizationDimension mapping in an earlier practice.)
b. Drag MPLT_GET_ETL_PROC_WID into the mapping and place it near the target definition.
c. Drag INTEGRATION_ID from the Source Qualifier to the corresponding column in the MPLT_GET_ETL_PROC_WID mapplet.
d. Drag ETL_PROC_WID from the MPLT_GET_ETL_PROC_WID mapplet to the corresponding column in the target definition.
6. Add a Sequence Generator transformation to the mapping that populates ROW_WID in the target definition.
a. Select Transformation > Create.
b. Select Sequence Generator in the type list.
c. Name the transformation SEQTRANS.
d. Click Create.
e. Click Done.
f. Drag NEXTVAL from the SEQTRANS transformation to ROW_WID in the target definition.
7. Validate the mapping.
a. Select Mappings > Validate. You should receive the message Mapping SIL_Custom_PartnerDimension is VALID.
b. Save the repository.
8. Create a new workflow for the SIL_Custom_PartnerDimension mapping.
a. Open Workflow Manager.
b. Verify that the CUSTOM_SILOS folder is open.
c. Create a new workflow named SIL_Custom_PartnerDimension.
d. Select Tasks > Create.
e. Leave the task type set to Session and name the task SIL_Custom_PartnerDimension.
f. Click Create.
g. Associate the SIL_Custom_PartnerDimension mapping with the session and click OK.
h. Click Done to close the Create Task dialog box.
i. Select Tasks > Link Task and link the Start task to the SIL_Custom_PartnerDimension session.
9. Edit the workflow session properties.

a. Double-click the SIL_Custom_PartnerDimension session.
b. On the General tab, select both Fail parent if this task fails and Fail parent if this task does not run.
c. Click the Properties tab.
d. Set the $Source connection value and $Target connection value attributes to $DBConnection_OLAP.
e. Click the Config Object tab.
f. For the Stop on errors attribute, enter a value of 1.
g. Click the Mapping tab.
h. In the left pane, select Sources > SQ_WC_PARTNER_DS.
i. In the Connections settings in the right pane for the SQ_WC_PARTNER_DS instance, click the down-arrow button in the Value property to edit its source connection.
j. In the Relational Connection Browser, click the Use Connection Variable option button, enter $DBConnection_OLAP, and then click OK.
k. In the left pane, select Targets > WC_PARTNER_D.
l. In the Connection settings in the right pane for the WC_PARTNER_D instance, click the down-arrow button in the Value property to edit its target connection.
m. In the Relational Connection Browser, click the Use Connection Variable option button, enter $DBConnection_OLAP, and then click OK.
n. Set the target load type to Normal.
o. Click Apply.
p. Click OK.
10. Validate the workflow.
a. Select Workflows > Validate. You should receive the message Workflow SIL_Custom_PartnerDimension is VALID.
b. Save the repository.


Practice 14-4: Creating an SDE Mapping to Load the Fact Staging Table
Goals: To create a new SDE mapping to load the fact staging table

Scenario: You copy and modify an existing mapping and workflow to create a custom SDE mapping and workflow to load the fact staging table.

Time: 15-20 minutes

Instructions:
1. Copy an existing SDE mapping to the custom folder.
a. Open Informatica Repository Manager.
b. Navigate to SDE_ORA11510_Adaptor > Mappings.
c. Copy the SDE_ORA_GLRevenueFact mapping and paste it into the CUSTOM_SDE folder.
d. Click Yes to confirm the copy.
e. Expand CUSTOM_SDE > Mappings and verify that the SDE_ORA_GLRevenueFact mapping is copied.
2. Copy an existing SDE workflow to the custom folder.
a. Navigate to SDE_ORA11510_Adaptor > Workflows.
b. Copy the SDE_ORA_GLRevenueFact workflow and paste it into the CUSTOM_SDE folder.
c. Click Yes to confirm the copy. The Copy Wizard window opens. Select Reuse and Apply this resolution to all conflicts.
d. Click Next.
e. Click Finish.
f. Expand CUSTOM_SDE > Workflows and verify that the SDE_ORA_GLRevenueFact workflow is copied.
3. Modify the target for the SDE mapping.
a. Open Informatica Designer.
b. Right-click the CUSTOM_SDE folder and select Disconnect.
c. Right-click the CUSTOM_SDE folder again and select Connect to see the changes you made in Repository Manager.
d. Open the CUSTOM_SDE folder.
e. Navigate to CUSTOM_SDE > Targets.
f. Select Tools > Target Designer.
g. Drag W_GL_REVN_FS into the Target Designer.

h. Double-click W_GL_REVN_FS to open it.
i. Click the Columns tab.
j. Scroll to the bottom and select the X_CUSTOM column.
k. Click the Add a new column to this table button.
l. Name the column PARTNER_ID.
m. Set the data type to varchar2 and prec to 15.
n. Click Apply and OK.
o. Select Repository > Save.

4. Modify the SDE mapping.
a. Select Tools > Mapping Designer.
b. Drag SDE_ORA_GLRevenueFact into the Mapping Designer.
c. Select Transformation > Create.
d. Select Expression in the list.
e. Enter X_CUSTOM as the name.
f. Click Create.
g. Click Done.
h. Drag the CUST_TRX_TYPE_ID column from the mplt_BC_ORA_GLRevenueFact mapplet into the X_CUSTOM Expression transformation.
i. Double-click the X_CUSTOM transformation to open it.
j. Click the Ports tab.
k. Rename CUST_TRX_TYPE_ID to PARTNER_ID. For this exercise, you rename CUST_TRX_TYPE_ID and use it as the foreign key to the new WC_PARTNER_D dimension table. This is because the Partner spreadsheet you use as a source for the dimension uses known RA_CUSTOMER_TRX_ALL.CUST_TRX_TYPE_ID values as ROW_ID values. Please note this is for training purposes only and is not the recommended practice.
l. Click Apply and OK.
m. Drag PARTNER_ID from the X_CUSTOM Expression transformation to PARTNER_ID in the W_GL_REVN_FS target.
n. Select Mappings > Validate. You should receive the message Mapping SDE_ORA_GLRevenueFact is VALID.
o. Save the repository.
5. Validate the workflow.
a. Select Tools > Workflow Manager.
b. Select Tools > Workflow Designer.
c. If necessary, open the CUSTOM_SDE folder. It may be necessary to disconnect and reconnect to see the workflows.
d. Select CUSTOM_SDE > Workflows.
e. Drag the SDE_ORA_GLRevenueFact workflow into the Workflow Designer.
f. Select Workflows > Validate. You should receive the message Workflow SDE_ORA_GLRevenueFact is VALID.

Practice 14-5: Creating an SIL Mapping to Load the Fact Table


Goals: To create a new SIL mapping to load the fact table

Scenario: In the previous practice, you modified an existing SDE mapping to load the fact staging table. In this practice, you use Informatica tools to modify an existing SIL mapping and validate the workflows used to load the fact table.

Time: 15-20 minutes

Instructions:
1. Copy an existing SIL mapping to the custom folder.
a. Open Informatica Repository Manager.
b. Navigate to SILOS > Mappings.
c. Copy the SIL_GLRevenueFact mapping and paste it into the CUSTOM_SILOS folder.
d. Click Yes to confirm the copy.
e. Expand CUSTOM_SILOS > Mappings and verify that the SIL_GLRevenueFact mapping is copied.
2. Copy an existing SIL workflow to the custom folder.
a. Navigate to SILOS > Workflows.
b. Copy the SIL_GLRevenueFact workflow and paste it into the CUSTOM_SILOS folder.
c. Click Yes to confirm the copy.
d. In the Copy Wizard, select Reuse and Apply this resolution to all conflicts.
e. Click Next.
f. Click Finish.
g. Expand CUSTOM_SILOS > Workflows and verify that the SIL_GLRevenueFact workflow is copied.
3. Import sources.
a. Return to Informatica Designer, which should still be open.
b. To see the changes made in Repository Manager, right-click CUSTOM_SILOS and select Disconnect, and then right-click CUSTOM_SILOS and select Connect.
c. Open the CUSTOM_SILOS folder.
d. Select Tools > Source Analyzer.
e. Select Sources > Import from Database.
f. Select the obaw ODBC data source, enter obaw as the user name, owner name, and password, and click Connect.
g. Expand obaw > Tables.

h. Use Ctrl+click to select the WC_PARTNER_D and W_GL_REVN_FS tables.
i. Click OK.
j. Select CUSTOM_SILOS > Sources > obaw and verify that the WC_PARTNER_D and W_GL_REVN_FS tables appear as sources.
4. Import the target.
a. Select Tools > Target Designer.
b. Select Targets > Import from Database.
c. Select the obaw ODBC data source, enter obaw as the user name, owner name, and password, and click Connect.
d. Expand obaw > Tables.
e. Select W_GL_REVN_F and click OK.
f. Select Apply to all tables, Retain user-defined PK-FK relationships, and Retain user-defined Descriptions.
g. Click Replace.
h. Select CUSTOM_SILOS > Targets and verify that the W_GL_REVN_F table appears as a target.
5. Add the WC_PARTNER_D source to the SIL mapping.
a. Select Tools > Mapping Designer.
b. Select CUSTOM_SILOS > Mappings > SIL_GLRevenueFact.
c. Drag SIL_GLRevenueFact into the Mapping Designer.
d. Select CUSTOM_SILOS > Sources > obaw > WC_PARTNER_D.
e. Drag WC_PARTNER_D into the mapping.
f. Delete the SQ_WC_PARTNER_D Source Qualifier from the mapping.
g. Drag ROW_WID from WC_PARTNER_D onto a blank port in the Sq_W_GL_REVN_FS Source Qualifier.
6. Modify the Source Qualifier.
a. Double-click Sq_W_GL_REVN_FS to open it.
b. Click the Ports tab.
c. Rename ROW_WID to PARTNER_WID.
d. Click the Properties tab.
e. Open the SQL Query.
f. At the end of the SELECT clause, add:
,WC_PARTNER_D.ROW_WID

g. At the end of the FROM clause, add:


,WC_PARTNER_D

h. At the end of the WHERE clause, add:


LEFT OUTER JOIN WC_PARTNER_D ON W_GL_REVN_FS.PARTNER_ID = WC_PARTNER_D.INTEGRATION_ID

i. Click OK.
j. Click Apply and OK.
7. Map PARTNER_WID through the transformations in the mapping.

a. Drag PARTNER_WID from Sq_W_GL_REVN_FS to a blank port in the EXP_Custom Expression transformation.
b. Double-click EXP_Custom to open it.
c. Rename the PARTNER_WID port to IN_PARTNER_WID and set it as an input port only.
d. Add a new port named PARTNER_WID and set it as an output port only.
e. Click the down arrow to open the Expression Editor for the PARTNER_WID column.
f. Delete the existing expression and enter the following expression:
IIF(ISNULL(IN_PARTNER_WID),0,IN_PARTNER_WID)

You do this to manage any nulls, updating them to zero so that they can be loaded into the target table.
g. Click OK.
h. Click Apply and OK.
i. Drag PARTNER_WID from EXP_Custom to the blank port below X_CUSTOM in Upd_W_GL_REVN_F_Ins_Upd.
j. Drag PARTNER_WID from Upd_W_GL_REVN_F_Ins_Upd to PARTNER_WID in the W_GL_REVN_F target.
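The null-to-zero rule that the IIF expression applies can be sketched as a plain Python function (a stand-in for the Informatica expression, not Informatica syntax; None plays the role of NULL):

```python
def resolve_partner_wid(in_partner_wid):
    """Mirror of IIF(ISNULL(IN_PARTNER_WID), 0, IN_PARTNER_WID):
    a NULL foreign key becomes 0 so the fact row can still be loaded."""
    return 0 if in_partner_wid is None else in_partner_wid

print(resolve_partner_wid(None))  # -> 0
print(resolve_partner_wid(42))    # -> 42
```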

8. Validate the mapping.
a. Select Mappings > Validate. You should receive the message "Mapping SIL_GLRevenueFact is VALID."
b. Save the repository.
9. Validate the workflow.
a. Select Tools > Workflow Manager.
b. Select Tools > Workflow Designer.
c. Verify that the CUSTOM_SILOS folder is open.
d. Expand CUSTOM_SILOS > Workflows.
e. Drag the SIL_GLRevenueFact workflow into the Workflow Designer.
f. Select Workflows > Validate. You should receive the message "Workflow SIL_GLRevenueFact is VALID."
g. Save the repository.
h. Save all changes if prompted.
i. Close all open Informatica clients.



Practice 14-6: Adding DAC Tasks and Running Customized ETL


Goals: To configure the DAC to run a custom execution plan

Scenario: You have built the sessions and workflows in Informatica Workflow Manager to run the SDE and SIL mappings. Now you must modify the DAC tasks, add them to a custom subject area and execution plan, and run the ETL to load data into the W_GL_REVN_F fact table in the data warehouse.

Time: 20-30 minutes

Instructions:
1. Create and synchronize tasks in the DAC.
a. If necessary, open the DAC client and log in as dac with password dac.
b. Navigate to Design > Tasks.
c. Verify that the Custom container is selected.
d. Create a new task with the following properties:
Name: Custom Extract for Partner Dimension
Command for Incremental Load: SDE_Custom_PartnerDimension
Command for Full Load: SDE_Custom_PartnerDimension
Folder Name: Custom_Extract
Primary Source: FlatFileConnection
Primary Target: DBConnection_OLAP
Task Phase: Extract Dimension
Execution Type: Informatica
Priority: 5

e. Save the task.
f. Right-click the task and select Synchronize Tasks.
g. Click OK to accept the default, Selected record only.
h. Click Yes when prompted to update source and target tables.
i. When synchronization completes, click OK in the Synchronizing task message box.
j. Click the Target Tables child tab and verify that both Truncate Always and Truncate for Full Load are selected for the WC_PARTNER_DS target table.
k. Click Save.


l. Create another new task with the following properties:


Name: Custom Load into Partner Dimension
Command for Incremental Load: SIL_Custom_PartnerDimension
Command for Full Load: SIL_Custom_PartnerDimension
Folder Name: Custom_Load
Primary Source: DBConnection_OLAP
Primary Target: DBConnection_OLAP
Task Phase: Load Dimension
Execution Type: Informatica
Priority: 5

m. Save the task.
n. Right-click the task and synchronize the tasks for the selected record only.
o. Click OK.
p. Click Yes when prompted to update source and target tables.
q. When synchronization completes, click OK in the Synchronizing task message box.
r. Create another new task with the following properties:
Name: Custom Extract for Revenue Fact
Command for Incremental Load: SDE_ORA_GLRevenueFact
Command for Full Load: SDE_ORA_GLRevenueFact
Folder Name: Custom_Extract
Primary Source: DBConnection_OLTP
Primary Target: DBConnection_OLAP
Task Phase: Extract Fact
Execution Type: Informatica
Priority: 5

s. Save the task.
t. Right-click the task and synchronize the tasks for the selected record only.
u. Click OK.
v. Click Yes when prompted to update source and target tables.
w. When synchronization completes, click OK in the Synchronizing task message box.
x. Click the Target Tables child tab and verify that both Truncate Always and Truncate for Full Load are selected for the W_GL_REVN_FS target table.
y. Click Save.
z. Create another new task with the following properties:


Name: Custom Update for Revenue Fact
Command for Incremental Load: SIL_GLRevenueFact
Command for Full Load: SIL_GLRevenueFact
Folder Name: Custom_Load
Primary Source: DBConnection_OLAP
Primary Target: DBConnection_OLAP
Task Phase: Update Fact
Execution Type: Informatica
Priority: 5

aa. Save the task.
bb. Right-click the task and synchronize the tasks for the selected record only.
cc. Click OK.
dd. Click Yes when prompted to update source and target tables.
ee. When synchronization completes, click OK in the Synchronizing task message box.
2. Create a new subject area.
a. Click the Subject Areas tab.
b. Create and save a new subject area named Custom Revenue Update.
c. Click the Tables child tab and use the Add/Remove button to add W_GL_REVN_F.
d. Click the Tasks child tab.
e. Use the Add/Remove button to add the four custom tasks:
Custom Extract for Partner Dimension
Custom Extract for Revenue Fact
Custom Load into Partner Dimension
Custom Update for Revenue Fact
f. Save the subject area.
3. Create a new execution plan.
a. Click Execute.
b. Create and save a new execution plan named Custom Revenue Update.
c. Click the Subject Areas child tab and add the Custom Revenue Update subject area.
d. Click the Parameters child tab and click Generate to generate parameters for the execution plan.
e. Set the data sources as follows:
DBConnection_OLAP: DataWarehouse
DBConnection_OLTP: ORA_11_5_10
FlatFileConnection: ORA_11_5_10_Flatfile

f. Save the execution plan.
g. Click Build.
h. Accept the default and click OK in the Building window.


i. In the second Building window, again accept the default and click OK. As with assembling a subject area, building the execution plan may take several minutes.
j. Click OK to acknowledge that the execution plan has been successfully built.
k. Click the Ordered Tasks child tab.
l. Verify that the four custom tasks appear in the list along with the QUERY_INDEX_CREATION task.
4. Reset the refresh date for the data warehouse.
a. Select Tools > ETL Management > Reset Data Warehouse.
b. In the Reset Data Warehouse dialog box, confirm that you want to reset by entering the requested text, and click Yes.
c. Click OK to confirm that the reset was successful.
5. Before running the execution plan, query WC_PARTNER_D and verify that there is no data in the table.
a. Open SQL*Plus.
b. Log in as obaw with password obaw.
c. Run the following SQL statement:
select count(*) from WC_PARTNER_D where PARTNER_NAME is not null;

d. Verify that the count returned is 0.
6. Run the execution plan.
a. On the Execution Plans tab, select Custom Revenue Update.
b. Click the Run Now button in the Top Pane toolbar.
c. In the Starting ETL dialog box, click Yes to confirm that you want to start the execution plan.
d. Click OK to acknowledge that the request has been successfully submitted to the Informatica Server.
7. Monitor the ETL plan execution.
a. Select the Current Run tab.
b. Select the Custom Revenue Update run and confirm that it has a Run Status of Running. Note that the DAC Server Monitor icon has changed from yellow to green, indicating that a plan is being executed.
c. Write down the process ID of the current run.
d. Click Refresh to refresh the status.
e. Click Auto Refresh to verify the refresh frequency of 30 seconds.
f. Select the Tasks tab in the bottom pane to view task status within the execution plan.
g. Use the drop-down menu to view different task statuses.
8. View Run History.
a. When the execution plan completes (approximately 5 minutes), select the Run History tab.
b. Select the Custom Revenue Update execution plan that you just ran.



c. Verify that Run Status = Completed, Number of Failed Tasks = 0, and Number of Successful Tasks = 5. If any tasks fail, use the log files to troubleshoot. If you need assistance, ask your instructor.
d. Right-click Custom Revenue Update and select Get Run Information > Get log file.
e. In the Input dialog box, click OK to accept the default log file name.
f. In the Fetching log file dialog box, note the path to which the log file has been saved and click OK.
g. In Windows Explorer, navigate to C:\OracleBI\DAC\ServerLog\ and open Custom_Revenue_Update.#.log. Note that the naming convention of the log files includes the ETL process ID that you recorded in a previous step.
h. Scroll to the bottom of the file, to the list of step statuses, and confirm that no steps are listed under Failed Sessions or Queued Sessions.
9. After running the execution plan, query WC_PARTNER_D and verify that there is now data in the table.
a. Open SQL*Plus.
b. Log in as obaw with password obaw.
c. Run the following SQL statement:
select count(*) from WC_PARTNER_D where PARTNER_NAME is not null;

d. Verify that the count returned is 5.
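The before-and-after checks in steps 5 and 9 follow a common validation pattern: the PARTNER_NAME IS NOT NULL predicate excludes any seeded default row, so the count is 0 before the load and equals the number of partner records afterward. A sketch with SQLite standing in for the obaw schema (the seeded NULL-name row and the partner names are illustrative assumptions):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE WC_PARTNER_D (ROW_WID INTEGER, PARTNER_NAME TEXT)")
# A seeded default row with a NULL name is excluded by the predicate.
cur.execute("INSERT INTO WC_PARTNER_D VALUES (0, NULL)")

check = "SELECT COUNT(*) FROM WC_PARTNER_D WHERE PARTNER_NAME IS NOT NULL"
before = cur.execute(check).fetchone()[0]   # before the load

# Simulate the ETL load of five partner records.
cur.executemany("INSERT INTO WC_PARTNER_D VALUES (?, ?)",
                [(i, f"Partner {i}") for i in range(1, 6)])
after = cur.execute(check).fetchone()[0]    # after the load

print(before, after)  # -> 0 5
```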

