
Centralizing Credit Management
with R12, Customer Data Hub and OBIEE

Wrightson Jackson and Craig Anderson
Hitachi Consulting Corporation

A Knowledge-Driven Consulting White Paper
© 2011 Hitachi Consulting Corporation



Contents

Introduction
Business Objectives
Designing the Solution
Credit Management in R12
Credit and Collections Data Mart in OBIEE
Customer Data Hub
Approach to Integration
Data and Flow
Oracle Advanced Queuing (Streams)
JMS Messages
XML Messages
Conversion using Messaging
Conclusion
Acknowledgements
About Hitachi Consulting Corporation
About Hitachi




Introduction
This case study covers the integration of Oracle R12 eBusiness Suite with a custom Oracle
Business Intelligence Enterprise Edition reporting solution plus an external
implementation of Oracle Customer Data Hub. It highlights what is distinctive about
integrating these three technologies into a robust credit management and analysis
solution involving multiple legacy systems and a high degree of integration complexity.

Each of these three technologies delivered a critical aspect of the overall solution. This
paper emphasizes considerations for both the technical integration of these
technologies and their deployment as individual components to unique user groups.

Business Objectives
As credit was relatively easy to attain in the early 2000s, many companies grew
significant lines of business based on providing low interest loans to their customers
who would then invest in more products and services. As the economy changed
drastically during the recession of 2008, many businesses were faced with difficult
decisions regarding their lending and credit management policies. The project
described in this case study took place at an Oracle EBS client looking to enable a
robust credit management and lending organization to address the challenges of the
changing economy through the strategic deployment of Oracle's enterprise
technologies.

The business case for this project consisted of one primary goal: provide management
with the tools required to most effectively manage overall debt exposure. The client
pursued two core strategies to achieve this goal; the first was a process change to
centralize credit analytics and the second was to provide the critical data points
required for credit decisions in a consistent, reliable manner. Our client operated in a
highly distributed model with locations across North America where policies and
procedures had developed organically within each region, or, in many cases, at each
site. Through centralizing the credit management process, the business planned to
maximize efficiencies and increase risk management capabilities while enabling a
consistent strategy for dealing with credit and collections. To support these processes,
we deployed technology that provides a comprehensive view of the customer, including debt
across all locations, and enables trend analysis so that potential credit issues can be
addressed before they become actual delinquencies.

Designing the Solution
When designing the solution, it was important to remain within the overall enterprise IT
strategy and provide a flexible data model and platform that would be compatible with other
initiatives. The client was an early adopter of R12 Financials and had a long term strategy
to increase their Oracle footprint. Early in the engagement we validated that a combination
of R12 Credit Management, Oracle's Customer Data Hub (CDH), and Oracle Business
Intelligence Enterprise Edition (OBIEE) would provide the integrated solution required to
achieve the goals set forth in the business case. While implementing each of these tools
individually presents challenges, the requirement to deploy a fully integrated solution
created a unique project. The following sections discuss our experiences, findings,
and lessons learned for the deployment of each component technology as well as the
integration of the complete solution.

Credit Management in R12
Oracle's Credit Management module is relatively young and has typically been used for
some Dun & Bradstreet reporting and manual population of the Credit Limit field to cap
the total value of outstanding sales orders. However, all the infrastructure and components
required to support a robust credit management process exist in R12. Our task was to
utilize these components and extend (not customize) them to deliver the functionality
required by a mature lending and underwriting organization.

In order to address the many data requirements unique to an underwriting business, we
utilized the Additional Data Points feature of Credit Management. Additional Data Points
are very similar to Descriptive Flexfields (DFF) in that they allow for custom fields to be
added to a standard Oracle form in a way that is fully supported. However, there are a
couple of key differences. Unlike DFFs, one can create as many Additional Data Points as
are required on the Credit Application; where most DFFs max out at around fifteen custom
fields on any given form, we added over one hundred custom data points to the standard
credit application form. These custom fields can also be configured in a parent-child
structure that allows you to design an intelligent, more user-friendly form. One
limitation of Additional Data Points that we have yet to solve is that there is no way to
configure the order in which they appear on the application.

Another critical extension within Credit Management was to modify the approval workflow.
The standard workflow is divided into two distinct processes. The first process is the
standard application workflow which routes the application from initiation to data gathering
and analysis before the application is finalized and submitted for approval. Upon
submission for approval the second workflow process begins and the application is routed
through a defined approval hierarchy based on recommended credit limit. This design is a
good fit for many credit departments that operate with a standard structure involving credit
analysts and credit managers. A larger and more complex credit department will require
that applications are updated and analyzed by multiple specialized groups. This structure
will require modifications through the Approvals Management Engine (AME).

One extension the client required was an update to the second workflow to give approvers
more flexibility. By default, once the application is submitted for approval, it is frozen and
can no longer be updated; an approver's only options are to approve or reject the
application. Because the application is frozen, a rejection requires complete rework of
the application from square one, regardless of the reason for rejection. We extended the
approval workflow to include a Request More Information option whereby an approver could
ask for specific data points to be updated, returning the application to an open status
in the first workflow.

Reporting out of Credit Management remains a custom build for each implementation as
there are no standard reports available. Two reports that were deemed as business critical
were the Pending Application report and the Standard Contract. The Pending Application
report served as a communication tool that customer service representatives used to
keep customers apprised of their application's status. The Standard Contract was a
template that leveraged the information stored in parent-child Additional Data Points to
create formal contracts that could be provided to the customer upon approval of an
application.

Credit and Collections Data Mart in OBIEE
One aspect of this engagement that was critical to the success of the project was providing
the decision makers in the credit department with the information they needed to make
quick and effective decisions. OBIEE provided an ideal framework to sustain the reporting
solution: a custom-built, best-practice layered architecture comprising a staging layer, a
Third-Normal-Form (3NF) relational Data Warehouse layer, and a dimensional Data Mart. The
OBIEE Data Mart is easily accessible by any Oracle EBS user and could provide the
comprehensive view of the customer that the client required.

Multiple systems would feed data into the Data Mart, including the client's global CRM system,
legacy operations systems, Oracle Financials and CDH. The intent was to collect all
relevant transactional data associated with a customer (or potential customer) to be
provided as a point-in-time view, as well as historical views for trend analysis. The
attributes collected, calculated, and stored for each customer belonged to one of six areas
of analysis: Application, Decision, Payment, Accounts Receivable, Write-Off, and
Recovery. These were captured at the customer account level but could be aggregated to
customer party or internal organizational views.

The unique requirements of this project led to advanced RPD development techniques.
One technique was to use session variables populated in a prompt to limit the data on a
logical table source. Another was to populate a single report from different logical table
sources built on the same physical fact, which allowed each section within the report to
span a different time frame.


Customer Data Hub
Two major deficiencies of the legacy system architecture that impacted not only the credit
department, but also the rest of the enterprise, were the existence of multiple views of the
same customer and the inconsistency of data. Our approach was to use this project as an
opportunity to deploy best practices in Master Data Management (MDM) enabled through
Oracle's R12 Customer Data Hub. We designed CDH as a true MDM hub integrated with
established source systems to become the system of record for master customer data.

Our benchmarks for master data activity on customer records indicated that we could
expect ten thousand to fifty thousand updates each day. The ideal architecture in this
circumstance was to deploy CDH in a stand-alone environment on its own hardware. The
other option was to utilize the CDH functionality as it existed in their current eBusiness
Suite environment. However, this would encumber the financial system with processing of
each update to the customer record. By deploying the CDH as a true central MDM hub, it
could collect and merge the updates from multiple sources and send final consolidated
updates to the transaction systems in batches.

The feature of CDH that makes this solution possible is Single Source of Truth (SST). SST
allows you to determine survivorship of customer attributes as records from multiple
systems are merged into one. A challenge we faced with SST was that its scope is limited
to certain attributes and decision criteria that prevented the solution from addressing
several critical requirements. The final design included utilizing the User Hook option on
the customer merge process as well as some custom business event subscriptions to
enhance the native SST functionality.

































6
Approach to Integration
The volume and nature of the customer data involved necessitated that all three
technologies be deployed in one production release. We mitigated the inherent risks of this
approach through sophisticated messaging tools and a consistent data exchange protocol
that could be utilized during data conversion and cutover in addition to steady state
interfaces. The essential components of this integration strategy are described below.

[Figure: integration architecture. The CRM and Operations systems, CDH, EBS, and the
Credit & Collections Data Mart (CCDM) exchange XML messages through the TIBCO hub.]

Data and Flow
As depicted in the diagram above, one aspect of this project is the enterprise-wide policy to
utilize XML messaging for all interfaces and to centralize the transport in a spoke-and-hub
communications architecture using TIBCO as the hub.

On a nightly basis, customer information (new records and changes) from the CRM and
Operations systems is published as XML messages. TIBCO routes each message to CDH, where
the Data Quality Module assesses whether it is a potential duplicate of an existing dealer
based on the primary keys in the XML message. The party merge runs daily to merge the
CRM and Operations parties based on the Registration ID (the unique primary key from the
third-party system). CDH publishes the dealer message for the surviving party to TIBCO,
which routes it to both EBS and the Credit & Collections Data Mart (CCDM) built on
OBIEE technology.

All systems process the inbound message as an UPSERT. Each message contains a
section of unique keys identifying the entity on each system. When a message is
processed by any system, it first checks its key repository to determine whether the entity
already exists; if it does, the data is treated as an update, and if not, as an
insert, hence the term UPSERT.
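The key-repository check can be sketched in PL/SQL. This is an illustrative sketch only: the table names (xx_key_xref, xx_customers_stg) and columns are hypothetical stand-ins, not the project's actual interface schema.

```sql
-- Sketch of the UPSERT pattern: the inbound message's cross-reference
-- key decides between update and insert.  All object names hypothetical.
DECLARE
  l_exists NUMBER;
BEGIN
  -- Does this source system's key already exist in the key repository?
  SELECT COUNT(*)
    INTO l_exists
    FROM xx_key_xref
   WHERE source_system = :source_system
     AND source_key    = :source_key;

  IF l_exists > 0 THEN
    -- Known entity: treat the message data as an update.
    UPDATE xx_customers_stg
       SET customer_name    = :customer_name,
           last_update_date = SYSDATE
     WHERE source_key = :source_key;
  ELSE
    -- Unknown entity: insert it and record its key for future messages.
    INSERT INTO xx_customers_stg (source_key, customer_name, creation_date)
    VALUES (:source_key, :customer_name, SYSDATE);

    INSERT INTO xx_key_xref (source_system, source_key)
    VALUES (:source_system, :source_key);
  END IF;
END;
/
```

In practice the same logic can also be expressed as a single MERGE statement; the explicit check is shown here because it mirrors the key-repository lookup described above.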

Applications for new lines of credit or credit line increases are collected at the Stores and
sent via an imaging system to the credit department, where they are entered in Oracle Credit
Management (OCM). For new customers, an application may arrive in EBS
before the customer information propagates through the messaging system, requiring the
analyst to search for the customer and create a new record if none is found. The creation of the
account (no message is triggered on creation of a party in EBS) triggers the publish
process from EBS to CDH and CCDM. If the credit application is approved, the new
account information is published to the Operations system.


Oracle Advanced Queuing (Streams)
Oracle's Advanced Queuing product, Oracle AQ, is one component within the broader
Streams product. The full Streams product is designed to facilitate replication of
database data across multiple instances. In this context, messaging is simply one step
in the middle of a logical sequence: capturing database changes, staging the change
records in an advanced queue, potentially propagating the queued changes to other
queues, applying the change records to one or more target databases, and
resolving conflicts.

AQ itself is available by default in every Enterprise Edition database instance and is the
foundation for EBS Workflow. It is implemented as a set of tables with several
supporting views and processes. Each queue is a table that holds the messages. All
queue tables have the same structure, and each message is actually kept in one Large
Object (LOB) field. You may define any structure you wish as the payload to
be stored in that column. A queue is defined, started, stopped, written to, and read
from using methods in the DBMS_AQ and DBMS_AQADM packages. These packages
are described in the Oracle Streams Advanced Queuing User's Guide, which is
available online on the High Availability page.

Defining a new queue is a three-step process: create the queue table (where the
messages are actually stored), create the queue (the views and processes that
implement the queue logic), and start the queue (start the processes). All the
features one expects in a message queue, such as message sequencing (FIFO, LIFO,
etc.), publish/subscribe, multiple consumers, robust store-and-forward capabilities, and
guaranteed delivery, are provided by processes implemented on the database
server as separate threads dedicated to queuing (visible in the OS as qmn001,
qmn002, etc.). For this reason, it is not possible to perform operations (insert,
update, delete, etc.) on the queue tables directly; doing so is very likely to destroy the
integrity of the queue beyond recovery.
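The three steps above map directly onto DBMS_AQADM calls. A minimal sketch follows, assuming a user with AQ administrator privileges; the queue and queue-table names (xxcust_out_q, xxcust_out_qt) are hypothetical.

```sql
-- Step 1: create the queue table (where messages are stored).
-- The JMS text payload type matches the JMS envelope this project used.
BEGIN
  DBMS_AQADM.CREATE_QUEUE_TABLE(
    queue_table        => 'xxcust_out_qt',
    queue_payload_type => 'SYS.AQ$_JMS_TEXT_MESSAGE',
    multiple_consumers => TRUE);   -- enable publish/subscribe

  -- Step 2: create the queue (the logic layered on the table).
  DBMS_AQADM.CREATE_QUEUE(
    queue_name  => 'xxcust_out_q',
    queue_table => 'xxcust_out_qt');

  -- Step 3: start the queue so it can be enqueued to and dequeued from.
  DBMS_AQADM.START_QUEUE(queue_name => 'xxcust_out_q');
END;
/
```

Setting multiple_consumers at creation time is the design choice that allows several downstream systems to subscribe to the same outbound message.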


JMS Messages
There are two basic components to the message being queued: the data (payload) and
the envelope in which it is delivered (JMS, in our case). The payload can be any data
in any format, as long as it fits in a LOB and is in a format both the sender and
recipients agree upon. In our case, the payload is always an XML document, one of
many XML messages conforming to a set of XSDs, each representing a particular
kind of data.

The payload is wrapped in a logical envelope defined by the JMS standard, which
determines the protocol and supporting structures used as the message envelope. The
JMS protocol consists of a header structure and a means of adding properties. The
header typically contains standard values: fields for destination, priority, timestamp,
unique ID, etc. The properties are name-value pairs; some are pre-defined by the JMS
standard, some are pre-defined by Oracle, and some may be defined by the developer.
In our case, we used two custom properties: one holding the name of the message,
used by the TIBCO routing process to efficiently determine where to send the message,
and a second holding a version number. The version is intended to make it easier to
transition gracefully to future changes in the message structure. Before calling the
enqueue method of DBMS_AQ, we fill in the various parameters, header values, and
properties using other methods of the package.
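An enqueue with the two custom properties can be sketched as follows. The property names (msg_name, msg_version), their values, and the queue name are illustrative assumptions, not the project's actual identifiers.

```sql
-- Sketch: build a JMS text message, set the two custom routing and
-- versioning properties, and enqueue it.  Names are hypothetical.
DECLARE
  l_msg       SYS.AQ$_JMS_TEXT_MESSAGE;
  l_enq_opts  DBMS_AQ.ENQUEUE_OPTIONS_T;
  l_msg_props DBMS_AQ.MESSAGE_PROPERTIES_T;
  l_msg_id    RAW(16);
BEGIN
  l_msg := SYS.AQ$_JMS_TEXT_MESSAGE.CONSTRUCT;
  l_msg.SET_TEXT('<Customer>...</Customer>');   -- the XML payload

  -- Custom properties: message name for TIBCO routing, plus a version
  -- so consumers can handle future changes in the message structure.
  l_msg.SET_STRING_PROPERTY('msg_name',    'CUSTOMER_UPSERT');
  l_msg.SET_STRING_PROPERTY('msg_version', '1.0');

  DBMS_AQ.ENQUEUE(
    queue_name         => 'xxcust_out_q',
    enqueue_options    => l_enq_opts,
    message_properties => l_msg_props,
    payload            => l_msg,
    msgid              => l_msg_id);
  COMMIT;   -- the message is not visible to consumers until commit
END;
/
```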


XML Messages
The development standards we defined with long-term goals in mind required that
each message conform to an XSD defined by the enterprise's canonical model. The
messages are never fragments and always hold the entire dataset for one of the top-
level objects: Customer (Organization), Person, Account, or Relationship. XSDs are
hierarchical lists of elements and, in our model, went eight levels deep for some of the
objects. One of the challenges of the solution was the creation of the XML document
from code. Oracle provides several tools for generating XML, such as:
- SQL functions (e.g., XMLROOT, XMLFOREST)
- DBMS_XMLGEN
- SYS_XMLGEN and SYS_XMLAGG
- XSQL Pages Publishing Framework
- XML SQL Utility (XSU)
- XML DB
A very simple, quick method to create XML text from code is to create an XML
document with all the expected elements of the XSD. Often, an XML tool like
JDeveloper can make this job easy, since it can create an XML document using the
XSD as a template and it can create the XSD from an XML document. We were able to
copy and paste the XML document into our code as literal text, then replace the actual
data with variable names surrounded by concatenation operators. This was quick and
eliminated the risk of many simple mistakes, but only works well for simple documents.
If the XML document includes repeating or optional elements, the code can become
difficult to read and maintain.
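The contrast between the literal-text approach and the SQL/XML functions can be sketched as follows; the element and variable names are illustrative, not taken from the project's canonical XSD.

```sql
-- Two ways to produce the same fragment.  All names are hypothetical.
DECLARE
  l_name  VARCHAR2(240) := 'Acme Corp';
  l_state VARCHAR2(2)   := 'TX';
  l_xml   VARCHAR2(4000);
BEGIN
  -- 1. Literal text with concatenated variables (the quick method).
  --    Note: this does not escape characters such as & or < in the
  --    data, one reason it only suits simple documents.
  l_xml := '<Customer>'
        || '<Name>'  || l_name  || '</Name>'
        || '<State>' || l_state || '</State>'
        || '</Customer>';

  -- 2. SQL/XML functions, which handle escaping and nesting for you.
  SELECT XMLELEMENT("Customer",
           XMLFOREST(l_name AS "Name", l_state AS "State")).getClobVal()
    INTO l_xml
    FROM dual;
END;
/
```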

The next challenge was retrieving the data to write to the XML document. It is good
design to separate the function of retrieving the data from the function of formatting
the output. This design required a means of communicating a fairly large amount of
data (typically several dozen element names and values) efficiently. Ideally, we wanted
to reflect the hierarchical data structure that implements the XSD; this could easily be
done as an object and was a good application for Object Tables.

In this case, we built object tables for each of the four XSD nodes that corresponded to
messages: Customer (Organization), Person, Account, and Relationship. The tables
are populated either by processing an inbound message from another system or by
events triggered when TCA data is entered in the front end. The data is then read by a
process that formats the data into an XML document and queues a message for
transmission to other systems.
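A hierarchical object model of this kind can be sketched as below. The type and table names are hypothetical two-level examples; the project's actual types mirrored the canonical XSD several levels deep.

```sql
-- Child node type and its collection (one level of the hierarchy).
CREATE TYPE xx_account_t AS OBJECT (
  account_number VARCHAR2(30),
  status         VARCHAR2(10),
  credit_limit   NUMBER
);
/
CREATE TYPE xx_account_tab_t AS TABLE OF xx_account_t;
/
-- Parent node type embedding the child collection, as the XSD nests it.
CREATE TYPE xx_customer_t AS OBJECT (
  registration_id VARCHAR2(30),     -- cross-system primary key
  customer_name   VARCHAR2(240),
  accounts        xx_account_tab_t  -- nested child elements
);
/
-- An object table holds one row per message's worth of data; the
-- formatting process reads it and emits the XML document.
CREATE TABLE xx_customer_obj OF xx_customer_t
  NESTED TABLE accounts STORE AS xx_customer_accounts_nt;
```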


Conversion using Messaging
Prior to go-live, the messaging system was used to load the customer data from the
source system. It was viewed as a way to avoid developing throw-away conversion
code, as well as an opportunity to thoroughly validate the interface logic to be used
once the system was live. However, when the data extraction queries were run against
the actual production databases, we received nearly three million records,
approximately four times as many as anticipated.

This amount of data would normally be processed in a day or two using traditional
conversion techniques. However, it is far less efficient to convert data to a text-based
message, parse the text data, and transform from the XSD format to data needed by
the TCA layer of Oracle Applications. These are all added steps incurred by the use of
messaging, on top of the TCA calls to actually validate and load the data. This
conversion took seven days of continuous processing as the data was sent to the CDH
tier, de-duplicated, then propagated via messaging to EBS.

During this process, we identified an important procedure for running high volumes of
data through the CDH Automerge process. The step within Automerge that prepares
the individual batches for the merge of records for a single party operates in a loop
without any commit, because it sets a SAVEPOINT so that it can roll back to that point
if necessary. As a result, very large batches consume rollback and temporary
segments, potentially exhausting them. Our test, which should have created 160,000
batches, each merging two or more parties, ran for more than thirty hours without
completing before we elected to terminate it. Our solution was to define multiple,
smaller batches based upon the first letter of the customer name.
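The batch-splitting idea can be sketched as a simple seeding query. This is a hypothetical illustration: xx_merge_batches and the batch-naming convention are invented for the example, and only hz_parties is a real TCA table.

```sql
-- Sketch: carve merge candidates into per-letter batches so each
-- Automerge run stays small.  xx_merge_batches is hypothetical.
INSERT INTO xx_merge_batches (batch_name, party_id)
SELECT 'AUTOMERGE_' || UPPER(SUBSTR(p.party_name, 1, 1)),  -- first letter
       p.party_id
  FROM hz_parties p
 WHERE p.status = 'A';                                     -- active parties
-- Each distinct batch_name is then submitted to Automerge separately,
-- keeping rollback and temporary segment consumption bounded.
```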


Conclusion
In addition to the findings highlighted above, the Conference Room Pilot (CRP)
approach to testing proved critical to the quality assurance process on this
engagement. Bringing key stakeholders and user groups for all three technologies
together to execute focused testing activities helped ensure that the overall strategic
integrity of the project was maintained across the entire solution. Integrating these three
unique but complementary technologies to achieve a strategic goal made for an extremely
challenging project that was equally rewarding.


Acknowledgements
We would like to thank Anita Chow, Brian Hill, Sergei Kiriukhin, Glynn Malagarie, and
Suja Subramanian for their contributions to this project and case study. Without their
support and tremendous effort, neither could have been accomplished.

About Hitachi Consulting Corporation
As Hitachi, Ltd.'s (NYSE: HIT) global consulting company, with operations in the United
States, Europe and Asia, Hitachi Consulting is a recognized leader in delivering proven
business and IT strategies and solutions to Global 2000 companies across many
industries. With a balanced view of strategy, people, process and technology, we work
with companies to understand their unique business needs, and to develop and
implement practical business strategies and technology solutions. From business
strategy development through application deployment, our consultants are committed to
helping clients quickly realize measurable business value and achieve sustainable ROI.

Hitachi Consulting's client base includes 25 percent of the Global 100 as well as many
leading mid-market companies. We offer a client-focused, collaborative approach and
transfer knowledge throughout each engagement.

For more information, call 1.877.664.0010 or visit www.hitachiconsulting.com.

About Hitachi
Hitachi, Ltd., (NYSE: HIT / TSE: 6501), headquartered in Tokyo, Japan, is a leading
global electronics company with approximately 390,000 employees worldwide. Fiscal
2007 (ended March 31, 2008) consolidated revenues totaled 11,226 billion yen ($112.2
billion). The company offers a wide range of systems, products and services in market
sectors including information systems, electronic devices, power and industrial systems,
consumer products, materials, logistics and financial services. For more information on
Hitachi, please visit the company's website at http://www.hitachi.com.



© 2011 Hitachi Consulting Corporation. All rights reserved. Building the Market Responsive Company is a registered
service mark of Hitachi Consulting Corporation.
