WHITE PAPER
By Mike Ferguson
Intelligent Business Strategies
November 2011
Prepared for:
Table of Contents
Introduction ........................................................................................................... 3
What is Active Intelligence? .................................................................................. 4
On-demand and Event-driven Analytics - Why every business needs them ......... 6
On-demand Active Intelligence .................................................................. 6
Event-driven active intelligence .................................................................. 7
Near Real-time Data .................................................................................. 8
Automated Analysis ................................................................................... 8
Automated Actions ..................................................................................... 9
Why Intelligence Must Go Enterprise-Wide to Maximise Business Value ........... 10
Key Questions When Implementing Active Intelligence ........................... 11
What's Possible With Active Intelligence? .......................................................... 12
Using Active Intelligence To Optimize A Supply Chain ............................ 12
Using Active Intelligence To Improve Procurement .................................. 13
Using Active Intelligence To Improve Risk Management ......................... 14
Summary of What's Possible ................................................................... 14
Architectural Change: A DW Journey to the Centre of the Enterprise................. 15
Requirements: What to Look For in an Active Intelligence Solution .................... 17
Product Example: Teradata Active Enterprise Intelligence ............................... 20
End-to-End Technologies In A Teradata Active Enterprise Intelligence Environment ..... 21
Teradata Database In An Active Intelligence Environment ...................... 22
Conclusion .......................................................................................................... 25
INTRODUCTION
BI systems are used
mainly at tactical and
strategic levels today
In many organisations today, business intelligence (BI) systems are now well
established, supporting decision making in many business areas. Most of these BI
systems are typically used by business analysts, line managers and executives to
support decision making at tactical and strategic levels with finance, sales and
marketing often dominating BI usage. Yet, despite the maturity in the BI market, the
demand for intelligence has never been so strong. This is still a vibrant market with
technologies such as data warehouse appliances, big data visualization, Hadoop
and new analytical algorithms now making it possible to undertake more complex
analyses on much larger volumes of detailed data to answer questions that could
never be answered before.
However, while this all continues to offer business value, it still keeps BI in the
hands of the few when many executives today would much prefer it to be in the
hands of the many. More specifically, they would like to use BI to empower the
people in their business operations and not just in back office analysis and
managerial roles. For example, what if all customer-facing staff had on-demand
access to intelligence about each specific customer as they dealt with that
customer? What if they were guided on what actions to take to boost customer
profitability, deliver better customer service, be more personal and avoid risk? But
that is just the front-office. What about other operational areas? Why can't they
also run smart? For example, what if people in retail distribution centres could
see real-time intelligence on inbound deliveries as well as actual sales and
inventory in each store on a continuous basis? What if they could see trends
emerging and got alerts on predicted stock-outs based on actual sales so they
could match supply with demand every time. There is also a need to automate
analysis to automatically see opportunities and problems and to guide the business
as it operates. For example, continuously monitoring spend activity across a
department could predict problems ahead of time so that spending budgets are not
exceeded.
Thousands of small
operational decisions
are still made without
any kind of guidance
The point here is that there are literally thousands of small decisions that are made
every day in operations, and an organisation should not be entirely reliant on
business analysts to see everything. What many chief operating officers are now
asking is: Why can't people and applications involved in those decisions leverage
intelligence to help them act in a more timely and effective way so that all the small
decisions taken add up to making a major contribution to overall business
performance?
BI systems need to
become active in
business operations to
help people and
systems become more
effective
These requirements mean moving BI systems beyond just having a passive role in
supporting tactical and strategic decision making to having an always-on active
role in operational decisions as well. The ultimate objective is to get to the point
where BI systems are continuously monitoring, managing and driving all business
operations on a 24x365 basis. Achieving smart operations requires organisations
to integrate BI into their core operational business processes so that front line
employees are constantly alerted and guided to act in a more timely and effective
way than they do today.
This paper looks at this transition and asks "What is Active Intelligence?" It also
looks at how BI systems have to change to help people and applications become
more effective in the tasks they perform and more responsive to business events
as and when they happen. It then looks at some business examples of what is
possible, the requirements that need to be met by active intelligence solutions and
how one vendor, Teradata, steps up to meeting those requirements.
WHAT IS ACTIVE INTELLIGENCE?

Using business intelligence and analytics to guide people and applications so that
they continuously know the best action to take and when to take it in every
business process activity. It is about dynamically using BI to keep a business
running optimally while remaining compliant, minimising risk and maximizing
profitability.
Starting down the road to implementing active intelligence signals a fundamental
change in the way you intend to use BI systems. The intention of Active
Intelligence (AI) is to make it possible for everyone to work smarter with far more
people in the organisation being guided by insights to help them contribute to
bottom line performance. In addition, the insights provided also need to guide
people to take actions that all contribute towards achieving targets and objectives
set out in a common business strategy. This can be achieved by improving
precision of BI delivery so that people get role-based, relevant intelligence in the
context of every task they perform, as and when each task is performed. It must
also cater for people who are mobile whether they are employees, suppliers,
partners or customers. This kind of capability opens up BI to a much larger number
of concurrent users.
But it's not just BI for humans. Many self-service systems like web sites,
interactive voice response units at contact centres, bank ATMs, travel kiosks, and
even self-checkouts at stores can be made smarter with active intelligence. In this
case it is applications that need to be guided by insights.
Figure 1 shows some of the key differences between passive and active
intelligence. It shows that active intelligence encompasses passive back-office use
but adds additional capability to empower people and systems in business
operations.
Active intelligence introduces near real-time data, on-demand BI, automated analysis, recommendations, alerts and automated actions
Figure 1: Passive vs. active intelligence. Passive intelligence is used by business analysts, managers and executives against historical data; active intelligence is used by business analysts, managers and executives AND front-line operations staff, partners and suppliers.
Active intelligence can
drive actions at all
levels of the business
to co-ordinate
execution of a
common business
strategy
Looking at Figure 1 it is clear that if BI systems are to become active they need to
take on new characteristics over and above what you would typically find in a
traditional passive BI set-up. Two key active characteristics that stand out are the
support for on-demand and event-driven use of BI. In addition, active intelligence
systems also introduce the use of near real-time data, automated analysis and
automated actions rather than relying entirely on the need for human analysis
before making decisions. Let's explore each of these active characteristics in more
detail to understand why every business needs them to provide the majority of their
employees who work in operational areas with the insights they need.
On-Demand Active Intelligence

One of the road-blocks to the use of BI in business operations in the past has been
the inability to make use of BI because many people in front line business
operations are in job functions where there is no linkage to BI. A good example
here would be a contact centre operator, a bank teller or a point of sale operator in
a retail store. In many cases these front-line workers are constantly tied to specific
operational applications that they use to do their jobs.
This problem goes beyond people. It also extends to front-line applications in
business operations where there are no employees involved. Examples here
include e-commerce applications, airline kiosks, and other self-service applications
that could be customer- or supplier-facing. Many of these applications are also
accessible via mobile devices. In this case these applications allow customers,
partners or suppliers to interact and transact business as part of self-service
operations.
Any organisation looking to introduce smart operations should not take the "no
access to BI tools" problem to mean that they can't leverage intelligence or
analytics to guide people and applications in front-line operations. It simply means
that there needs to be another way to do this.
On-Demand Intelligence
Introducing BI services opens up the way to make BI available to applications on an on-demand basis

Modern BI platforms support BI services out-of-the-box
That way is to design and deploy BI services so that operational applications in use
by front-line workers or by customers (as self-service applications) can request the
appropriate BI on-demand. So for example, a contact centre agent entering a
customer name into a customer service application gets back not just account
information, but also other valuable BI insights and context like lifetime value of
the customer, recent purchases, and any recent service interruptions. Equally a
self-service on-line insurance quote application could request customer and risk
intelligence on-demand so that the pricing engine can leverage specific intelligence
about a customer (or similar customers) or claims to more accurately calculate a
price before displaying an on-line premium quote. Fortunately today, modern BI
platforms make on-demand access possible by supporting BI services out-of-the-box. This allows reports, queries, and analyses to be published as web services for
subsequent on-demand invocation (in an industry standard way) from any
operational application, processes or portal.
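As an illustration, the handler behind one such on-demand BI service might look like the following sketch. Everything here is hypothetical (function names, fields and the in-memory stand-in for the warehouse query layer); a real deployment would publish the handler as a web service on the BI platform:

```python
# Sketch of an on-demand BI service handler (hypothetical names and data).

def customer_intelligence(customer_id, warehouse):
    """Return BI context for one customer, suitable for embedding in an
    operational application such as a contact-centre screen."""
    profile = warehouse[customer_id]
    return {
        "customer_id": customer_id,
        "lifetime_value": profile["lifetime_value"],
        "recent_purchases": profile["purchases"][-3:],   # last three orders
        "open_service_issues": profile["service_issues"],
    }

# In-memory stand-in for the data warehouse query layer.
warehouse = {
    "C1001": {
        "lifetime_value": 12500.0,
        "purchases": ["P-17", "P-33", "P-41", "P-58"],
        "service_issues": ["late delivery 2011-10-02"],
    }
}

print(customer_intelligence("C1001", warehouse))
```

The operational application only sees a simple request/response interface; all the analytical work stays behind the service boundary.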
On-Demand Recommendations
Another form of active intelligence is the on-demand recommendation. This is an
online request for an automated decision to guide someone in operations. It is
different from on-demand intelligence because it requires automated analysis of
specific data and an automated decision based on the intelligence produced by
that analysis. Therefore, services need to exist that will analyse specific data and
use rules to decide what to recommend. A good example here is an on-line retail e-commerce application where a customer's data is analysed to produce a cross-sell
or up-sell recommendation while the customer is online. Another example is an
accept/decline recommendation (based on a customer risk score) to a customer
advisor dealing with a loan application in a branch of a retail bank. On-demand
recommendations guide people and keep decisions within tolerance limits.
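The accept/decline example can be sketched as automated analysis (a scoring step) feeding a rules-based decision. The scoring logic and thresholds below are invented for illustration; a real service would apply a proper statistical or predictive model:

```python
def risk_score(applicant):
    """Toy stand-in for automated analysis; a real system would score the
    applicant with a statistical or predictive model over detailed data."""
    ratio = applicant["loan_amount"] / applicant["annual_income"]
    score = 100 * ratio
    if applicant["missed_payments"] > 0:
        score += 25 * applicant["missed_payments"]
    return score

def recommend(applicant, accept_below=60):
    """Rules-based automated decision built on the analysis above."""
    score = risk_score(applicant)
    return ("accept" if score < accept_below else "decline", score)

print(recommend({"loan_amount": 10000, "annual_income": 40000,
                 "missed_payments": 0}))
# low risk score -> accept
```

Keeping the rule (the threshold) separate from the analysis (the score) is what lets the tolerance limits be tuned without touching the model.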
Event-Driven Active Intelligence

People cannot be
expected to spot every
problem
Event-driven active
intelligence is about
automatically detecting,
analysing and if
necessary acting on
events to keep the
business optimised
The issue with event processing is that in many cases it cannot be done manually.
This is especially true if action is required immediately, if the volume of events is
very large (e.g., financial markets) or if event correlations are very complex to
identify. Also, certain conditions may need to be true for a specific combination of
events to be deemed important. For example, if six different events all occur within
a certain timeframe (e.g., the last 20 minutes) then action is needed but otherwise
it is not.
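The "six events within 20 minutes" condition can be sketched as a sliding-window correlation. This is an illustrative in-process sketch, not a complex event processing engine:

```python
from collections import deque

class EventCorrelator:
    """Flag when `threshold` matching events arrive within `window_seconds`
    of each other (sliding time window)."""
    def __init__(self, threshold=6, window_seconds=20 * 60):
        self.threshold = threshold
        self.window = window_seconds
        self.times = deque()

    def observe(self, timestamp):
        self.times.append(timestamp)
        # Drop events that have fallen out of the time window.
        while self.times and timestamp - self.times[0] > self.window:
            self.times.popleft()
        return len(self.times) >= self.threshold  # True => action needed

corr = EventCorrelator()
# Six events arriving over 15 minutes (timestamps in seconds).
alerts = [corr.observe(t) for t in [0, 120, 300, 500, 700, 900]]
print(alerts[-1])  # sixth event inside the window -> True
```

The same pattern scales down to "otherwise it is not": if the sixth event arrives after the window has expired, older events have been evicted and no action fires.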
In some industries, the volumes of events can be significant. For example, e-business web logs on very heavily used web sites can hold millions of mouse clicks
as they record behaviour of every user on every page on the site. That can amount
to terabytes of data per day. Sensor data is another example of high volume event
data. Even though the use of this technology is still in its infancy, sensor networks
are increasingly being used to instrument business operations so that
organisations can see what is happening in specific parts of their business where
they had no insight before, e.g., in a supply chain. This allows them to improve
these operational areas and respond if problems are detected in the process.
Today there are sensors in mobile phones, on manufacturing production lines, in oil
pipelines, in buildings, on utility grids, in cars, on white goods, and on products to
track their movement. As instrumentation is deployed in more areas of operation,
the volume of event data being emitted by sensors (e.g., RFIDs) continues to grow
into hundreds of terabytes or even petabytes in some cases. Events involving
unstructured or semi-structured data are also starting to be monitored, such as
tweets on Twitter. Of course not all sensor data needs to necessarily be stored.
Only if a pattern deviates from the norm might the data need to be persisted.
Nevertheless, the volumes of data can be considerable.
Active intelligence
systems can leverage
technologies such as
Hadoop when dealing
with multi-structured big
data sources
These new big data sources are opening up new challenges especially around
semi-structured and unstructured data types where more complex analytical
constructs have emerged to walk web logs, analyse social graphs, etc. Given the
volumes of multi-structured data, there is a need to run analytics to pick out
patterns in parallel. This problem is now being addressed by technologies such as
Hadoop. Hadoop can leverage thousands of servers to store big data volumes
which can then be analysed in batch using Hadoop Map/Reduce programs. In
some products, it is also possible to store multi-structured data types and invoke
Map/Reduce analytical functions as user-defined functions via SQL. This allows BI
tools and SQL developers to exploit the power of thousands of servers to analyse
and report on big data. Massively parallel RDBMSs and Hadoop can both be part
of an active intelligence system.
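The MapReduce pattern Hadoop runs across thousands of servers can be shown in miniature: a map phase emits key/value pairs, a shuffle groups them by key, and a reduce phase aggregates each group. The web-log format below is invented for illustration:

```python
from collections import defaultdict
from itertools import chain

# Minimal in-process illustration of the MapReduce pattern
# (hypothetical log format: user,page,timestamp).

def map_phase(log_line):
    user, page, _ts = log_line.split(",")
    yield (page, 1)                     # emit one count per click

def reduce_phase(key, values):
    return (key, sum(values))           # total clicks per page

log = ["u1,/home,100", "u2,/home,101", "u1,/cart,102"]

# Shuffle: group mapped pairs by key, as the framework would across nodes.
groups = defaultdict(list)
for key, value in chain.from_iterable(map_phase(line) for line in log):
    groups[key].append(value)

result = dict(reduce_phase(k, v) for k, v in groups.items())
print(result)  # {'/home': 2, '/cart': 1}
```

On Hadoop the map and reduce steps run in parallel over partitions of terabytes of log data; the program structure stays essentially this simple.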
Near Real-time Data

A key difference for active intelligence systems is the ability to capture and react to
operational data in near real-time. Near real-time data is needed so that
organisations can act much more quickly when problems and opportunities occur.
Near real-time data can be pushed to an active BI system or pulled. Information is
pushed when an application puts the required data in a message on an enterprise
service bus as soon as a transaction occurs. The ESB then routes the data to the
active BI system as opposed to using traditional batch ETL. This is particularly
important for event-driven analysis. Listeners can pick up these messages and
load the data into a DBMS for analysis. It is common to see event listeners in
complex event processing technology where automated analysis and automated
actions on that event data can occur. Alternatively an event message on an ESB
can trigger event-driven data integration to pull data from one or more sources
every time an event occurs. Pulling data in near real-time can also be achieved via
micro-batch extract which could be scheduled to happen at frequent intervals.
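A micro-batch pull can be sketched as a scheduled job that drains whatever has accumulated on a queue into the target table. The queue here is a stand-in for an ESB or JMS destination, and the names are illustrative:

```python
import queue

def micro_batch_load(message_queue, table, batch_size=100):
    """Drain up to batch_size waiting messages into the target table;
    run at frequent intervals, this approximates a trickle feed."""
    batch = []
    while len(batch) < batch_size:
        try:
            batch.append(message_queue.get_nowait())
        except queue.Empty:
            break                       # nothing left this cycle
    table.extend(batch)
    return len(batch)

bus = queue.Queue()                     # stand-in for an ESB / JMS queue
for txn in [{"id": 1, "amount": 9.99}, {"id": 2, "amount": 4.50}]:
    bus.put(txn)

warehouse_table = []
loaded = micro_batch_load(bus, warehouse_table)
print(loaded)  # 2 rows loaded this cycle
```

The interval length is the tuning knob: shorter intervals push latency toward real time at the cost of more frequent, smaller loads.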
AUTOMATED ANALYSIS
Automated analysis is
needed in event
processing and in
recommendation
services
Using predictive and
statistical models to
analyse data is one way
to implement automated
analysis
With the speed of business increasing and the number of data sources feeding BI
systems also increasing, the number of business events that need to be detected
and acted upon is also on the rise. It is therefore not practical in most cases to
expect business analysts using traditional BI tools to manually analyse all data to
identify every problem and every opportunity. In many cases today, it would be
preferable to be able to analyse data automatically. Complementing human-led
analysis with automated analysis makes sense in a lot of operational and
managerial areas. It allows people to start managing by exception while delegating
some analyses to software. The use of predictive and statistical models to
automatically analyse data is one way in which to make this possible. Power users
who build these models can deploy them to constantly and automatically analyse
data either on an event-driven basis or on a timer-driven basis. Automated analysis
using statistical and predictive models is particularly effective in business
operations but it is not limited to just operational areas.
AUTOMATED ACTIONS
Automated actions can
be implemented using
rules
Automated actions are automatic decisions. An active intelligence system uses this
capability to trigger alerts, to automatically invoke transactions in operational
applications or even to invoke whole business processes. Generally speaking, this
is most effective if used to drive automated actions (e.g., invoke transactions) when
the most common problems occur and to alert people when exceptions occur that
need to be dealt with manually. Rules are needed to make automated decisions
and to trigger automated actions. Therefore a rules engine is an important
component of an active intelligence system.
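A rules engine of this kind can be sketched as a list of condition/action pairs, with unmatched events escalated as alerts for manual handling. Event fields and actions below are hypothetical:

```python
# Tiny rules-engine sketch: common conditions trigger automated actions;
# anything unmatched is escalated as an alert for a person to handle.

actions_log = []

def reorder_stock(event):
    actions_log.append(f"reorder {event['sku']}")

def alert_manager(event):
    actions_log.append(f"ALERT: review {event}")

rules = [
    # Condition: stock is low and nothing is already on order.
    (lambda e: e["type"] == "stock_low" and e["on_order"] == 0, reorder_stock),
]

def dispatch(event, rules, fallback=alert_manager):
    for condition, action in rules:
        if condition(event):
            action(event)               # automated action (e.g. a transaction)
            return
    fallback(event)                     # exception path: a person decides

dispatch({"type": "stock_low", "sku": "A-7", "on_order": 0}, rules)
dispatch({"type": "late_delivery", "order": 991}, rules)
print(actions_log)
```

The division of labour matches the text: the most common, well-understood conditions are automated, and the fallback keeps humans in the loop for exceptions.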
The combination of automated analysis and automated actions is needed to
support another unique characteristic of an active intelligence system. That
characteristic is on-demand recommendations, i.e., to automatically analyse data
and then make a recommendation decision based on the outcome of the analysis.
The same combination is needed for event-driven analytics to automatically
analyse the significance of an event correlation and to automatically take action as
soon as possible after the business condition is determined. For example, a surge
in orders may have a major impact on a manufacturing schedule and materials
inventory, requiring action to accommodate the change (for example, more
materials may need to be ordered, other orders put on hold, shipping may need to
change, etc.). Similarly, scheduled automated analysis of customer and account
data in a retail bank may detect that a customer has a lot of money just sitting in a
checking account that could earn better interest in a savings account. This is
automatic pro-active analysis and decision making.
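The idle-cash example can be sketched as a scheduled query over account data that returns customers to contact. The threshold and field names are invented; a real implementation would run on a timer against the data warehouse:

```python
# Sketch of scheduled pro-active analysis: flag checking accounts whose
# balance could earn better interest in savings (threshold is illustrative).

def idle_cash_candidates(accounts, threshold=5000.0):
    """Return customers whose checking balance exceeds the threshold."""
    return [
        a["customer"]
        for a in accounts
        if a["type"] == "checking" and a["balance"] > threshold
    ]

accounts = [
    {"customer": "C1", "type": "checking", "balance": 12000.0},
    {"customer": "C2", "type": "checking", "balance": 800.0},
    {"customer": "C3", "type": "savings",  "balance": 9000.0},
]
print(idle_cash_candidates(accounts))  # ['C1']
```

The output would feed the automated-action side: an alert to an advisor, or a personalised offer generated without any analyst involvement.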
WHY INTELLIGENCE MUST GO ENTERPRISE-WIDE TO MAXIMISE BUSINESS VALUE

Active intelligence therefore needs to be deployed enterprise-wide to maximize business benefit

Normal process execution may involve on-demand
BI and recommendations while event-detection, automatic analysis and automated
actions such as alerts may trigger even more use of these services in different
parts of the enterprise. Active intelligence is not restricted to one part of the
business. Its uptake becomes enterprise-wide. This is especially important when
trying to co-ordinate different parts of the business so all contribute to common
objectives.
Finally (but not shown), active dashboards provide role-based views into the real-time
activities, enabling managers in different parts of the business to get early
warnings, see trends in each functional area and get KPI roll-ups for entire end-to-end
processes. And of course, all of these new process-oriented active intelligence
operations occur while the traditional use of BI and analytics also continues in its
normal way, with business analysts and managers accessing analytical databases
using BI platform tools such as ad hoc reporting and on-line analytical processing
(OLAP).
WHAT'S POSSIBLE WITH ACTIVE INTELLIGENCE?

Using Active Intelligence To Optimize A Supply Chain

Managing a fast moving supply chain requires access to near real-time information so that people can act quickly to resolve problems
The first example is in the area of supply chain management. Keeping a supply
chain optimized is a very challenging task, especially when things can change
rapidly. The faster moving the supply chain, the more challenging the task to keep
it running smoothly while minimizing cost.
One of the fastest supply chains involves newspapers: eight hours from point of
product manufacture to point of sale. So many things can happen in a fast moving
supply chain that can impact:
- Human resources
- Packing allocations for outbound distribution
- Goods-in processing
- Distribution centre inventory management
- Packaging requirements for distribution
- Correct, complete and on-time deliveries
- Correct invoicing
- Correct delivery documentation
- The need to do delivery re-runs if they are wrong
- Customer satisfaction
- Operational costs
- Profitability
Any kind of event in a fast-moving supply chain like this has to be monitored mainly
because there is often very little time to react while remaining within service level
agreements. To guarantee smooth-running operations requires continuous
observation of the supply chain and related events. Therefore the logistics
operation in each distribution centre needs access to on-demand BI on near real-time data and also needs insight about events that could impact operations.
To simplify consumption of information in such a time-constrained business means
that data needs to be integrated in near real-time into a data warehouse so that it
can be interpreted quickly and acted upon if necessary. Let's drill into the printing
and distribution part of the overall supply chain. This data comes from core
operational data sources including:
- Publisher data feeds
- Order entry system (to see demand changes and spikes)
- Distribution allocation systems
- Distribution centre inventory management
- Goods-in
- Claims management
- Customer service
- Returns
A combination of active intelligence, near real-time data integration and event-detection is needed to keep distribution centres aware of all changes as they
happen.
Using Active Intelligence To Improve Procurement

Event processing is also used to identify cost saving opportunities
With active intelligence, they went further. Adding event processing allowed
automatic monitoring of expenditure against budget and cash flow providing the
ability to monitor spending on a continuous, real-time basis. Using event
monitoring, they monitor purchase requests across the business looking for
opportunities to save money. The company can, for example, monitor to see if
several purchase requests have been detected within a set period (e.g., the last
20 minutes). Correlation of multiple events in this case indicates that several
requests are for materials from the same supplier. Automated analysis spots a
discount opportunity if these purchases are batched together and automated action
causes alerts to a procurement manager to take action.
Using Active Intelligence To Improve Risk Management

Automated analysis and automated actions make automated quote management possible
Now consider the same insurance company trying to expand into the mid-market.
The impact of this is that it will be inviting a much greater volume of inbound
property quote requests from a larger broker network and/or prospects. To do this,
it cannot afford to hire large numbers of underwriters. Therefore the underwriting
decision process needs to be automated to handle much larger volumes of
inbound quote requests coming in via online applications or via electronic
messages from more brokers.
In this case, the insurance company makes use of active intelligence automated
analysis and rules-based automated actions to create rating (pricing) decision
services with underwriting expertise represented in the decision rules used by each
service. The rating decision service repeatedly makes use of automated analysis
and rules to improve the pricing accuracy and automate underwriting decisions
every time a quote request occurs.
In addition, the same rating decision services are used to guide underwriters as to
the correct premium price to minimise risk or to recommend re-insurance if a
property risk borders on the uncomfortable side.
These are examples of manual and automated insurance underwriting decisions
being guided by intelligence every time. Also the ability to automate underwriting
decisions laid the foundation to go more into commercial insurance lines of
business where automated quote management was needed.
ARCHITECTURAL CHANGE: A DW JOURNEY TO THE CENTRE OF THE ENTERPRISE

Accessing business analytics in a SOA helps organisations run smarter and improve effectiveness

Continuous business optimisation is also possible
Figure 3
On-demand requests for BI and recommendations can be made by operational
applications, executing business processes, portals, CPM scorecards, dashboards,
office applications and search engines - all accessible from a browser or mobile
device. Monitoring of real-time event streams is also possible. This can be
integrated into role-based dashboards alongside historical data to allow managers
to see what is happening over time as well as what is happening now.
This change in architecture to position BI at the centre of the enterprise closes the
loop with operational systems, making business insights accessible to everyone
leveraging common BI services. In addition, it provides the capability to monitor live
events, undertake traditional data warehouse analysis and reporting, deploy
multiple DW appliances for specific projects and analyse large amounts of data.
Finally, given that there can be multiple different types of analytical data store in an
active analytical environment, it should be possible to move analytical workloads
between these so as to match the workload to the appropriate technology. The
focus should be on the analysis that needs to be done and not the underlying data
store. Therefore workload management needs to seamlessly manage analytical
workloads across analytical appliances, data warehouses and ultimately Hadoop
MapReduce platforms irrespective of whether they are event-driven workloads, on-demand operational BI workloads or traditional analysis and reporting or complex
analysis on large volumes of data.
There are many more requirements that are part of an active intelligence system. A
complete set of requirements is discussed below.
REQUIREMENTS: WHAT TO LOOK FOR IN AN ACTIVE INTELLIGENCE SOLUTION

An active intelligence system must be capable of managing large numbers of concurrent users and offer high availability
Figure 4
An active intelligence
solution includes event
processing
Automated analysis
should also be capable of
being scheduled as this
allows conditions in
historical data to also be
automatically detected
Workload management is
also a key requirement to
manage operational BI
and traditional analytical
workloads
A common data
management platform
supplying clean,
integrated trusted data is
fundamental to success
PRODUCT EXAMPLE: TERADATA ACTIVE ENTERPRISE INTELLIGENCE

Teradata was founded in 1979 and manufactures the Teradata massively parallel
relational DBMS which runs on an optimized hardware solution assembled from
industry standard technology from Intel and NetApp. The Teradata Purpose-Built
Platform Family includes several products that span customer database size,
concurrency and performance needs. These are:
- The Teradata Data Mart Appliance - An entry-level Teradata database appliance for production data warehousing and data marts with up to 5.8TB or 12TB disk storage
- The Teradata Extreme Data Appliance - A Teradata database appliance aimed at complex analytical workloads on large amounts of data. It scales from 45TB up to 196PB of storage with 4096 nodes
In March 2011, Teradata acquired Aster Data, which offers the Aster Data nCluster
analytic platform. This is a massively parallel relational database solution that is
capable of embedding Hadoop MapReduce analytic application logic within the
Aster Data nCluster for big data analytics on multi-structured data sources. It runs
SQL-MapReduce analytic application logic inside the Aster Data MPP system, for
analysis of massive data sets. To speed up development, Aster Data also provides
a pre-built suite of optimized SQL-MapReduce analytic modules known as the
Aster Data Analytic Foundation and a visual development environment known as
Aster Data Developer Express to exploit the Analytic Foundation and generate
MapReduce analytic modules. SQL-MapReduce analytic application logic can be
Figure 5
Insights discovered in semi-structured data on Aster Data can be fed into the
Teradata Active EDW for integration with traditional data to increase the
effectiveness of decision making. For example, in a Telco, customers churning
because of bad network experiences may influence others to churn. To minimise
churn, Aster Data can be used to identify clusters of callers where one individual
leads the way on churning behaviour and influences others. With this insight
loaded into the Teradata EDW, marketing campaigns can be launched to quickly
turn around potential defectors and their followers.
Teradata Database In An Active Intelligence Environment

Active Access
Parameterised queries
and join indexes are
particularly well suited to
very specific on-demand
BI requests
This is the ability to handle concurrent queries coming from on-demand BI and
recommendation services being invoked by operational applications, processes
and portals. These BI services sit in a service-oriented BI platform on top of the
Teradata DBMS. To cater for an increase in concurrent users invoking operational
BI services on-demand, join indexes and parameterized queries can be created on
the Teradata DBMS. Join indexes help retrieve frequently used data without
needing to join tables in real time. Instead, pre-computed answers can be stored
and accessed quickly. Parameterized queries allow the Teradata optimizer to
cache SQL it has seen before and reuse the execution plan the next time it sees
the same SQL. This means that popular BI services (such as those invoked by
contact centre representatives) may be turned around quickly.
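The parameterized-query idea can be illustrated with SQLite as a stand-in (the schema and values below are invented). The point carries over to Teradata: the SQL text stays identical across calls while only the bound value changes, which is what allows the optimizer to recognise the statement and reuse a cached execution plan:

```python
import sqlite3

# Illustration of parameterized queries using SQLite as a stand-in DBMS.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, lifetime_value REAL)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, 1200.0), (2, 8700.0)])

# One SQL string reused for every invocation of the BI service.
QUERY = "SELECT lifetime_value FROM customers WHERE id = ?"

def lifetime_value(customer_id):
    # Same SQL text every call; only the bound parameter differs.
    return conn.execute(QUERY, (customer_id,)).fetchone()[0]

print(lifetime_value(2))  # 8700.0
```

Had the id been concatenated into the SQL string instead, every call would look like new SQL to the optimizer and the plan cache would never hit.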
Active Load
Event-driven trickle feed,
micro-batch and change
data capture help get data
into the Teradata DBMS
quickly
Given that Teradata does not provide a CEP engine for event-processing, it needs
to support another way of handling events. That way is based on the ability to get
data into the Teradata DBMS as close to real-time as possible. Teradata Active
Load is the mechanism for doing this. Using Active Load, near real-time data can
be loaded into the Teradata DBMS from a messaging backbone (for example, from
JMS message queuing software), via mini-batch and also via change data capture.
Teradata Parallel Transporter caters for streaming messages and mini-batch while
Teradata Replication Services (via Oracle GoldenGate) handles change data
capture. In the case of messaging, changes to operational systems can be posted
to message queues on middleware such as IBM WebSphere MQ. Teradata Parallel
Transporter then reads the message queue(s) and directly updates the Teradata
DBMS. Note that queries can still access Teradata table structures while they are
being updated by Teradata Parallel Transporter.
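The Active Load pattern above can be sketched as a micro-batch loader: drain a message queue in small batches and apply each batch to the database in a single transaction, so data lands in near real time while readers keep querying the same table. The sketch below uses Python's `queue` and `sqlite3` modules as stand-ins for the JMS middleware and the Teradata DBMS; the table and message shapes are invented for illustration.

```python
# Micro-batch trickle feed sketch: queue + sqlite3 stand in for JMS
# middleware and the Teradata DBMS (these are not Teradata APIs).
import json
import queue
import sqlite3

def micro_batch_load(q, conn, batch_size=100):
    """Drain up to batch_size messages and insert them in one transaction."""
    batch = []
    while len(batch) < batch_size:
        try:
            batch.append(json.loads(q.get_nowait()))
        except queue.Empty:
            break
    if batch:
        with conn:  # one commit per micro-batch, not per message
            conn.executemany(
                "INSERT INTO sales (store_id, amount) VALUES (:store_id, :amount)",
                batch,
            )
    return len(batch)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (store_id INTEGER, amount REAL)")
q = queue.Queue()
for i in range(5):
    q.put(json.dumps({"store_id": i, "amount": 10.0 * i}))

loaded = micro_batch_load(q, conn)
assert loaded == 5
assert conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0] == 5
```

Committing per micro-batch rather than per message is the key trade-off: latency stays low while the database is spared one transaction per event.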
Active Events
Teradata's partnership
with SAS makes
in-database automated
analysis of events
possible
In addition to near real-time data coming into the Teradata DBMS via Active Load,
event-processing needs to trigger automated analysis to analyse that data. As data
is loaded into the Teradata DBMS, database triggers can fire and invoke analytics
on data in the DW. Support for automated analysis is taken care of by the Teradata
partnership with SAS Institute. Statistical and predictive models developed in SAS
can be deployed alongside the Teradata DBMS and executed in parallel to analyse
detailed data as shown in Figure 4. In-database analytics pushes automated
analysis as close to the data as possible which is an important performance feature
in an event-driven operational environment. Data or events can then be inserted
into a queue table which fires a trigger to drive appropriate actions like sending
alerts to users and invoking transaction services to keep the business optimised.
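The load, analyse, act chain described above can be sketched as three small pieces: a scoring function (standing in for an in-database predictive model), an insert hook (standing in for the database trigger), and a dispatcher that drains a queue of flagged events and fires the matching action. Everything here, including the scoring rule, is illustrative rather than Teradata or SAS functionality.

```python
# Sketch: trigger -> queue table -> automated action.
# The scoring rule and action names are invented for illustration.

def score(event):
    """Stand-in for an in-database predictive model: flag large transactions."""
    return "fraud_check" if event["amount"] > 1000 else None

event_queue = []   # plays the role of the queue table
alerts = []

def on_insert(event):
    """Plays the role of a database trigger firing as data is loaded."""
    tag = score(event)
    if tag:
        event_queue.append((tag, event))

ACTIONS = {"fraud_check": lambda e: alerts.append(f"alert: txn {e['id']}")}

def dispatch():
    """Drain the queue table and fire the action registered for each tag."""
    while event_queue:
        tag, event = event_queue.pop(0)
        ACTIONS[tag](event)

for e in [{"id": 1, "amount": 50}, {"id": 2, "amount": 5000}]:
    on_insert(e)
dispatch()
assert alerts == ["alert: txn 2"]
```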
Events can also be analysed outside of the Teradata environment by Complex
Event Processing engines in real time, or by Aster Data off-line. The purpose of
the latter is to analyse event data to determine whether event patterns recur
over time. The use of Aster Data nCluster is particularly compelling in this
regard, especially in environments with very large numbers of events. Sensor
event data is a good example. This big data source can be loaded into Hadoop
and analysed using the SQL-MapReduce logic built into the Aster Data nCluster MPP
DBMS. Event patterns of interest could then be passed into the Teradata
Active EDW and combined with other data to determine what action should be
taken to prevent operational disruption or unplanned operational cost from
recurring.
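The off-line pattern analysis described above boils down to a map/reduce pass over event data. The sketch below is a pure-Python stand-in for SQL-MapReduce that counts how often each (sensor, fault) pair recurs, so that persistent patterns can be promoted into the EDW; the field names and recurrence threshold are invented for illustration.

```python
# Map/reduce sketch: count recurring (sensor, fault) event patterns.
# Pure-Python stand-in for SQL-MapReduce; fields and threshold are illustrative.
from collections import Counter
from itertools import chain

events = [
    {"sensor": "pump-1", "fault": "overheat"},
    {"sensor": "pump-1", "fault": "overheat"},
    {"sensor": "pump-2", "fault": "vibration"},
    {"sensor": "pump-1", "fault": "overheat"},
]

def map_phase(event):
    # Emit one (key, 1) pair per event, keyed on the pattern of interest.
    yield ((event["sensor"], event["fault"]), 1)

def reduce_phase(pairs):
    # Sum the counts for each key.
    counts = Counter()
    for key, n in pairs:
        counts[key] += n
    return counts

counts = reduce_phase(chain.from_iterable(map_phase(e) for e in events))
recurring = {k: n for k, n in counts.items() if n >= 3}   # promotion threshold
assert recurring == {("pump-1", "overheat"): 3}
```

In a real deployment the map and reduce steps would run in parallel across the MPP nodes; the logic per node is the same shape as above.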
To cater for the much larger numbers of concurrent user requests for on-demand
BI, on-demand recommendation, near real-time event-driven analysis, as well as
traditional BI usage and data loading, it is important to be able to balance
workloads so that the system continues to satisfy the needs of all users. That
means being able to fence off resources for some workloads and dynamically
change priorities at peak times while continuously monitoring workloads. To cater
for this requirement, Teradata offers a set of Active System Management tools.
They include the Teradata Workload Analyzer, which analyses Teradata Database
user logs and system tables to profile actual usage behaviour and analyse
workloads over time. Teradata Workload Analyzer recommends workload groups
and parameters which can then be established in Teradata Dynamic Workload
Manager as workgroup categories and control settings. Workloads can also be
prioritized by time or by user group. Dynamic Workload Manager then
continuously monitors resources at run time so that, for example, on-demand
operational BI services can be separated from traditional complex analyses and
data loading and given a higher priority.
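The core of this workload management idea is straightforward: classify each request into a workload group, give each group a priority, and always serve the highest-priority pending request first. The sketch below shows that scheduling shape with a priority queue; the group names and priority values are invented for illustration and are not Teradata settings.

```python
# Workload-group scheduling sketch: highest-priority pending request wins.
# Group names and priorities are illustrative, not Teradata settings.
import heapq
import itertools

PRIORITY = {"operational_bi": 0, "data_load": 1, "complex_analysis": 2}
counter = itertools.count()   # tiebreaker keeps FIFO order within a group
pending = []

def submit(group, request):
    heapq.heappush(pending, (PRIORITY[group], next(counter), request))

def next_request():
    return heapq.heappop(pending)[2]

submit("complex_analysis", "quarterly churn model")
submit("operational_bi", "agent lookup: customer 42")
submit("data_load", "trickle feed batch 9001")

order = [next_request() for _ in range(3)]
assert order[0] == "agent lookup: customer 42"   # operational BI jumps the queue
```

A real workload manager also fences off CPU and I/O per group and reshuffles priorities at peak times; the queue above only captures the ordering decision.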
A key part of integrating with business operations is the ability to plug into a
service-oriented architecture (SOA). We have already seen that BI platforms
accessing Teradata and Aster Data can publish BI services for invocation by
operational applications and processes. The Teradata DBMS itself can also exploit
ESB and BPM software. The Teradata Parallel Transporter can capture process
event messages streaming over an ESB (running on top of JMS messaging
middleware) to trigger event-driven automated analysis using SAS models
deployed in the database. In addition, triggers in the database that fire to carry out
actions can send requests over an ESB to invoke transaction services and
whole processes as part of an automated action. Developers can also create
custom BI services in the Eclipse IDE using the Teradata Eclipse plug-in.
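The idea of publishing BI services for invocation by operational applications can be sketched as a small service facade: query logic is registered under a service name, and callers (or an ESB request handler) invoke it by name with a payload. The service name, payload shape, and offer logic below are all hypothetical, standing in for a real warehouse query behind a web-service endpoint.

```python
# SOA facade sketch: BI logic registered as named, invokable services.
# Service names and payloads are hypothetical illustrations.
SERVICES = {}

def bi_service(name):
    """Register a function as a named BI service."""
    def register(fn):
        SERVICES[name] = fn
        return fn
    return register

@bi_service("next_best_offer")
def next_best_offer(payload):
    # Stand-in for a recommendation query against the warehouse.
    offers = {"gold": "fee waiver", "standard": "cashback card"}
    return {"offer": offers[payload["segment"]]}

def invoke(service, payload):
    """What an ESB request handler would do on receipt of a message."""
    return SERVICES[service](payload)

result = invoke("next_best_offer", {"segment": "gold"})
assert result == {"offer": "fee waiver"}
```

Because callers only know the service name and payload contract, the underlying query can be re-optimized or moved without touching the operational applications that invoke it.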
Active Availability
High availability means
Teradata can easily
accommodate the service
levels imposed by
operational systems that
need to access BI on-demand
CONCLUSION
Integration into business
operations is now a
strategic requirement for
BI systems
Organisations have to
undertake some
business analysis to
understand how they
operate to get maximum
value from an active
intelligence
implementation
Key infrastructure
software also needs to
be integrated
To conclude, organisations are now looking to work smarter in all corners of their
business, from executives to front-line workers, to improve strategic, tactical and
now operational decisions. To make that happen, BI systems are moving to the
centre so that insights can be integrated into every business process. The intent is
to deliver role-based intelligence in the context of every operational task.
The new operational intelligence workloads can run on the same BI infrastructure
that supports traditional BI processing. We are also starting to automate the
monitoring of internal and external events to keep a finger on the pulse of the
business as processes execute.
To compete, organisations have to become active to make sure that people are
always aware of events going on around them and able to make informed
decisions. To get there it is important to:
- Understand your processes
- Understand the roles of the people who participate in those processes, such
  as customer-facing contact centre operators, bank tellers, store managers,
  salespersons, etc.
- Understand the activities (tasks) they perform
- Understand the applications they use to perform the activities in a process
- Determine the relevant BI and/or actions needed in each process activity,
  such as alerts and on-demand recommendations
- Identify the correct strategy for integrating BI to fit user needs,
  e.g., portlets that display the information needed by contact centre agents
- Create the required BI web services and integrate them into business
  processes to guide operational activity
- Create an inventory of events and identify which ones are worth monitoring
- Connect data integration tools or DBMS utilities to your ESB to capture live
  events as they happen in operational systems
- Deploy predictive models in your analytical databases and/or CEP
  technology to automatically analyse data in near real-time
- Integrate BI, BPM and event-processing into role-based dashboards and
  scorecards
- Integrate active dashboards and alerts with collaborative workspaces and
  mobile devices
There is no question that Teradata has already recognised the importance of active
intelligence and has added the functionality to the Teradata DBMS and hardware
platform family to allow it to easily cope with large numbers of concurrent requests
for operational BI. Hardware advances like SSDs in the Teradata Extreme
Performance Appliance and the Teradata Active Enterprise Data Warehouse with
Teradata Virtual Storage, as well as workload management and high availability all
make it fit for purpose in this much more agile and responsive environment. The
SAS partnership also makes automated analysis possible in servicing requests for
on-demand recommendations or for analysing the business impact of events.
Finally, Aster Data adds another string to the bow: the ability to analyse event data
from big data sources such as sensor networks, web logs and social networks. All
this, plus integration with other infrastructure, makes Teradata a strong contender
to sit front and centre in an always-on intelligent enterprise.
Author
Mike Ferguson is Managing Director of Intelligent Business Strategies Limited. As
an analyst and consultant he specializes in business intelligence and enterprise
business integration. With over 30 years of IT experience, Mike has consulted for
dozens of companies on business intelligence, enterprise architecture, business
integration and data management. He has spoken at events all over the world and
written numerous articles. Mike is a resident expert on the Business Intelligence
Network, providing articles, blogs and his insights on the industry. Formerly, he
was a principal and co-founder of Codd and Date Europe Limited (the inventors of
the Relational Model), a Chief Architect at Teradata working on the Teradata
DBMS, and European Managing Director of Database Associates. He teaches popular master
classes in Business Intelligence, Enterprise Data Governance, Master Data
Management, and Enterprise Business Integration.