
Active Data Warehouse: Review, Challenges

and Issues
Jalel Eddine Hajlaoui*1
* Laboratory of MIRACL FSEG, University of Sfax-Tunisia
1
Higher Institute of Applied Science and Technology,
University of Sousse-Tunisia
hajlaouijalel.ig@gmail.com

Abstract—The conventional data warehouse (DW) is dedicated to storing historical data. It is not outfitted with automatic analysis mechanisms and cannot offer dynamic OLAP reporting. Repetitive analytical requirements of decision-makers could be formalized and stored in the decision system as analysis scenarios expressed through event, condition and action rules. In this way, the reactive analytic capabilities of the DW against occurring events are empowered. In this paper, we overview the state of the art in the active DW field and address a further issue towards the automation of these analyses. More precisely, we envisage a reusable solution relying on design patterns for repetitive OLAP scenarios, so that decision-makers facing a well-known decisional problem do not have to start from scratch.

Index Terms—Active data warehouse, analysis rule, OLAP analysis scenario, OLAP analysis pattern

I. INTRODUCTION

Traditionally, a data warehouse is defined as a centralized structure in which a large amount of historical data is stored, organized by topic and consolidated from various sources of information [1]. Hence, a data warehouse (DW) can be viewed as a large repository of historical data organized in a pertinent way in order to perform analyses and to extract interesting information through OLAP (On-Line Analytical Processing) technology. DW and OLAP technology have proven to be effective tools for decision support systems in a multidimensional context. Specially designed to support on-the-fly, hypothesis-driven exploration of data, OLAP systems are commonly used as reporting tools in almost every business intelligence application [2]. However, they have a main drawback: they do not provide the decision-maker with mechanisms for automating repetitive analyses and, therefore, cannot offer dynamic OLAP reporting. OLAP analyses are interactive explorations of the DW data through a set of navigational operators such as Drill-down, Roll-up, Slice and Dice. Analyzing a data cube interactively by means of these operators is a demanding task that is commonly admitted to require certain skills: organizational skills and practical know-how in decision making.

Nesrine Hamdani
Higher Institute of Applied Science and Technology
University of Sousse-Tunisia
nesrine.hamdani@gmail.com

Typically, the stakeholders of OLAP operations are mainly business analysts whose task is to interpret generated reports containing consolidated numerical statistics. Nevertheless, the difficulty of discovering analyses over OLAP cubes grows as data dimensionality increases. As a result, user exploration becomes effortful, complicated and time consuming.

Thus, computer techniques easing this exploration and the extraction of pertinent information are needed. User-centric techniques, namely personalization and recommendation, have attracted much interest from researchers in the data warehouse engineering field. Much work has been invested in OLAP query formulation and recommendation (see for example [3], [4], [5]), but not with habitual analysis tasks in view.

Yet, in a competitive environment, end users need more than preference-aware query answering or the recommendation of relevant queries matching their tastes: they need reactive decision systems able to automate reporting throughput, maintain dynamic analyses, and improve user control over analytical processing [6]. In order to introduce dynamics into DWs and multidimensional analyses, it is fundamental to integrate into the decision system analysis rules, based on events, conditions and actions, that express the usual analysis needs of decision-makers. In this context, decision making can be automated for routine decision tasks through a novel architecture called Active Data Warehouse (ADW), introduced in [7]. An ADW extends conventional ECA (Event-Condition-Action) rules, already recognized in active databases [8], with multidimensional features for analyzing the DW data and making decisions. These extended rules are called analysis rules since they emulate the way an analyst inspects multidimensional data. We notice that, even though these analysis rules have been studied in the literature, they are still intended to be described manually. In other words, no effort has been made so far to construct a general skeleton that can be adapted to a given recurrent problem: a pattern for routine OLAP analyses.
The remainder of this paper is organized as follows: Section II introduces the background basics related to the active processing of data. Section III overviews the state of the art about ADWs. Section IV deals with the incorporation of analysis rules in the data warehousing process. Section V discusses event processing in an ADW. Finally, Section VI concludes the paper and gives an overview of our future work, which aims to propose an approach for building and reusing patterns of OLAP analysis scenarios.

II. ACTIVE PROCESSING OF DATA


The active processing of data was initially introduced in the database field, leading to active databases. In an active database, the occurrence of an event triggers predefined processing associated with that triggering event. Therefore, the database exhibits a reactive behavior in the sense that it is able to detect certain situations and perform corresponding user-defined actions through insert, update or delete operations on its content. This reactive behavior is typically specified in terms of Event-Condition-Action (ECA) rules.

Similarly, an Active DW aims to acquire this feature. It is a DW that is empowered with reactive analytic capabilities through an active behavior initiated by means of analysis rules and triggers. These triggers can detect events in order to automate the decision making that the expert users had to anticipate. For instance, the Rules Manager [9], which is included in the Oracle 10g database management system, implements triggers through ECA rules with an IF-THEN rule language terminology to execute treatments when the triggering event occurs within the operational source. In contrast, none of the current Database Management Systems (DBMS) provides triggers for data warehouses. Likewise, commercial data warehouse tools offer very limited capabilities implementing features of active data processing. One exception is the commercially available product Comshare Decision [10], a tool that performs proactive exception alerting by detecting outliers based on declarative conditions specified over OLAP cubes and drawing the attention of business analysts to these outliers. Unlike active data warehouses, which use event-driven mechanisms, Comshare Decision is not event-based. Moreover, it does not support a closed-loop approach that immediately triggers actions in operational systems, whereas this is one principal characteristic of active data warehouses.
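The ECA behavior described above can be sketched as follows. This is a minimal, illustrative Python sketch of an event-driven rule engine; the event names, payload fields and threshold are invented for illustration and do not reproduce the API of Oracle Rules Manager or of any surveyed system.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class ECARule:
    event: str                           # event type the rule subscribes to
    condition: Callable[[Dict], bool]    # predicate over the event payload
    action: Callable[[Dict], None]       # processing triggered by the rule


class ActiveStore:
    """Data store with reactive behavior specified by ECA rules."""

    def __init__(self) -> None:
        self.rules: List[ECARule] = []

    def register(self, rule: ECARule) -> None:
        self.rules.append(rule)

    def notify(self, event: str, payload: Dict) -> None:
        # On each occurring event, fire every matching rule whose
        # condition holds -- the reactive behavior described above.
        for rule in self.rules:
            if rule.event == event and rule.condition(payload):
                rule.action(payload)


alerts: List[str] = []
store = ActiveStore()
store.register(ECARule(
    event="turnover_update",
    condition=lambda p: p["turnover"] > 10_000,   # hypothetical threshold
    action=lambda p: alerts.append(f"alert: {p['product']}"),
))

store.notify("turnover_update", {"product": "A", "turnover": 12_500})
store.notify("turnover_update", {"product": "B", "turnover": 3_000})
print(alerts)  # only product A crosses the threshold
```

The same register/notify split is what distinguishes an active store from a passive one: the reaction is declared once and executed whenever a matching situation is detected.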
III. RELATED WORKS
In the literature, we distinguish two categories of approaches related to the active processing of DW data. The first category is interested in real-time data processing, while the second focuses on the analysis side of the data warehousing process.

Among the first category, Nguyen and Tjoa [11] defined a framework for a zero-latency data warehouse (ZLDWH) which combines continuous data integration and active rules based on ECA techniques to minimize the integration time of data into the DW. In a different way, while ZLDWH and ADW approaches use active rules for the automation of routine analysis tasks, the approach of Salem et al. [12] employs active rules along with event mining to automate and reactivate data integration. The proposed framework has reactive and autonomic capabilities: it warehouses events about framework activities, mines such events, and then follows the ECA rule mechanism to reactivate integration tasks. Integrated data are stored in a unified AXML repository to support complex analysis queries. In [13] the authors present an active real-time framework based on a multi-agent system to improve the active and real-time performance of the data warehouse and enhance its scalability. The issue of performance optimization of analysis rules in real-time active data warehouses is treated in [14]. In this work, analysis rules are divided into two types, namely real-time analysis rules and non-real-time analysis rules. The LADE (Log data mining based Active Decision Engine) system is developed to collect all the reference information required by the optimization work, and a new algorithm, called ARPO (Analysis Rule Performance Optimization), is proposed to carry out the optimization based on this reference information. The authors define rush hours and frequent cubes for real-time analysis rules, and cube usage patterns for non-real-time analysis rules.
The increased need for data preparation mechanisms in a dynamic and evolving data warehousing environment is addressed by Ezekiel and Marir [15]. They introduce a trigger-based cleaning approach where triggers are defined to enhance the data cleaning process. The defined triggers are coupled with dynamic capabilities according to the Dynamic Object Model (DOM). Their components (Event, Condition, and Action) are elucidated using the framework specification presented in [16]. This approach is outlined in two steps, namely trigger processing and data processing. The former implies semantic data preparation and provides the design and manipulation of triggers and their components, while the latter connotes syntactic data preparation to emphasize the reactive behavior of the data cleaning process.
In the second category of approaches, Zwick et al. [16] propose to implement an ADW architecture for automated analyses based on workflow technology. Here, an analysis can be perceived as a directed graph where each node represents a partial analysis and each edge a condition that connects subsequent analyses. A transition between two connected nodes (i.e. partial analyses) is performed if the corresponding condition holds, using three analysis primitives: AnalysisStep to specify an OLAP analysis, AnalysisLoop to perform further analysis steps in a loop, and Action to implement the closed loop via notifications or source system transactions. The most important component of the proposed architecture is an OLAP recorder which records all OLAP queries executed by the user with an ad-hoc analysis tool. It builds up a query history which can be used to automatically generate analysis graphs. These graphs are made available to analysts by implementing them on top of workflow engines.

The work of Thalhammer et al. [17], [18] extends the notion of ECA rules with mechanisms to automate repetitive analysis and decision tasks in data warehouses. Each analysis rule is specified for a dedicated dimension level, the primary dimension level of the analysis rule, corresponding to an entity in the operational system (OLTP system). The rule is fired for a primary dimension level element, and the triggered action is bound to the related OLTP object of the primary dimension level. The analysis rules follow a multidimensional analysis using the concept of analysis graphs. These rules represent a conceptual structure involving multidimensional views, or data cubes [19]. They are implemented with triggers and SQL within an Oracle database. Their construction is not automated and requires the intervention of a user with deep knowledge of SQL and the underlying framework.
Bouattour et al. [20] define a new formal framework for analysis rules. Using a multidimensional model and an existing OLAP algebra enriched with binary operations on cubes [21], the authors generalize analysis rules by transforming the ECA formalism into the ECG (Event-Condition-Graph of analysis) mechanism. Herein, an XML design of analysis rules is proposed to model both the logical and the physical level of analysis rules. Hence, analysis rules are stored in the decision system as XML documents. This contribution lacks an implementation of the overall concepts of the ECG.

Olegas and Smaizys [22] argue for the use of business rules to support active data analysis and the automation of business decision making. They generate executable MultiDimensional eXpression (MDX) instructions from specified business rules via XML transformations. Such MDX instructions are used to define multidimensional data selections in Microsoft OLAP API software systems related to the knowledge represented as business rules and to the facts discovered from current business data. OLAP cubes are generated dynamically by selecting new dimensions and performing drill-downs according to the business situation, evaluated by assessing the current business data (facts) against the business policy (business rules). The unsolved problems of incomplete business rule set resolution and business rule set transformation schema design remain the main shortcomings of this proposal.
TABLE I.  COMPARISON OF ADW ANALYTICAL APPROACHES

                              Thalhammer      Zwick et       Bouattour      Olegas and
                              et al. 2001     al. 2006       et al. 2009    Smaizys
                              [17]            [16]           [20]           2006 [22]
  Event detection             OLTP methods;
                              relative
                              temporal events;
                              calendar events
  Closed loop
  Unary OLAP operators
  Binary OLAP operators
  Modeling of analysis rules  EBNF notations  Workflow       XML            XML, XSLT
  Query language              SQL             MDX                           MDX
  Software prototype
To recapitulate, we can assert that active data warehouse loading and the integration of analysis rules in real-time active data warehouses are well treated. However, the incorporation of analysis rules to actively analyze data through OLAP operations has not been thoroughly studied, especially in terms of the automatic construction of analysis rules.

In Table I, we compare the ADW approaches related to the data analysis side of the data warehousing process. We can claim that in almost all cases the construction of analysis rules is performed manually. As an intrinsic consequence, the specification of a new analysis rule is also a manual task and often requires deep knowledge of the OLAP syntax and of the modeling formalism of analysis rules.

IV. INCORPORATION OF ANALYSIS RULES IN A DW PROCESS

Fig. 1. Active data warehouse process [20]: analysis rules are specified from events, conditions and a graph of analysis, stored as XML, and executed over cubes upon occurred events through automatic DBMS execution, producing analysis reports.

The decision-making process starts with the declaration of a specific scenario corresponding to a business analytical requirement. Each scenario corresponds to an analysis rule that must be defined by users. Declared rules are stored in the warehouse and processed by the OLAP server as queries. Figure 1 shows the general process of an active data warehouse as proposed in [20].
A. Specification of Analysis Rules
To specify an analysis rule, the user should first define the events associated with this rule. An event can be a fixed schedule using a calendar point in time at which analysis and decision processes can be initiated (e.g., BeginOfWeek), or a change of state in the production databases (e.g., the turnover of certain products exceeds a threshold value). Then, the user defines the conditions of the analysis scenario. Finally, the user generates the analysis graph that describes the sequence of OLAP operations to be performed by the corresponding analysis rule. Thus, the steps of specifying an analysis scenario relative to an analysis rule are: i) specification of events, ii) specification of conditions, iii) specification of the analysis graph.
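The three specification steps above can be sketched as a simple container. This Python sketch is illustrative only: the event BeginOfWeek and the condition Year = 2013 come from the paper's examples, while the field names and the string encoding of OLAP operations are our own assumptions, not the XML format of [20].

```python
from dataclasses import dataclass
from typing import List


@dataclass
class AnalysisRule:
    """One analysis scenario: i) events, ii) conditions, iii) analysis graph."""
    events: List[str]          # calendar events or source-state changes
    conditions: List[str]      # predicates restricting when the scenario runs
    analysis_graph: List[str]  # ordered OLAP operations to perform


rule = AnalysisRule(
    events=["BeginOfWeek"],                   # fixed-schedule calendar event
    conditions=["Year = 2013"],               # condition from the paper's example
    analysis_graph=[                          # hypothetical operation encoding
        "RollUp(TIME, Month)",
        "Slice(AGENCY.City = 'Sousse')",
    ],
)
print(rule.events, rule.conditions, len(rule.analysis_graph))
```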
B. Execution of Analysis Rules
Analysis rules are designed in the XML format, for the logical level as well as the physical level of OLAP scenarios. Hence, analysis rules are stored in XML documents. Analysis rules are currently limited to the formulation of navigation scenarios. As shown in Figure 1, the XML documents are stored in the data warehouse. They are subsequently managed by the multidimensional DBMS in the same way as queries and data. When events occur and the conditions are satisfied, the OLAP operations (e.g. Roll-up, Drill-down, Slice and Dice, etc.) specified in the analysis graph are executed. Analysis reports are then produced on each scenario execution.
C. XML Modeling of Analysis Rules
Figure 2 shows a general modeling of analysis rules. Each analysis rule consists of three main tags, events, conditions and AnalysisGraph, representing respectively the events, the conditions and the analysis graph. An analysis graph can be divided into one or more analysis sub-graphs represented by the tag SubGraph.

Fig. 3. Example of analysis path.

Thus, an analysis graph defines a conceptual formalism to represent an analysis process in order to produce scenarios of automatic analytical reporting. These automatic scenarios give the data warehouse an active behavior. From a graphical point of view, an analysis graph G is schematized by a set of nodes U (data cubes) and a set of paths P (OLAP operations).

In order to meet the needs of the XML logical modeling, the notion of an analysis sub-graph is also introduced. An analysis sub-graph, denoted SG, is a part of an analysis graph G which satisfies the two following conditions: (i) SG starts with an input cube of G, or with the two input cubes of a binary operation; and (ii) SG ends with an output cube of G, or with the output cube of a binary operation. For example, in the analysis graph of Figure 5, we distinguish two sub-graphs delimited by the dotted frames.

Fig. 2. XML modeling of analysis rule.

For example, in Figure 2, the analysis graph consists of two sub-graphs. To each sub-graph, a tag input_cubes is associated to describe its input cubes, together with a tag AnalysisPaths which represents its analysis paths. Each analysis path, which represents an OLAP operation, is modeled within the tag AnalysisPaths.
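A skeleton of such an XML document can be built with the tag names given above (events, conditions, AnalysisGraph, SubGraph, input_cubes, AnalysisPaths). This Python sketch assembles one; the attribute names and values are invented for illustration and do not reproduce the exact schema of [20].

```python
import xml.etree.ElementTree as ET

# Build an analysis-rule skeleton using the tags named in the text;
# attributes like type="calendar" or id="SG1" are our own assumptions.
rule = ET.Element("AnalysisRule")

events = ET.SubElement(rule, "events")
ET.SubElement(events, "event", {"type": "calendar", "value": "EOM"})

conditions = ET.SubElement(rule, "conditions")
ET.SubElement(conditions, "condition").text = "Year = 2013"

graph = ET.SubElement(rule, "AnalysisGraph")
sg = ET.SubElement(graph, "SubGraph", {"id": "SG1"})
cubes = ET.SubElement(sg, "input_cubes")
ET.SubElement(cubes, "cube").text = "VEHICLE RENTALS"
paths = ET.SubElement(sg, "AnalysisPaths")
ET.SubElement(paths, "path", {"operation": "RollUp"})

xml_text = ET.tostring(rule, encoding="unicode")
print(xml_text)
```

Serializing the rule this way makes it storable in the warehouse like any other document, which is the point of the XML-based logical level.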
D. Analysis Graph Formalization
According to [20], an analysis graph G is a conceptual representation of an analysis scenario. It is of the form (U, P), where: (i) U is a non-empty set of cubes, and (ii) P is a non-empty set of analysis paths indicating OLAP operations. An analysis path, denoted p, is a function from U^n to U where 1 ≤ n ≤ 2. It associates to one or two input data cubes an output data cube p(U), which results from the input cube(s) according to an OLAP operation.

According to that definition, an analysis path represents an OLAP operation. It can represent either a unary operation (e.g. nesting an attribute (Nest), pushing a dimension (Push), rolling up a dimension (RollUp), drilling down a dimension (DrillDown)) or a binary operation (e.g. the union (Union), the intersection (Intersect), and the difference (Difference)) of two data cubes having the same structure. In the analysis graph of Figure 5, an analysis path is represented as directed links that go from the input cube(s) to the output cube p(U).
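The formalization above can be rendered minimally in Python: cubes as nodes and paths as functions from one or two cubes to an output cube. Cubes are mocked here as sets of (city, month, amount) facts, and the two stand-in operators are far simpler than real OLAP Slice and Union; everything below is an illustrative assumption, not an implementation of the algebra of [21].

```python
# A cube is mocked as an immutable set of (city, month, amount) facts.
Cube = frozenset

def slice_city(cube: Cube, city: str) -> Cube:
    """Unary analysis path: U -> U (a crude stand-in for Slice)."""
    return Cube(f for f in cube if f[0] == city)

def union(c1: Cube, c2: Cube) -> Cube:
    """Binary analysis path: U^2 -> U, on cubes of the same structure."""
    return c1 | c2

rentals = Cube({("Jerba", "Jan", 900),
                ("Bizerte", "Jan", 700),
                ("Sousse", "Feb", 500)})

# Two unary paths followed by one binary path, as in an analysis graph:
jerba = slice_city(rentals, "Jerba")
bizerte = slice_city(rentals, "Bizerte")
compared = union(jerba, bizerte)
print(sorted(compared))
```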

E. Motivating Example
Our example concerns the analysis of the cube VEHICLE RENTALS, whose conceptual star schema is depicted in Figure 4. The fact table VEHICLE RENTALS holds the measures Amount and Duration and is analyzed along three dimensions: VEHICLE (Immat, Brand, Category, Type, AllVehicles), TIME (DateLoc, Month, Quarter, Year, AllTimes) and AGENCY (Code_Ag, City, Region, Country, AllAgencies).

Fig. 4. Star schema for vehicle rentals.

For instance, suppose that a decision-maker needs to incorporate an analysis scenario that compares vehicle rentals in two different cities, Jerba and Bizerte, for the month of January 2013, and determines which one realizes better returns for sport vehicles, applying the aggregate function SUM.

To model such an analysis rule, we define the corresponding event, the triggering condition and finally the analysis graph that shows the OLAP operations to be applied to implement the analysis. The event of our analysis rule is temporal and indicates the EOM (End Of Month) to designate the end of the month of January. The rule is fired when the total amount exceeds 2500 euros and the total duration is above 25 days. The relative condition for the analysis rule is: Year = 2013. Figure 5 shows the analysis graph associated with the presented analysis rule.

Fig. 5. Example of analysis graph (sub-graphs SG1, SG2 and SG3).

Fig. 6. XML modelling of analysis graph.
This rule is composed of three sub-graphs:
- The first sub-graph SG1 begins with a cube representing vehicle rentals by City, Year and Brand. This sub-graph follows a sequence of OLAP operations to get a cube that represents the vehicle rentals of the city of Jerba during January 2013 by vehicle type. It then intervenes in a binary operation (Union) in the third sub-graph.
- Similarly, the second sub-graph SG2 begins with a cube representing vehicle rentals by City, Quarter and Brand. This sub-graph follows a sequence of OLAP operations to get a cube that represents the vehicle rentals of the city of Bizerte during January 2013 by vehicle type. It also intervenes in the binary operation (Union) of the third sub-graph.
- The third sub-graph SG3 begins with the two end cubes of the two previous sub-graphs. It provides a data cube containing the vehicle rentals of January 2013. This cube allows a comparison of the returns for sport vehicles, as specified in the initial analysis scenario.
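The comparison produced by SG3 could boil down to an MDX query of roughly the following shape. This is a hedged sketch held as a Python string: the cube, dimension and member names follow the star schema of Figure 4, but the exact MDX emitted by the prototype is not reproduced here.

```python
# Hypothetical MDX for the SG3 comparison; member names such as
# [January 2013] and [Sport] are assumed, not taken from the prototype.
mdx = """
SELECT {[AGENCY].[City].[Jerba], [AGENCY].[City].[Bizerte]} ON COLUMNS,
       {[Measures].[Amount]} ON ROWS
FROM [VEHICLE RENTALS]
WHERE ([TIME].[Month].[January 2013], [VEHICLE].[Type].[Sport])
""".strip()

print(mdx)
```

Reading the two resulting column values side by side answers the scenario's question of which city realizes better returns for sport vehicles.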
In our implementation, we use Pentaho Business Analytics [23] as an OLAP engine for handling the various OLAP operators specified in our analysis graph. The first task is to load the XML file storing our analysis graph, as depicted in Figure 6. When the defined event (i.e. the end of the month of January, with the measure thresholds exceeded) is detected and the relative condition (Year = 2013) is satisfied, our prototype automatically triggers the sequence of OLAP operators over the cube VEHICLE RENTALS and generates the queries as MDX instructions, as shown in the top right part of Figure 7. In the left side of this screenshot, the user can consult the Pentaho analytical view of the cube VEHICLE RENTALS to check the outputs of the resulting MDX queries.

Fig. 7. Active OLAP queries generation.

V. EVENT PROCESSING


One of the basic functionalities of an ADW is to detect events. Thus, the more types of events an ADW can detect, the greater the variety of real-life business situations it is able to respond and react to. Looking at existing ADW approaches, the studied events are fairly simple, reflecting primitive analysis scenarios. In the work of Thalhammer et al. [17], [18], the authors use an event model with three kinds of events: (1) OLTP methods, (2) relative temporal events and (3) calendar events. OLTP method events describe basic happenings in the data warehouse's sources. Relative temporal events (e.g., 2 weeks after the change in price of an item) are used to define a temporal distance between such a basic happening and the carrying out of an analysis rule. Calendar events represent fixed points in time at which an analysis rule may be carried out. They generalize absolute temporal events (e.g., 15 October 2007) and periodic temporal events (e.g., the end of each quarter). Absolute and periodic temporal events are globally available, whereas relative temporal events and OLTP method events are available locally to the dimension level for which they occurred. Also, in the work of Thalhammer et al., the event model of an ADW is considered more primitive than the event model of active database systems.
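The three event kinds of this model can be sketched as plain data types. The class and field names below are ours, chosen only to mirror the description; they are not the types of Thalhammer et al.'s system.

```python
from dataclasses import dataclass


@dataclass
class OLTPMethodEvent:
    """Basic happening in a data warehouse source (e.g. a price change)."""
    method: str


@dataclass
class RelativeTemporalEvent:
    """Temporal distance from a basic happening to carrying out a rule."""
    base: OLTPMethodEvent
    delay_days: int          # e.g. 2 weeks after the base happening


@dataclass
class CalendarEvent:
    """Fixed point in time, absolute or periodic."""
    when: str                # e.g. "15 October 2007" or "EndOfQuarter"


price_change = OLTPMethodEvent("change_price")
followup = RelativeTemporalEvent(price_change, delay_days=14)  # 2 weeks later
quarter_end = CalendarEvent("EndOfQuarter")                    # periodic
print(followup.delay_days, quarter_end.when)
```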

VI. CONCLUSION AND FUTURE DIRECTIONS


In Bouattour et al. [20], an ADW is based on the concept of ECG (Event-Condition-Graph of analysis). By specifying an ECG, implicit knowledge about decision procedures is transformed into explicit scenarios. More generally, analysis scenarios can also be used to construct reusable patterns of analysis scenarios. For these patterns, we are faced with two significant problems. The first issue addresses the definition of a catalog data model for patterns of analysis scenarios. This model will be used to describe analysis scenarios triggered by events. The second issue is the proposal of an approach for the construction and reuse of patterns of analysis scenarios. Through this approach, we seek to guide analysts during the OLAP analysis steps of an organization's business processes. The approach should take into account the different decision tasks (routine, semi-routine and non-routine) defined by Thalhammer et al. [17]. This requires the modeling of analysis scenarios and their automatic triggering within the ADW when the event associated with a scenario occurs. An automatic or semi-automatic mechanism for executing analysis scenarios is to be offered.

More generally, the generalization of analysis scenarios is also an interesting new track to be explored in order to build patterns of analysis scenarios. In this context, a process of construction ("design for reuse") and a process of reuse ("design by reuse") of analysis scenarios will also be new contributions to the field of active data processing. Conventionally, decision-makers carry out analysis sessions using OLAP languages such as MDX queries on data cubes in order to find solutions to different decision tasks. An OLAP session is an ordered sequence of correlated queries formulated by a user on a schema; typically (but not necessarily), each query in a session is derived from the previous one by applying an OLAP operator [24]. Our current investigations focus on how to build patterns of OLAP analysis scenarios from OLAP queries previously issued by expert users. Our goal is to supply such patterns as mechanisms for the automation of analysis tasks over data cubes, especially for users without a high degree of technical expertise. As a first step, we are currently conducting a comparative study of OLAP query similarity functions in order to cluster the OLAP log queries that match an event occurrence, preparing them to be analyzed by a data mining algorithm able to derive a set of association rules corresponding to the most relevant user query patterns discovered. Secondly, we plan to adapt the algorithm proposed in [25] to patternize the clustered queries and build active rules.

REFERENCES
[1] W.H. Inmon. Building the Data Warehouse. John Wiley & Sons, 1996.
[2] J.T.S. Ribeiro and A.J.M.M. Weijters. LNCS 7044, pp. 274-283, 2011.
[3] A. Giacometti, P. Marcel, and E. Negre. Recommending multidimensional queries. In Data Warehousing and Knowledge Discovery, 11th International Conference, DaWaK 2009, Linz, Austria, August 31 - September 2, 2009, Proceedings, pages 453-466, 2009.
[4] A. Giacometti, P. Marcel, E. Negre, and A. Soulet. Query recommendations for OLAP discovery driven analysis. In DOLAP 2009, ACM 12th International Workshop on Data Warehousing and OLAP, Hong Kong, China, November 6, 2009, Proceedings, pages 81-88, 2009.
[5] J. Aligon, M. Golfarelli, P. Marcel, S. Rizzi, and E. Turricchia. Mining preferences from OLAP query logs for proactive personalization. In ADBIS, volume 6909 of Lecture Notes in Computer Science, pages 84-97. Springer, 2011.
[6] S. Bouattour, R. Ben Messaoud, O. Boussaid, H. Ben-Abdallah, and J. Feki. A framework for active data warehouses. In International Arab Conference on Information Technology (ACIT'08), Hammamet, Tunisia, 2008.
[7] T. Thalhammer, M. Schrefl, and M. Mohania. Data warehouses: complementing OLAP with active rules. Data and Knowledge Engineering, 39(3):241-269, 2001.
[8] M-M. Joselito, Xiaoou L., Marco A.M. Benítez, R. Aurora Pérez, M-A. Oscar, José Ramón Corona-Armenta, and Jaime Garnica-González. A simulator for active database systems. Ciencia Universitaria, número 1, enero/junio 2010.
[9] Oracle Rules Manager, 2006. http://download1uk.oracle.com/docs/cd/B19306_01/appdev.102/b14288/exprn_part1.htm#CHDBFICB
[10] Comshare Decision. http://wwwdim.uqac.ca/~jrouette/8ASY208/SIAD-TblBord.pdf
[11] T.M. Nguyen and A.M. Tjoa. Zero-latency data warehousing for heterogeneous data sources and continuous data streams. In Proceedings of the 5th International Conference on Information and Web-based Applications & Services (iiWAS'2003), Jakarta, Indonesia, pages 55-64. Austrian Computer Society (OCG), September 2003.
[12] R. Salem, O. Boussaid, and J. Darmont. An Active XML-based framework for integrating complex data. In SAC'12, March 26-30, 2012, Riva del Garda, Italy. ACM, 2012.
[13] F. Yan. The research of active data warehouse based on multi-agent. IEEE, 2012.
[14] Z. Lin, D. Zhang, C. Lin, Y. Lai, and Q. Zou. Performance optimization of analysis rules in real-time active data warehouses. In APWeb 2012, LNCS 7235, pages 669-676. Springer-Verlag, 2012.
[15] K. Ezekiel and F. Marir. A conceptual model for managing knowledge represented as triggers in active databases. In 4th European Conference on Knowledge Management, pages 323-333, 2003.
[16] M. Zwick, L. Christian, and H. Christian. Implementing automated analyses in an active data warehouse environment using workflow technology. In TEAA 2006, pages 341-354, 2006.
[17] T. Thalhammer, M. Schrefl, and M.K. Mohania. Active data warehouses: complementing OLAP with analysis rules. Data & Knowledge Engineering, 39(3):241-269, 2001.
[18] T. Thalhammer and M. Schrefl. Realizing active data warehouses with off-the-shelf database technology. Software: Practice and Experience, 32(12):1193-1222, 2002.
[19] S. Bouattour, R. Ben Messaoud, and O. Boussaid. Modélisation de règles d'analyse dédiées aux entrepôts de données actifs. In Deuxième Atelier des Systèmes Décisionnels (ASD 2007), Sousse, Tunisie, October 2007.
[20] S. Bouattour, O. Boussaid, H. Ben-Abdallah, and J. Feki. Modélisation et analyse dans les entrepôts de données actifs. Technique et Science Informatiques (TSI), 30(8):975-994, 2009.
[21] R. Ben Messaoud. Couplage de l'analyse en ligne et de la fouille de données pour l'exploration, l'agrégation et l'explication des données complexes. PhD thesis, Université Lyon 2, France, November 2006.
[22] V. Olegas and S. Aidas. In International Conference on Computer Systems and Technologies (CompSysTech), 2006.
[23] Pentaho Analysis Services: Mondrian Project. Pentaho Corporation, 2007. http://mondrian.pentaho.org
[24] J. Aligon, M. Golfarelli, P. Marcel, S. Rizzi, and E. Turricchia. Similarity measures for OLAP sessions, 2012.
[25] E. Negre, F. Ravat, O. Teste, and R. Tournier. Cold-start recommender system problem within a multidimensional data warehouse.
