
https://sol.du.ac.in/mod/book/view.php?id=803&chapterid=449

Enterprise Resource Management

Study Material-1

UNIT 2
ERP AND RELATED TECHNOLOGIES


--Saurabh Shukla


Objective
In this unit, we shall cover the following topics in detail:
a) Concept of Business Process Reengineering
b) Role of information technology & impact of BPR on organizational performance
c) Tools to support BPR & benefits to business organizations
d) Meaning of Management Information Systems (MIS) & risks associated with MIS
e) MIS reviews
f) Decision Support System (DSS) and its applications
g) Taxonomies & history of DSS
h) Architecture of DSS
i) Characteristics and capabilities of DSS
j) Meaning & scope of Executive Information System
k) Contents of EIS
l) Characteristics of successful EIS implementations
m) Information sharing vs information hoarding
n) EIS design, prototyping & evaluation
o) Advantages and disadvantages of EIS
p) Data warehousing and its applications
q) Data warehouse design and creation
r) Multi-dimensional analysis tools
s) History of data warehousing
t) Advantages of data warehousing & its limitations
u) Concept of data mining and its applications
v) Technological infrastructure required for data mining
w) Meaning of OLAP, MOLAP, HOLAP and their advantages

Business Process Reengineering


Davenport & Short (1990) define business process as "a set of logically
related tasks performed to achieve a defined business outcome." A
process is "a structured, measured set of activities designed to
produce a specified output for a particular customer or market. It
implies a strong emphasis on how work is done within an organization"
(Davenport 1993). In their view processes have two important
characteristics: (i) They have customers (internal or external), (ii)
They cross organizational boundaries, i.e., they occur across or
between organizational subunits. One technique for identifying
business processes in an organization is the value chain method
proposed by Porter and Millar (1985).
Processes are generally identified in terms of beginning and end
points, interfaces, and organization units involved, particularly the
customer unit. High Impact processes should have process owners.
Examples of processes include: developing a new product; ordering
goods from a supplier; creating a marketing plan; processing and
paying an insurance claim; etc.
Business process reengineering (often referred to by the
acronym BPR) is the main way in which organizations become
more efficient and modernize. Business process reengineering
transforms an organization in ways that directly affect
performance.
Business process reengineering (BPR) is the analysis and redesign
of workflow within and between enterprises. BPR reached its heyday
in the early 1990s when Michael Hammer and James Champy published
their best-selling book, "Reengineering the Corporation". The authors
promoted the idea that sometimes radical
redesign and reorganization of an enterprise (wiping the slate clean)
was necessary to lower costs and increase quality of service and that
information technology was the key enabler for that radical
change. Hammer and Champy felt that the design of workflow in
most large corporations was based on assumptions about technology,
people, and organizational goals that were no longer valid. They
suggested seven principles of reengineering to streamline the work
process and thereby achieve significant levels of improvement in
quality, time management, and cost:
1. Organize around outcomes, not tasks.
2. Identify all the processes in an organization and prioritize them in
order of redesign urgency.
3. Integrate information processing work into the real work that
produces the information.
4. Treat geographically dispersed resources as though they were
centralized.
5. Link parallel activities in the workflow instead of just integrating
their results.
6. Put the decision point where the work is performed, and build
control into the process.
7. Capture information once and at the source.
Role of information technology
Information technology (IT) has historically played an important role
in the reengineering concept. It is considered by some as a major
enabler for new forms of working and collaborating within an
organization and across organizational borders.
The early BPR literature, e.g. Hammer & Champy (1993),
identified several so-called disruptive technologies that were
supposed to challenge traditional wisdom about how work should be
performed.
1. Shared databases, making information available at many places
2. Expert systems, allowing generalists to perform specialist tasks
3. Telecommunication networks, allowing organizations to be
centralized and decentralized at the same time
4. Decision-support tools, allowing decision-making to be a part of
everybody's job
5. Wireless data communication and portable computers, allowing
field personnel to work independently of the office
6. Interactive videodisk, to get in immediate contact with potential
buyers
7. Automatic identification and tracking, allowing things to report where
they are instead of having to be found
8. High performance computing, allowing on-the-fly planning and
revisioning
In the mid-1990s, workflow management systems in particular were
considered a significant contributor to improved process efficiency.
ERP (Enterprise Resource Planning) vendors, such as SAP, also
positioned their solutions as vehicles for business process redesign and
improvement.
Impact of BPR on organizational performance
The two cornerstones of any organization are the people and the
processes. If individuals are motivated and working hard, yet the
business processes are cumbersome and non-essential activities
remain, organizational performance will be poor. Business Process
Reengineering is the key to transforming how people work. What
appear to be minor changes in processes can have dramatic effects on
cash flow, service delivery and customer satisfaction. Even the act of
documenting business processes alone will typically improve
organizational efficiency by 10%.
Tips for Implementation of BPR project
The best way to map and improve the organization's procedures is to
take a top down approach, and not undertake a project in isolation.
That means:
• Starting with mission statements that define the purpose of the
organization and describe what sets it apart from others in its sector
or industry.
• Producing vision statements which define where the organization is
going, to provide a clear picture of the desired future position.
• Building these into a clear business strategy, thereby deriving the
project objectives.
• Defining behaviours that will enable the organization to achieve its
aims.
• Producing key performance measures to track progress.
• Relating efficiency improvements to the culture of the organization
• Identifying initiatives that will improve performance.
Once these building blocks are in place, the BPR exercise can begin.
Methodology
Although the labels and steps differ slightly, the early methodologies
that were rooted in IT-centric BPR solutions share many of the same
basic principles and elements. The following outline is one such
model, based on the PRLC (Process Reengineering Life Cycle)
approach. A simplified schematic outline of the approach, exemplified
for pharmaceutical R&D, follows:
1. Envision new processes
   1. Secure management support
   2. Identify reengineering opportunities
   3. Identify enabling technologies
   4. Align with corporate strategy
2. Initiating change
   1. Set up reengineering team
   2. Outline performance goals
3. Process diagnosis
   1. Describe existing processes
   2. Uncover pathologies in existing processes
4. Process redesign
   1. Develop alternative process scenarios
   2. Develop new process design
   3. Design HR architecture
   4. Select IT platform
   5. Develop overall blueprint and gather feedback
5. Reconstruction
   1. Develop/install IT solution
   2. Establish process changes
6. Process monitoring
   1. Performance measurement, including time, quality, cost, IT performance
   2. Link to continuous improvement
Loop-back to diagnosis
Benefiting from lessons learned from the early adopters, some BPR x
practitioners advocated a change in emphasis to a customer-centric, Please Ask
as opposed to an IT-centric, methodology. One such methodology, that ( कृपया पूछ )

5 of 57 22-08-2018, 16:02
Study Material-1: UNIT 2 ERP AND RELATED TECHNOLOGIES https://sol.du.ac.in/mod/book/view.php?id=803&chapterid=449
also incorporated a Risk and Impact Assessment to account for the
impact that BPR can have on jobs and operations, was described by
Lon Roberts (1994). Roberts also stressed the use of change
management tools to proactively address resistance to change—a
factor linked to the demise of many reengineering initiatives that
looked good on the drawing board.
Within the management consulting industry as well, a significant number
of methodological approaches have been developed.
Tools to support BPR
When a BPR project is undertaken across the organization, it can
require managing a massive amount of information about the
processes, data and systems. If you don't have an excellent tool to
support BPR, the management of this information can become an
impossible task. The use of a good BPR/documentation tool is vital in
any BPR project.
The types of attributes you should look for in BPR software are:
• Graphical interface for fast documentation
• "Object oriented" technology, so that changes to data (eg: job titles)
only need to be made in one place, and the change automatically
appears throughout all the organization's procedures and
documentation.
• Drag and drop facility so you can easily relate organizational and
data objects to each step in the process
• Customizable metadata fields, so that you can include information
relating to your industry, business sector or organization in your
documentation
• Analysis, such as swim-lanes to show visually how responsibilities in
a process are transferred between different roles, or where data
items or computer applications are used.
• Support for Value Stream mapping.
• CRUD or RACI reports, to provide evidence for process improvement.
• The ability to assess the processes against agreed international
standards
• Simulation software to support 'what-if' analyses during the design
phase of the project to develop LEAN processes
• The production of Word documents or website versions of the
procedures at the touch of a single button, so that the information
can be easily maintained and updated.
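To make the "object-oriented" attribute above concrete, here is a minimal Python sketch of a single shared definition referenced by every process step, so that one change propagates to all documentation; the class and field names are illustrative assumptions, not taken from any particular BPR tool:

# Minimal sketch of the "object-oriented" attribute described above:
# a role (e.g. a job title) is defined once and referenced by every
# process step, so renaming it updates all documentation at once.
class Role:
    def __init__(self, title):
        self.title = title

class ProcessStep:
    def __init__(self, description, role):
        self.description = description
        self.role = role            # a reference, not a copy of the title

    def document(self):
        return f"{self.description} -- responsible: {self.role.title}"

clerk = Role("Claims Clerk")
steps = [ProcessStep("Register incoming claim", clerk),
         ProcessStep("Check policy coverage", clerk)]

clerk.title = "Claims Officer"      # one change...
for step in steps:                  # ...appears in every procedure document
    print(step.document())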
The software we use by choice is Protos, a very comprehensive Dutch
system that has been translated into English. Protos meets all the
above requirements, and many more, and is better than any
English-language system we have seen.
Benefits to Business organization
BPR, if implemented properly, can give huge returns. BPR has helped
giants like Procter and Gamble Corporation and General Motors
Corporation succeed after financial setbacks due to competition. It
helped American Airlines get somewhat back on track from the bad
debt that was haunting their business practice. The key to BPR is
proper implementation.
General Motors Corporation implemented a 3-year plan to consolidate
their multiple desktop systems into one. It is known internally as
"Consistent Office Environment" (Booker). This reengineering process
involved replacing the numerous brands of desktop systems, network
operating systems and application development tools into a more
manageable number of vendors and technology platforms. According
to Donald G. Hedeen, director of desktops and deployment at GM and
manager of the upgrade program, the process "lays the
foundation for the implementation of a common business
communication strategy across General Motors" (Booker). Lotus
Development Corporation and Hewlett-Packard Development
Company, formerly Compaq Computer Corporation, received the single
largest non-government sales ever from General Motors Corporation.
GM also planned to use Novell NetWare as a security client, Microsoft
Office and Hewlett-Packard printers. According to Donald G. Hedeen,
this saved GM 10% to 25% on support costs, 3% to 5% on hardware, 40%
to 60% on software licensing fees, and increased efficiency by
overcoming incompatibility issues by using just one platform across
the entire company.
Southwest Airlines offers another successful example of reengineering
their company and using Information Technology the way it was meant
to be implemented. In 1992, Southwest Airlines had a revenue of $1.7
billion and an after-tax profit of $91 million. American Airlines, the
largest U.S. carrier, on the other hand, had a revenue of $14.4 billion
but lost $475 million and had not made a profit since 1989
(Furey and Diorio, 1994). Companies like Southwest Airlines know
that their formula for success is easy to copy by new start-ups like
Morris, Reno, and Kiwi Airlines. In order to stay in the game of
competitive advantage, they have to continuously reengineer their
strategy. BPR helps them be original.
Michael Dell is the founder and CEO of DELL Incorporated, which has
been in business since 1983 and has been the world's fastest growing
major PC Company. Michael Dell's idea of a successful business is to
keep the smallest inventory possible by having a direct link with the
manufacturer. When a customer places an order, the custom parts
requested by the customer are automatically sent to the
manufacturer for shipment. This reduces the cost for inventory
tracking and massive warehouse maintenance. Dell's website is noted
for bringing in nearly "$10 million each day in sales."(Smith, 1999).
Michael Dell mentions: "If you have a good strategy with sound
economics, the real challenge is to get people excited about what
you're doing. A lot of businesses get off track because they don't
communicate an excitement about being part of a winning team that
can achieve big goals. If a company can't motivate its people and it
doesn't have a clear compass, it will drift." (Smith, 1999) Dell's stock
has been ranked as the top stock for the decade of the 1990s, when
it had a return of 57,282% (Knestout and Ramage, 1999). Michael
Dell is now concentrating more on customer service than selling
computers since the PC market price has pretty much equalized.
Michael Dell notes: "The new frontier in our industry is service, which
is a much greater differentiator when price has been equalized. In our
industry, there's been a pretty huge gap between what customers
want in service and what they can get, so they've come to expect
mediocre service. We may be the best in this area, but we can still
improve quite a bit—in the quality of the product, the availability of
parts, service and delivery time." (Smith, 1999) Michael Dell
understands the concept of BPR and really recognizes where and when
to reengineer his business.
Ford reengineered their business and manufacturing process from just
manufacturing cars to manufacturing quality cars, where the number
one goal is quality. This helped Ford save millions on recalls and
warranty repairs. Ford has accomplished this goal by incorporating
barcodes on all their parts and scanners to scan for any missing parts
in a completed car coming off the assembly line. This helped them
guarantee a safe, quality car. They have also implemented Voice-
over-IP (VoIP) to reduce the cost of holding meetings between
branches.
A multi-billion dollar corporation like Procter and Gamble, which
carries 300 brands and counting, has a strong grasp of re-engineering.
Procter and Gamble's chief technology officer, G. Gil Cloyd, explains
how a company which carries multiple brands has to contend with the
"classic innovator's dilemma — most innovations fail, but companies
that don't innovate die. His solution, innovating innovation..."
(Teresko, 2004). Cloyd has helped
a company like Procter and Gamble grow to $5.1 billion by the fiscal
year of 2004. According to Cloyd's scorecard, he was able to raise the
volume by 17%, the organic volume by 10%, sales are at $51.4 billion
up by 19%, with organic sales up 8%, earnings are at $6.5 billion up
25% and share earnings up 25%. Procter and Gamble also has a free
cash flow of $7.3 billion or 113% of earnings, dividends up 13%
annually with a total shareholder return of 24%. Cloyd states: "The
challenge we face is the competitive need for a very rapid pace of
innovation. In the consumer products world, we estimate that the
required pace of innovation has doubled in the last three years. Digital
technology is very important in helping us to learn faster." (Teresko,
2004) G. Gil Cloyd also predicts, in the near future, "as much as 90% of
P&G's R&D will be done in a virtual world with the remainder being
physical validation of results and options." (Teresko, 2004).
Management Information Systems (MIS)
A management information system (MIS) is a system or process that
provides the information necessary to manage an organization
effectively. MIS and the information it generates are generally
considered essential components of  prudent and reasonable business
decisions.
The importance of maintaining a consistent approach to the
development, use, and review of MIS systems within the institution
must be an ongoing concern of both bank management and OCC
examiners. MIS should have a clearly defined framework of guidelines,
policies or practices, standards, and procedures for the organization.
These should be followed throughout the institution in the
development, maintenance, and use of all MIS.
MIS is viewed and used at many levels by management. It should be
supportive of the institution's longer term strategic goals and
objectives. At the other extreme, it also encompasses the everyday
financial accounting systems that are used to ensure basic control is
maintained over financial recordkeeping activities.
 
Financial accounting systems and subsystems are just one type of
institutional
MIS. Financial accounting systems are an important functional element
or part of the total MIS structure. However, they are more narrowly
focused on the internal balancing of an institution's books to the
general ledger and other financial accounting subsystems. For
example, accrual adjustments, reconciling and correcting entries used
to reconcile the financial systems to the general ledger are not always
immediately entered into other MIS systems.
Accordingly, although MIS and accounting reconcilement totals for
related listings and activities should be similar, they may not
necessarily balance. An institution's MIS should be designed to achieve
the following goals:
a) Enhance communication among employees.
b) Deliver complex material throughout the institution.
c) Provide an objective system for recording and aggregating
information.
d) Reduce expenses related to labor-intensive manual
activities.
e) Support the organization's strategic goals and direction.
Because MIS supplies decision makers with facts, it supports and
enhances the overall decision making process. MIS also enhances job
performance throughout an institution. At the most senior levels, it
provides the data and information to help the board and management
make strategic decisions. At other levels, MIS provides the means
through which the institution's activities are monitored and
information is distributed to management, employees, and customers.
Effective MIS should ensure the appropriate presentation formats and
time frames required by operations and senior management are met.
MIS can be maintained and developed by either manual or automated
systems or a combination of both. It should always be sufficient to
meet an institution's unique business goals and objectives. The
effective delivery of an institution's products and services is
supported by MIS. These systems should be accessible and useable
at all appropriate levels of the organization.
MIS is a critical component of the institution's overall risk management
strategy. MIS supports management's ability to perform such reviews.
MIS should be used to recognize, monitor, measure, limit, and manage
risks. Risk management involves four main elements:
a) Policies or practices.
b) Operational processes.
c) Staff and management.
d) Feedback devices.
Frequently, operational processes and feedback devices are
intertwined and cannot easily be viewed separately. The most
efficient and useable MIS should be both operational and
informational. As such, management can use MIS to measure
performance, manage resources, and help an institution comply with
regulatory requirements. One example of this would be the managing
and reporting of loans to insiders. MIS can also be used by
management to provide feedback on the effectiveness of risk controls.
Controls are developed to support the proper management of risk
through the institution's policies or practices, operational processes,
and the assignment of duties and responsibilities to staff and
managers.
Definition: "Management Information Systems (MIS) is a
general name for the academic discipline covering the
application of people, technologies, and procedures —
collectively called information systems — to solve business
problems. MIS are distinct from regular information systems in
that they are used to analyze other information systems
applied in operational activities in the organization.
Academically, the term is commonly used to refer to the group
of information management methods tied to the automation or
support of human decision making, e.g. Decision Support
Systems, Expert systems, and Executive information systems."
It includes manual and automated systems designed to provide
management with timely and relevant information that is necessary to
successfully manage the business or department.
Risks Associated With MIS
Risk reflects the potential, the likelihood, or the expectation of
events that could adversely affect earnings or capital. Management
uses MIS to help in the assessment of risk within an institution.
Management decisions based upon ineffective, inaccurate, or
incomplete MIS may increase risk in a number of areas such as credit
quality, liquidity, market/pricing, interest rate, or foreign currency. A
flawed MIS causes operational risks and can adversely affect an
organization's monitoring of its fiduciary, consumer, fair lending, Bank
Secrecy Act, or other compliance-related activities.
Since management requires information to assess and monitor
performance at all levels of the organization, MIS risk can extend to
all levels of the operations. Additionally, poorly programmed or non-
secure systems in which data can be manipulated and/or systems
requiring ongoing repairs can easily disrupt routine work flow and can
lead to incorrect decisions or impaired planning.
Assessing Vulnerability To MIS Risk
To function effectively as an interacting, interrelated, and
interdependent feedback tool for management and staff, MIS must be
"useable." The five elements of a useable MIS system are: timeliness,
accuracy, consistency, completeness, and relevance. The usefulness of
MIS is hindered whenever one or more of these elements is
compromised.

Timeliness
To simplify prompt decision making, an institution's MIS should be
capable of providing and distributing   current information to
appropriate users. Information systems should be designed to expedite
reporting of information. The system should be able to quickly collect
and edit data, summarize results, and be able to adjust and correct
errors promptly.
 
Accuracy
A sound system of automated and manual internal controls must exist
throughout all information systems processing activities. Information
should receive appropriate editing, balancing, and internal control
checks. A comprehensive internal and external audit program should
be employed to ensure the adequacy of internal controls.
 
Consistency
To be reliable, data should be processed and compiled consistently
and uniformly. Variations in how data is collected and reported can
distort information and trend analysis. In addition, because data
collection and reporting processes will change over time, management
must establish sound procedures to allow for systems changes. These
procedures should be well defined and documented, clearly
communicated to appropriate employees, and should include an
effective monitoring system.
 
Completeness
Decision makers need complete and pertinent information in a
summarized form. Reports should be designed to eliminate clutter and
voluminous detail, thereby avoiding "information overload."
 
Relevance
Information provided to management must be relevant. Information
that is inappropriate, unnecessary, or too detailed for effective
decision making has no value. MIS must be appropriate to support the
management level using it. The relevance and level of detail provided
through MIS systems directly correlate to what is needed by the board
of directors, executive management, departmental or area mid-level
managers, etc. in the performance of their jobs.
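To make these five usability elements more concrete, the following minimal Python sketch shows how timeliness, completeness, and one simple accuracy check might be automated for an MIS report feed; the field names and thresholds are assumptions for illustration, not requirements from the text:

# Illustrative sketch only: simple automated checks for three of the five
# usability elements (timeliness, completeness, accuracy) on a report feed.
from datetime import datetime, timedelta

REQUIRED_FIELDS = {"account_id", "balance", "as_of_date"}

def check_report(rows, max_age_days=1):
    issues = []
    now = datetime.now()
    for i, row in enumerate(rows):
        missing = REQUIRED_FIELDS - row.keys()
        if missing:                                   # completeness
            issues.append((i, f"missing fields: {missing}"))
            continue
        if now - row["as_of_date"] > timedelta(days=max_age_days):
            issues.append((i, "data older than reporting window"))   # timeliness
        if row["balance"] < 0 and not row.get("overdraft_allowed"):
            issues.append((i, "negative balance fails accuracy check"))
    return issues

rows = [{"account_id": "A1", "balance": 120.0, "as_of_date": datetime.now()},
        {"account_id": "A2", "balance": -50.0, "as_of_date": datetime.now()}]
print(check_report(rows))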
 
Achieving Sound MIS
The development of sound MIS is the result of the development and
enforcement of a culture of system ownership. An "owner" is a system
user who knows current customer and constituent needs and also has
budget authority to fund new projects. Building "ownership" promotes
pride in institution processes and helps ensure accountability.
Although MIS does not necessarily reduce expenses, the development
of meaningful systems, and their proper use, will lessen the
probability that erroneous decisions will be made because of
inaccurate or untimely information. Erroneous decisions invariably
misallocate and/or waste resources. This may result in an adverse
impact on earnings and/or capital. MIS which meets the five elements
of useability is a critical ingredient to an institution's short- and long-
range planning efforts. To achieve sound MIS, the organization's
planning process should include consideration of MIS needs at both the
tactical and strategic levels. For example, at a tactical level MIS
systems and report output should support the annual operating plan
and budgetary processes. They should also be used in support of the
long term strategic MIS and business planning initiatives. Without the
development of an effective MIS, it is more difficult for management
to measure and monitor the success of new initiatives and the
progress of ongoing projects. Two common examples of this would be
the management of mergers and acquisitions or the continuing
development and the introduction of new products and services.
Management needs to ensure that MIS systems are developed
according to a sound methodology that encompasses the following
phases:
a) Appropriate analysis of system alternatives, approval points
as the system is developed or acquired, and task organization.
b) Program development and negotiation of contracts with
equipment and software vendors.
c) Development of user instructions, training, and testing of
the system.
d) Installation and maintenance of the system.
Management should also consider use of "project management
techniques" to monitor progress as the MIS system is being developed.
Internal controls must be woven into the processes and periodically
reviewed by auditors.
Management also should ensure that managers and staff receive initial
and ongoing training in MIS. In addition, user manuals should be
available and provide the following information:
i. A brief description of the application or system.
ii. Input instructions, including collection points and
times to send updated information.
iii. Balancing and reconciliation procedures.
iv. A complete listing of output reports, including samples.
Depending on the size and complexity of its MIS system, an institution
may need to use different manuals for different users such as first-
level users, unit managers, and programmers.

MIS Reviews
By its very nature, management information is designed to meet the
unique needs of individual institutions. As a result, MIS requirements
will vary depending on the size and complexity of the operations. For
example, systems suitable for community sized institutions will not
necessarily be adequate for larger institutions. However, basic
information needs or requirements are similar in all financial
institutions regardless of size. The complexity of the operations
and/or activities, together with institution size, point to the need for
MIS of varying degrees of complexity to support the decision-making
processes. Examiners should base MIS reviews on an evaluation of
whether the system(s) provide management and directors with the
information necessary to guide operations, support timely decision
making, and help management monitor progress toward reaching
institutional goals and objectives. Although examiners should
 encourage management to develop sound information systems, they
also should be reasonable in their expectations about what constitutes
suitable MIS.
Examiner MIS reviews are normally focused on a specific area of
activity, on a clearly identifiable departmental or functional basis, or
as a part of the activity being examined within a larger department.
During the examination, the MIS review should occur at both a macro
(big picture) level and also at the micro (functional/product oriented
view of the business) level. The examiner-in-charge of the MIS-review
program should look at the useability and effectiveness of the
corporate-wide MIS structure.
The examiner should also collect MIS related observations and
information from the examiners-in-charge of the other areas under
review. It would be very difficult for one examiner to attempt to
perform a detailed MIS review for all of an organization's functional
and operational areas of activity. It is practical and reasonable,
however, to have this lead examiner coordinate and consolidate the
MIS reviews from the other examination areas. The MIS related
feedback received from other area examiners provides important and
practical input to the MIS review examiner. The consolidation,
coordination, and analysis of this MIS feedback can be used to reach
supportable macrolevel conclusions and recommendations for
corporate-wide MIS activities. MIS reviews in the functional or product
review areas generally should be performed by an examiner who is
considered to be a subject matter expert (SME) in the area of
activities or operations that are being supported by the MIS systems or
processes under review. The SME must have a thorough and complete
understanding of the baseline "business" supported by the MIS
system(s) under review. A solid understanding of the business is
fundamental to the completion of a meaningful MIS review. The
decision regarding the overall quality and effectiveness of MIS
generally should be made by the SME for the area under review. The
SME for each area where MIS is under review must subsequently
communicate MIS related findings, conclusions, and opinions to the
examiner charged with the responsibility for the complete MIS review
work program at that examination. This is clearly a collaborative
effort among area SMEs and the examiner charged with the
responsibility for this area of review.
The examiner coordinating the overall MIS review program should be a
commercial examiner with broad experience and understanding which
covers many areas of organizational operations and activity.
Alternatively, a bank information systems (BIS) examiner could serve
in this capacity. BIS examiners should be consulted whenever there
are questions, issues, or concerns surrounding the use of information
systems (IS) or electronic data processing (EDP) technology or the
effectiveness of MIS-related internal controls in any automated area
of the organization's activities.
When performing MIS reviews, examiners should use the guidelines in
this booklet to determine if management has:
i. Identified the institution's specific
information requirements— Examiners can focus on
specific information needs related to issues such as asset
quality, interest rate risk, regulatory reporting, and
compliance. If possible, the MIS review should be concurrent
with examinations of the commercial, consumer, fiduciary, and
BIS activities. This would enhance interaction and
communication among examiners.
ii. Established effective reporting mechanisms
to guide decisions—   This process includes reviewing
controls that ensure that information is reliable, timely,
accurate, and confidential.
Decision Support System
Decision Support Systems (DSS) are a specific class of computerized
information system that supports business and organizational decision-
making activities. A properly designed DSS is an interactive software-
based system intended to help decision makers compile useful
information from raw data, documents, personal knowledge, and/or
business models to identify and solve problems and make decisions.
Typical information that a decision support application might gather
and present would be:
1. Accessing all of your current information assets, including
legacy and relational data sources, cubes, data warehouses,
and data marts
2. Comparative sales figures between one week and the next
3. Projected revenue figures based on new product sales assumptions
4. The consequences of different decision alternatives, given
past experience in a context that is described
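As a worked illustration of items 2 and 3 above (a week-over-week sales comparison and a projected revenue figure), here is a minimal Python sketch; the regional figures, the 4,000-unit forecast, and the unit price are invented for illustration:

# Week-over-week sales comparison followed by a simple revenue projection.
last_week_sales = {"north": 120_000, "south": 95_000, "west": 143_000}
this_week_sales = {"north": 131_000, "south": 90_000, "west": 150_000}

for region in this_week_sales:
    change = this_week_sales[region] - last_week_sales[region]
    pct = 100.0 * change / last_week_sales[region]
    print(f"{region}: {change:+,} ({pct:+.1f}%)")

new_product_units = 4_000          # assumed sales forecast for a new product
unit_price = 49.0
projected_revenue = new_product_units * unit_price
print(f"Projected new-product revenue: {projected_revenue:,.0f}")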
Definition: DSS refers to an interactive computerized
system that gathers and presents data from a wide
range of sources, typically for business purposes. DSS
applications are systems and subsystems that help
people make decisions based on data that is culled from
a wide range of sources.
For example, a national online bookseller wants to begin selling
its products internationally but first needs to determine if that will be
a wise business decision. The vendor can use a DSS to gather
information from its own resources (using a tool such as OLAP) to
determine if the company has the ability or potential ability to
expand its business and also from external resources, such as industry
data, to determine if there is indeed a demand to meet. The DSS will
collect and analyze the data and then present it in a way that can be
interpreted by humans. Some decision support systems come very
close to acting as artificial intelligence agents.
DSS applications are not single information resources, such as a
database or a program that graphically represents sales figures, but
the combination of integrated resources working together.
Reporting software such as Information Builders' WebFOCUS is ideally
suited for building decision support systems due to its wide reach of
data, interactive facilities, ad hoc reporting capabilities, quick
development times, and simple Web-based deployment.

 
The best decision support systems include high-level summary reports
or charts and allow the user to drill down for more detailed
information.
Decision support system (DSS) can be defined as a computer program
application that analyzes business data and presents it so that users
can make business decisions more easily. It is an "informational
application" (to distinguish it from an "operational application" that
collects the data in the course of normal business operation). Typical
information that a decision support application might gather and
present would be:
a) Comparative sales figures between one week and the next
b) Projected revenue figures based on new product sales
assumptions
c) The consequences of different decision alternatives, given
past experience in a context that is described
A decision support system may present information graphically and
may include an expert system or artificial intelligence (AI). It may be
aimed at business executives or some other group of knowledge
workers.
Taxonomies of DSS
As with the definition, there is no universally accepted taxonomy of
DSS either. Different authors propose different classifications. Using
the relationship with the user as the
criterion,   Haettenschwiler   differentiates passive, active, and
cooperative DSS. A passive DSS is a system that aids the process of
decision making, but that cannot bring out explicit decision
suggestions or solutions. An active DSS can bring out such decision
suggestions or solutions. A cooperative DSS allows the decision maker
(or its advisor) to modify, complete, or refine the decision suggestions
provided by the system, before sending them back to the system for
validation. The system again improves, completes, and refines the
suggestions of the decision maker and sends them back to her for
validation. The whole process then starts again, until a consolidated
solution is generated.
Using the mode of assistance as the criterion, Power differentiates
communication-driven DSS, data-driven DSS, document-driven DSS,
knowledge-driven DSS, and model-driven DSS.
Model-driven DSS: A model-driven DSS emphasizes access to
and manipulation of a statistical, financial, optimization, or
simulation model. Model-driven DSS use data and parameters provided
by users to assist decision makers in analyzing a situation; they are
not necessarily data intensive. Dicodess is an example of an open
source model-driven DSS generator. Early versions of model-driven DSS
were called model-oriented DSS by Alter (1980), computationally
oriented DSS by Bonczek, Holsapple and Whinston (1981) and later
spreadsheet-oriented and solver-oriented DSS by Holsapple and
Whinston (1996).
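As a minimal illustration of the model-driven idea (user-supplied parameters feeding a quantitative model), here is a small Python sketch of a what-if profit model evaluated across scenarios; the parameter names and figures are invented and are not taken from IFPS, Dicodess, or any other product:

# A tiny what-if model: the user supplies parameters, the system reports outcomes.
def profit_model(units_sold, unit_price, unit_cost, fixed_costs):
    revenue = units_sold * unit_price
    variable_costs = units_sold * unit_cost
    return revenue - variable_costs - fixed_costs

scenarios = {
    "base case":    dict(units_sold=10_000, unit_price=25.0, unit_cost=14.0, fixed_costs=60_000),
    "price cut":    dict(units_sold=13_000, unit_price=22.0, unit_cost=14.0, fixed_costs=60_000),
    "cost squeeze": dict(units_sold=10_000, unit_price=25.0, unit_cost=12.5, fixed_costs=60_000),
}
for name, params in scenarios.items():
    print(name, "->", profit_model(**params))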
The first commercial tool for building model-driven DSS using financial
and quantitative models was called IFPS, an acronym for interactive
financial planning system. It was developed in the late 1970s by
Gerald R. Wagner and his students at the University of Texas. Wagner's
company, EXECUCOM Systems, marketed IFPS until the mid-1990s.
Gray’s Guide to IFPS (1983) promoted the use of the system in
business schools. Another DSS generator for building specific systems
based upon the Analytic Hierarchy Process (Saaty, 1982), called Expert
Choice, was released in 1983. Expert Choice supports personal or
group decision making. Ernest Forman worked closely with Thomas
Saaty to design Expert Choice.
In 1978, Dan Bricklin and Bob Frankston co-invented the software
program VisiCalc (Visible Calculator). VisiCalc provided managers the
opportunity for hands-on computer-based analysis and decision
support at a reasonably low cost.   VisiCalc was the first "killer"
application for personal computers and made possible development of
many model-oriented, personal DSS for use by managers. The history
of microcomputer spreadsheets is described in Power (2000). In 1987,
Frontline Systems founded by Dan Fylstra marketed the first
optimization solver add-in for Microsoft Excel.
Communications-driven DSS: Communications-driven DSS
use network and communications technologies to facilitate decision-
relevant collaboration and communication. In these systems,
communication technologies are the dominant architectural
component. Tools used include groupware, video conferencing and
computer-based bulletin boards (Power, 2002).
In the early 1980s, academic researchers developed a new category of
software to support group decision-making called Group Decision
Support Systems abbreviated GDSS   (cf., Gray, 1981; Huber, 1982;
Turoff and Hiltz, 1982). Mindsight from Execucom Systems,
GroupSystems developed at the University of Arizona and the SAMM
system developed by University of Minnesota researchers were early
Group DSS. Eventually GroupSystems matured into a commercial
product.
Generally, groupware, bulletin boards, audio and videoconferencing
are the primary technologies for communications-driven decision
support. In the past few years, voice and video delivered using the
Internet protocol have greatly expanded the possibilities for
synchronous communications-driven DSS.
Data-driven DSS: A data-driven DSS emphasizes access to and
manipulation of a time-series of internal company data and sometimes
external and real-time data. Simple file systems accessed by query
and retrieval tools provide the most elementary level of functionality.
Data warehouse systems that allow the manipulation of data by
computerized tools tailored to a specific task and setting or by more
general tools and operators provide additional functionality. Data-
Driven DSS with On-line Analytical Processing (cf., Codd et al., 1993)
provide the highest level of functionality and decision support that is
linked to analysis of large collections of historical data. Executive
Information Systems are examples of data-driven DSS (Power, 2002).
Initial examples of these systems were called data-oriented DSS,
Analysis Information Systems (Alter, 1980) and retrieval-only DSS by
Bonczek, Holsapple and Whinston (1981).
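As a minimal sketch of this elementary query-and-retrieval level of functionality, the following Python example summarizes a small time series of internal sales data with SQL; the table, columns, and figures are assumptions for illustration:

# Query and retrieval over a small time series of internal company data.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (month TEXT, region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("2023-01", "north", 120.0), ("2023-01", "south", 95.0),
    ("2023-02", "north", 131.0), ("2023-02", "south", 90.0),
])
# Summarize the series by month: the most elementary level of functionality.
for month, total in conn.execute(
        "SELECT month, SUM(amount) FROM sales GROUP BY month ORDER BY month"):
    print(month, total)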
One of the first data-driven DSS was built using an APL-based software
package called AAIMS, An Analytical Information Management System.
It was developed from 1970-1974 by Richard Klaas and Charles Weiss
at American Airlines (cf. Alter, 1980).
Document-driven DSS: A document-driven DSS uses computer
storage and processing technologies to provide document retrieval and
analysis. Large document databases may include scanned documents,
hypertext documents, images, sounds and video. Examples of
documents that might be accessed by a document-driven DSS are
policies and procedures, product specifications, catalogs, and
corporate historical documents, including minutes of meetings and
correspondence. A search engine is a primary decision-aiding tool
associated with a document-driven DSS (Power, 2002). These systems
have also been called text-oriented DSS (Holsapple and
Whinston, 1996).
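A minimal sketch of the search function at the heart of a document-driven DSS might look like the following Python example; the documents and the crude relevance score are invented for illustration:

# Keyword retrieval over a small collection of policy documents.
documents = {
    "policy-credit.txt":   "Credit approval requires two signatures and a risk review.",
    "policy-travel.txt":   "Travel expenses above the limit require prior approval.",
    "minutes-2004-03.txt": "The board discussed the new approval workflow.",
}

def search(query):
    terms = query.lower().split()
    scored = []
    for name, text in documents.items():
        score = sum(text.lower().count(t) for t in terms)   # crude relevance score
        if score:
            scored.append((score, name))
    return [name for score, name in sorted(scored, reverse=True)]

print(search("approval review"))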
Text and document management emerged in the 1970s and 1980s as
an important, widely used computerized means for representing and
processing pieces of text (Holsapple and Whinston, 1996). The first
scholarly article for this category of DSS was written by Swanson and
Culnan (1978). They reviewed document-based systems for
management planning and control. Until the mid-1990s little progress
was made in helping managers find documents to support their
decision making. Fedorowicz (1993, 1996) helped define the need for
such systems. She estimated in her 1996 article that only 5 to 10
percent of stored business documents are available to managers for
use in decision making. World Wide Web technologies significantly
increased the availability of documents and facilitated the
development of document-driven DSS.
Knowledge-driven DSS: Knowledge-driven DSS can suggest
or recommend actions to managers. These DSS are person-computer
systems with specialized problem-solving expertise. The "expertise"
consists of knowledge about a particular domain, understanding of
problems within that domain, and "skill" at solving some of these
problems (Power, 2002). These systems have been called suggestion
DSS (Alter, 1980) and knowledge-based DSS (Klein & Methlie, 1995).
Goul, Henderson, and Tonge (1992) examined Artificial Intelligence
(AI)  contributions to DSS.
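A minimal Python sketch of the rule-based, suggestion-style reasoning described above follows; the rules and thresholds are invented for illustration and are far simpler than a real expert-system shell:

# A few if-then rules that suggest actions to a manager from stated facts.
RULES = [
    (lambda f: f["days_past_due"] > 90,       "Refer account to collections"),
    (lambda f: f["days_past_due"] > 30,       "Send payment reminder"),
    (lambda f: f["credit_utilization"] > 0.9, "Review credit limit"),
]

def suggest(facts):
    return [action for condition, action in RULES if condition(facts)]

print(suggest({"days_past_due": 45, "credit_utilization": 0.95}))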
In 1965, a Stanford University research team led by Edward
Feigenbaum created the DENDRAL expert system. DENDRAL led to the
development of other rule-based reasoning programs including MYCIN,
which helped physicians diagnose blood diseases based on sets of
clinical symptoms. The MYCIN project resulted in development of the
first expert-system shell (Buchanan and Shortliffe, 1984).
Bonczek, Holsapple and Whinston’s (1981) book created interest in
using these technologies for DSS. In 1983, Dustin Huntington
established EXSYS. That company and product made it practical to use
PC based tools to develop expert systems. By 1992, some 11 shell
programs were available for the Macintosh platform, 29 for IBM-DOS
platforms, 4 for Unix platforms, and 12 for dedicated mainframe
applications (National Research Council, 1999). Artificial Intelligence
systems have been developed to detect fraud and expedite financial
transactions, many additional medical diagnostic systems have been
based on AI, and expert systems have been used for scheduling in
manufacturing operations and in web-based advisory systems. In recent
years, connecting expert systems technologies to relational databases
with web-based front ends has broadened the deployment and use of
knowledge-driven DSS.
Web-based DSS: Power defined a Web-based decision support
system as a computerized system that delivers decision support
information or decision support tools to a manager or business analyst
using a "thin-client" Web browser like Netscape Navigator or Internet
Explorer. The computer server that is hosting the DSS application is
linked to the user's computer by a network with the TCP/IP protocol.
Beginning in approximately 1995, the World-wide Web and global
Internet provided a technology platform for further extending the
capabilities and deployment of computerized decision support. The
release of the HTML 2.0 specifications with form tags and tables was a
turning point in the development of web-based DSS. In 1995, a
number of papers were presented on using the Web and Internet for
decision support at the 3rd International Conference of the
International Society for Decision Support Systems (ISDSS). In addition
to Web-based, model-driven DSS, researchers were reporting Web
access to data warehouses. DSS Research Resources was started as a
web-based collection of bookmarks. By 1995, the World-Wide Web
(Berners-Lee, 1996) was recognized by a number of software
developers and academics as a serious platform for implementing all
types of Decision Support Systems (cf., Bhargava & Power, 2001).
In November 1995, Power, Bhargava and Quek submitted the
Decision Support Systems Research page for inclusion in ISWorld. The
goal was to provide a useful starting point for accessing Web-based
material related to the design, development, evaluation, and
implementation of Decision Support Systems.
In 1996-97, corporate intranets were developed to support
information exchange and knowledge management. The primary
decision support tools included ad hoc query and reporting tools,
optimization and simulation models, online analytical processing
(OLAP), data mining and data visualization. Enterprise-wide DSS using
database technologies were especially popular in Fortune 2000
companies.   Bhargava, Krishnan   and   Müller   (1997) continued to
discuss and experiment with electronic markets for decision
technologies.
In 1999, vendors introduced new Web-based analytical applications.
Many DBMS vendors shifted their focus to Web-based analytical
applications and business intelligence solutions. In 2000, application
service providers (ASPs) began hosting the application software and
technical infrastructure for decision support capabilities. 2000 was
also the year of the portal. More sophisticated "enterprise knowledge
portals" were introduced by vendors that combined information
portals, knowledge management, business intelligence, and
communications-driven DSS in an integrated Web environment.
History of Decision Support Systems
Computerized decision support systems became practical with the
development of minicomputers, timeshare operating systems and
distributed computing. The history of the implementation of such
systems begins in the mid-1960s. In a technology field as diverse as
DSS, chronicling history is neither neat nor linear. Different people
perceive the field of Decision Support Systems from various vantage
points and report different accounts of what happened and what was
important (cf., Arnott & Pervan, 2005; Eom & Lee, 1990b; McCosh &
Correa-Perez, 2006; Power, 2003; Power, 2004a; Silver, 1991). As
technology evolved new computerized decision support applications
were developed and studied. Researchers used multiple frameworks to
help build and understand these systems. Today one can organize the
history of DSS into the five broad DSS categories explained in Power
(2001; 2002; 2004b), including: communications-driven, data-driven,
document driven, knowledge-driven and model-driven decision
support systems.
This section is a starting point in explaining the origins of
the various technology threads that are converging to provide
integrated support for managers working alone, in teams and in
organization hierarchies to manage organizations and make more
rational decisions. History is both a guide to future activity in this
field and a record of the ideas and actions of those who have helped
advance our thinking and practice. Historical facts can be sorted out
and better understood, but more information gathering is necessary.
It is also a starting point in collecting more first-hand accounts
and in building a more complete mosaic of what was occurring in
universities, software companies and in organizations to build and use
DSS.
This section traces decision support applications and research
studies related to model and data-oriented systems, management
expert systems, multidimensional data analysis, query and reporting
tools, online analytical processing (OLAP), Business Intelligence, group
DSS, conferencing and groupware, document management, spatial DSS
and Executive Information Systems as the technologies emerge,
converge and diverge. All of these technologies have been used to
support decision making. A timeline of major historical milestones
relevant to DSS is included in Appendix I.
The study of decision support systems is an applied discipline that
uses knowledge and especially theory from other disciplines. For this
reason, many DSS research questions have been examined because
they were of concern to people who were building and using specific
DSS. Hence much of the broad DSS knowledge base provides
generalizations and directions for building more effective DSS (cf.,
Baskerville & Myers, 2002; Keen, 1980).
The next section describes the origins of the field of decision support
systems. Section 3 discusses the decision support systems theory
development that occurred in the late 1970s and early 1980s. Section
4 discusses important developments to communications-driven , data-
driven, document driven, knowledge-driven and model-driven DSS
(cf., Power, 2002). The final section briefly discusses how DSS
practice, research and technology is continuing to evolve.
Origin of Decision Support Systems
In the 1960s, researchers began systematically studying the use of
computerized quantitative models to assist in decision making and
planning (Raymond, 1966; Turban, 1967; Urban, 1967, Holt and Huber,
1969). Ferguson and Jones (1969) reported the first experimental
study using a computer aided decision system. They investigated a
production scheduling application running on an IBM 7094. In
retrospect, a major historical turning point was Michael S. Scott
Morton's (1967) dissertation field research at Harvard University.
Scott Morton’s study involved building, implementing and then testing
an interactive, model-driven management decision system. Fellow
Harvard Ph.D. student Andrew McCosh asserts that the “concept of
decision support systems was first articulated by Scott Morton in
February 1964 in a basement office in Sherman Hall, Harvard Business
School" (McCosh email, 2002) in a discussion they had about Scott
Morton’s dissertation. During 1966, Scott Morton (1971) studied how
computers and analytical models could help managers make a
recurring key business planning decision. He conducted an experiment
in which managers actually used a Management Decision System
(MDS). Marketing and production managers used an MDS to coordinate
production planning for laundry equipment. The MDS ran on an IDI 21
inch CRT with a light pen connected using a 2400 bps modem to a pair
of Univac 494 systems.
The pioneering work of George Dantzig, Douglas Engelbart and Jay
Forrester likely influenced the feasibility of building computerized
decision support systems. In 1952, Dantzig became a research
mathematician at the Rand Corporation, where he began
implementing linear programming on its experimental computers. In
the mid-1960s, Engelbart and colleagues developed the first
hypermedia—groupware system called NLS (oNLine System). NLS
facilitated the creation of digital libraries and the storage and
retrieval of electronic documents using hypertext. NLS also provided
for on-screen video teleconferencing and was a forerunner to group
decision support systems. Forrester was involved in building the SAGE
(Semi-Automatic Ground Environment) air defense system for North
America completed in 1962. SAGE is probably the first computerized
data-driven DSS. Also, Professor Forrester started the System
Dynamics Group at the Massachusetts Institute of Technology Sloan
School. His work on corporate modeling led to programming DYNAMO,
a general simulation compiler.
In 1960, J.C.R. Licklider published his ideas about the future role of
multiaccess interactive computing in a paper titled “Man-Computer
Symbiosis.” He saw man-computer interaction as enhancing both the
quality and efficiency of human problem solving and his paper
provided a guide for decades of computer research to follow. Licklider
was the architect of Project MAC at MIT that furthered the study of
interactive computing.
By April 1964, the development of the IBM System 360 and other more
powerful mainframe systems made it practical and cost-effective to
develop Management Information Systems (MIS) for large companies
(cf., Davis, 1974). These early MIS focused on providing managers with
structured, periodic reports and the information was primarily from
accounting and transaction processing systems, but the systems did
not provide interactive support to assist managers in decision making.
Around 1970 business journals started to publish articles on
management decision systems, strategic planning systems and
decision support systems (cf., Sprague and Watson, 1979). For
example, Scott Morton and colleagues McCosh and Stephens published
decision support related articles in 1968. The first use of the term
decision support system was in Gorry and Scott Morton’s (1971) Sloan
Management Review article. They argued that Management
Information Systems primarily focused on structured decisions and
suggested that the supporting information systems for semi-structured
and unstructured decisions should be termed “Decision Support
Systems”.
T.P. Gerrity, Jr. focused on Decision Support Systems design issues in
his 1971 Sloan Management Review article titled "The Design of Man-
Machine Decision Systems: An Application to Portfolio Management".
The article was based on his MIT Ph.D. dissertation. His system was
designed to support investment managers in their daily administration
of clients' stock portfolios.
John D.C. Little, also at Massachusetts Institute of Technology, was
studying DSS for marketing. Little and Lodish (1969) reported research
on MEDIAC, a media planning support system. Also, Little (1970)
identified criteria for designing models and systems to support
management decision-making. His four criteria included: robustness,
ease of control, simplicity, and completeness of relevant detail. All
four criteria remain relevant in evaluating modern Decision Support
Systems. By 1975, Little was expanding the frontiers of computer-
supported modeling. His DSS called Brandaid was designed to support
product, promotion, pricing and advertising decisions. Little also
helped develop the financial and marketing modeling language known
as EXPRESS.
In 1974, Gordon Davis, a Professor at the University of Minnesota,
published his influential text on Management Information Systems. He
defined a Management Information System as "an integrated,
man/machine system for providing information to support the
operations, management, and decision-making functions in an
organization. (p. 5)." Davis's Chapter 12 was titled "Information System
Support for Decision Making" and Chapter 13 was titled "Information
System Support for Planning and Control". Davis’s framework
incorporated computerized decision support systems into the
emerging field of management information systems.
Peter Keen and Charles Stabell claim the concept of decision support
systems evolved from "the theoretical studies of organizational
decision making done at the Carnegie Institute of Technology during
the late 1950s and early '60s and the technical work on interactive
computer systems, mainly carried out at the Massachusetts Institute
of Technology in the 1960s. (Keen and Scott Morton, 1978)". Herbert
Simon’s books (1947, 1960) and articles provide a context for
understanding and supporting decision making.
In 1995, Hans Klein and Leif Methlie noted “A study of the origin of
DSS has still to be written. It seems that the first DSS papers were
published by PhD students or professors in business schools, who had
access to the first time-sharing computer system: Project MAC at the
Sloan School, the Dartmouth Time Sharing Systems at the Tuck School.
In France, HEC was the first French business school to have a time-
sharing system (installed in 1967), and the first DSS papers were
published by professors of the School in 1970. (p. 112).”
Theory Development
In the mid- to late 1970s, both practice and theory issues related to
DSS were discussed at academic conferences including the American
Institute for Decision Sciences meetings and the ACM SIGBDP
Conference on Decision Support Systems in San Jose, CA in January
1977 (the proceedings were included in the journal Database). The first
International Conference on Decision Support Systems was held in
Atlanta, Georgia in 1981. Academic conferences provided forums for
idea sharing, theory discussions and information exchange.
At about this same time, Keen and Scott Morton’s DSS textbook (1978)
provided the first broad behavioral orientation to decision support
system analysis, design, implementation, evaluation and
development. This influential text provided a framework for teaching
DSS in business schools. McCosh and Scott-Morton’s (1978) DSS book
was more influential in Europe.
In 1980, Steven Alter published his MIT doctoral dissertation results in
an influential book. Alter's research and papers (1975; 1977) expanded
the framework for thinking about business and management DSS. Also,
his case studies provided a firm descriptive foundation of decision
support system examples. A number of other MIT dissertations
completed in the late 1970s also dealt with issues related to using
models for decision support.
Alter concluded from his research (1980) that decision support
systems could be categorized in terms of the generic operations that
can be performed by such systems. These generic operations extend
along a single dimension, ranging from extremely data-oriented to
extremely model-oriented. Alter conducted a field study of 56 DSS
that he categorized into seven distinct types of DSS. His seven types
include:
a) File drawer systems that provide access to data items.
b) Data analysis systems that support the manipulation of data
by computerized tools tailored to a specific task and setting or
by more general tools and operators.
c) Analysis information systems that provide access to a series
of decision-oriented databases and small models.
d) Accounting and financial models that calculate the
consequences of possible actions.
e) Representational models that estimate the consequences of
actions on the basis of simulation models.
f) Optimization models that provide guidelines for action by
generating an optimal solution consistent with a series of
constraints.
g) Suggestion models that perform the logical processing
leading to a specific suggested decision for a fairly structured
or well-understood task.
Donovan and Madnick (1977) classified DSS as institutional or ad hoc.
Institutional DSS support decisions that are recurring. An ad hoc DSS
supports querying data for one-time requests. Hackathorn and Keen
(1981) identified DSS in three distinct yet interrelated categories:
Personal DSS, Group DSS and Organizational DSS.
In 1979, John Rockart published a groundbreaking Harvard Business
Review article that led to the development of executive information
systems (EIS) or executive support systems (ESS). Rockart
developed the concept of using information systems to display critical
success metrics for managers.
Robert Bonczek, Clyde Holsapple, and Andrew Whinston (1981)
explained a theoretical framework for understanding the issues
associated with designing knowledge-oriented Decision Support
Systems. They identified four essential "aspects" or general
components that were common to all DSS: 1. A language system (LS)
that specifies all messages a specific DSS can accept; 2. A
presentation system (PS) for all messages a DSS can emit; 3. A
knowledge system (KS) for all knowledge a DSS has; and 4. A problem-
processing system (PPS) that is the "software engine" that tries to
recognize and solve problems during the use of a specific DSS. Their
book explained how Artificial Intelligence and Expert Systems
technologies were relevant to developing DSS.
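To make the four components easier to picture, a minimal Python sketch follows. It is illustrative only: the class names, the request vocabulary and the tiny data set are invented for this example and are not taken from Bonczek, Holsapple and Whinston.

# Hypothetical sketch of the four DSS components described above.
# All names, requests and data are invented for illustration.

class LanguageSystem:
    """Specifies all messages (requests) this particular DSS can accept."""
    valid_requests = {"compare_regions", "forecast_sales"}

    def accepts(self, request):
        return request in self.valid_requests


class KnowledgeSystem:
    """Holds all the knowledge the DSS has (here, a tiny sales table)."""
    sales = {"North": [120, 130, 125], "South": [90, 95, 110]}


class PresentationSystem:
    """Specifies all messages the DSS can emit back to the user."""
    @staticmethod
    def render(title, result):
        return f"{title}: {result}"


class ProblemProcessingSystem:
    """The 'software engine' that recognizes and solves problems."""
    def __init__(self, ls, ks, ps):
        self.ls, self.ks, self.ps = ls, ks, ps

    def handle(self, request):
        if not self.ls.accepts(request):
            return self.ps.render("Error", "request not understood")
        if request == "forecast_sales":
            # naive forecast: average of past periods for each region
            forecast = {r: sum(v) / len(v) for r, v in self.ks.sales.items()}
            return self.ps.render("Forecast", forecast)
        totals = {r: sum(v) for r, v in self.ks.sales.items()}
        return self.ps.render("Comparison", totals)


dss = ProblemProcessingSystem(LanguageSystem(), KnowledgeSystem(), PresentationSystem())
print(dss.handle("compare_regions"))   # Comparison: {'North': 375, 'South': 295}
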
Finally, Ralph Sprague and Eric Carlson’s (1982) book Building Effective
Decision Support Systems was an important milestone. Much of the
book further explained the Sprague (1980) DSS framework of data
base, model base and dialog generation and management software.
Also, it provided a practical and understandable overview of how
organizations could and should build DSS. Sprague and Carlson (1982)
defined DSS as "a class of information system that draws on
transaction processing systems and interacts with the other parts of
the overall information system to support the decision-making
activities of managers and other knowledge workers in organizations."
Architecture of DSS
A decision support system primarily consists of the following components; a minimal code sketch of how they fit together appears after their descriptions:
The Database
The database contains the internal and external data that contribute
to the decision-making process. In most cases this data is more
extensive than what traditional relational models hold.
The Model Base
This module contains a set of models and algorithms that analyze the
information in the database to support decisions. The results are then
summarized and displayed as tables or graphs.
The Interface
This is what the user employs to interact with the system. It is
complemented by an interactive help and navigation screen.
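As an illustration of how these three parts fit together, the following minimal Python sketch wires a small database, a model base and a dialog interface into one toy DSS. Every name, model and number in it is hypothetical.

# Toy DSS wiring together the three components described above.
# All data, model names and commands are invented for illustration.

database = {                       # internal and external data
    "monthly_demand": [420, 450, 480, 465],
    "unit_cost": 12.5,
    "competitor_price": 19.0,      # external data item
}

def moving_average(series, window=3):
    """Model base entry 1: forecast the next period as a moving average."""
    return sum(series[-window:]) / window

def target_price(cost, margin=0.25):
    """Model base entry 2: suggest a price covering cost plus a target margin."""
    return cost * (1 + margin)

model_base = {"forecast": moving_average, "pricing": target_price}

def dialog(command):
    """Interface: turn a user command into a summarized, displayable answer."""
    if command == "forecast":
        value = model_base["forecast"](database["monthly_demand"])
        return f"Next-period demand is roughly {value:.0f} units"
    if command == "pricing":
        price = model_base["pricing"](database["unit_cost"])
        return f"Suggested price {price:.2f} versus competitor price {database['competitor_price']:.2f}"
    return "Unknown command"

print(dialog("forecast"))
print(dialog("pricing"))
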
Framework
DSS are not entirely different from other systems and require a
structured approach. A framework was provided by Sprague and
Watson (1993). The framework has three main levels: (1) technology
levels, (2) people involved, and (3) the developmental approach.
1. Technology Levels
Sprague has suggested that there are three levels of hardware and
software that have been proposed for DSS; a brief illustrative sketch
of how the levels relate follows their descriptions.
a) Level 1 – Specific DSS
This is the actual application that the user will work with. It is the
part of the system that allows the decision maker to make decisions in
a particular problem area and to act upon that problem.
b) Level 2 – DSS Generator
This level contains the hardware/software environment that allows
people to easily develop specific DSS applications. It makes use of
CASE tools or systems such as Crystal.
c) Level 3 – DSS Tools
This level contains the lower-level hardware and software, such as
special languages, function libraries and linking modules, from which
DSS generators and specific DSS are built.
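The hypothetical Python sketch below illustrates how the three levels relate: a few low-level tools (plain functions) are assembled by a simple generator into a specific DSS for one problem area. All names are invented for illustration.

# Hypothetical sketch of Sprague's three technology levels.

# Level 3, DSS tools: low-level building blocks (a small function library).
def total(values):
    return sum(values)

def trend(values):
    """Difference between the last and first observation."""
    return values[-1] - values[0]

# Level 2, DSS generator: an environment that assembles tools into applications.
def make_dss(metrics):
    """Builds a specific DSS from a mapping of metric name to tool function."""
    def specific_dss(data):
        return {name: tool(data) for name, tool in metrics.items()}
    return specific_dss

# Level 1, specific DSS: the application a decision maker actually uses.
budget_dss = make_dss({"total_spend": total, "spend_trend": trend})
print(budget_dss([100, 120, 150, 170]))   # {'total_spend': 540, 'spend_trend': 70}
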
2. People Involved
Sprague suggests there are five roles involved in a typical DSS
development cycle.
a) The end user.
b) An intermediary.
c) DSS developer
d) Technical supporter
e) Systems Expert
3. The Developmental Approach
The developmental approach for a DSS should be strongly iterative,
which allows the application to be changed and redesigned at various
intervals. The system is first designed around the initial problem and
is then tested and revised to ensure the desired outcome is achieved.
Applications of DSS
DSS is extensively used in business and management. Executive
dashboards and other business performance software allow faster
decision making, identification of negative trends, and better
allocation of business resources.
A growing area for DSS applications, concepts, principles, and
techniques is agricultural production and marketing for sustainable
development. For example, the DSSAT4 package, developed with
financial support from USAID during the 1980s and 1990s, has allowed
rapid assessment of several agricultural production systems around the
world to facilitate decision-making at the farm and policy levels.
There are, however, many constraints to the successful adoption of
DSS in agriculture.
A specific example concerns the Canadian National Railway system,
which tests its equipment on a regular basis using a decision support
system. A problem faced by any railroad is worn-out or defective rails,
which can result in hundreds of derailments per year. Under a DSS, CN
managed to decrease the incidence of derailments at the same time
that other companies were experiencing an increase.
Beyond the applications already mentioned, a DSS can be used in any
field where organized decision making is necessary. For example, a
DSS can be designed to help make decisions in the stock market, or to
decide which area or market segment to target with a product.
Characteristics and Capabilities of DSS
Because there is no exact definition of DSS, there is obviously no
agreement on the standard characteristics and capabilities of DSS.
Turban and Aronson, however, describe an ideal set of characteristics
and capabilities of DSS. The key DSS characteristics and capabilities
are as follows:
a. Support for decision makers in semi-structured and
unstructured problems.
b. Support managers at all levels.
c. Support individuals and groups.
d. Support for interdependent or sequential decisions.
e. Support intelligence, design, choice, and
implementation.
f. Support variety of decision processes and styles.
g. DSS should be adaptable and flexible.
h. DSS should be interactive and provide ease of use.
i. Effectiveness balanced with efficiency (benefit must
exceed cost).
j. Complete control by decision-makers.
k. Ease of development and modification by end users
to suit their needs and a changing environment.
l. Support modeling and analysis.
m. Data access.
n. Standalone, integrated and Web-based deployment.
Benefits of DSS
a. Improving Personal Efficiency
b. Expediting Problem Solving
c. Facilitating Interpersonal Communication
d. Promoting Learning or Training
e. Increasing Organizational Control
Executive Information System
An information system that consolidates and summarizes ongoing
transactions within the organization. It provides top management with
all the information it requires at all times from internal and external
sources.
An Executive Information System (EIS) is a type of management
information system intended to facilitate and support the information
and decision making needs of senior executives by providing easy
access to both internal and external information relevant to meeting
the strategic goals of the organization. It is commonly considered as a
specialized form of a Decision Support System (DSS).
The emphasis of EIS is on graphical displays and easy-to-use user
interfaces. They offer strong reporting and drill-down capabilities. In
general, EIS are enterprise-wide DSS that help top-level executives
analyze, compare, and highlight trends in important variables so that
they can monitor performance and identify opportunities and
problems. EIS and data warehousing technologies are converging in
the marketplace.
In recent years, the term EIS has lost popularity in favour of Business
Intelligence (with the sub areas of reporting, analytics, and digital
dashboards).
Introduction
Many senior managers find that direct on-line access to organizational
data is helpful. For example, Paul Frech, president of Lockheed-
Georgia, monitors employee contributions to company-sponsored
programs (United Way, blood drives) as a surrogate measure of
employee morale (Houdeshel and Watson 1987). C. Robert Kidder, CEO
of Duracell, found that productivity problems were due to salespeople
in Germany wasting time calling on small stores and took corrective
action (Main 1989).
Information systems have long been used to gather and store
information, to produce specific reports for workers, and to produce
aggregate reports for managers. However, senior managers rarely use
these systems directly, and often find the aggregate information to be
of little use without the ability to explore underlying details (Watson
& Rainer 1991, Crockett 1992).
An Executive Information System (EIS) is a tool that provides direct
on-line access to relevant information in a useful and navigable
format. Relevant information is timely, accurate, and actionable
information about aspects of a business that are of particular interest
to the senior manager. The useful and navigable format of the system
means that it is specifically designed to be used by individuals with
limited time, limited keyboarding skills, and little direct experience
with computers. An EIS is easy to navigate so that managers can
identify broad strategic issues, and then explore the information to
find the root causes of those issues.
Executive Information Systems differ from traditional information
systems in the following ways:
· are specifically tailored to executives' information
needs
· are able to access data about specific issues and
problems as well as aggregate reports
· provide extensive on-line analysis tools including
trend analysis, exception reporting & "drill-down"
capability
· access a broad range of internal and external data
· are particularly easy to use (typically mouse or
touchscreen driven)
· are used directly by executives without assistance
· present information in a graphical form
Purpose of EIS
The primary purpose of an Executive Information System is to support
managerial learning about an organization, its work processes, and its
interaction with the external environment. Informed managers can
ask better questions and make better decisions. Vandenbosch and Huff
(1992) from the University of Western Ontario found that Canadian
firms using an EIS achieved better business results if their EIS
promoted managerial learning. Firms with an EIS designed to maintain
managers' "mental models" were less effective than firms with an EIS
designed to build or enhance managers' knowledge.
This distinction is supported by Peter Senge in The Fifth
Discipline. He illustrates the benefits of learning about the
behaviour of systems versus simply learning more about their states.
Learning more about the state of a system leads to reactive
management fixes. Typically these reactions feed into the underlying
system behaviour and contribute to a downward spiral. Learning more
about system behaviour and how various system inputs and actions
interrelate will allow managers to make more proactive changes to
create long-term improvement.
A secondary purpose for an EIS is to allow timely access to
information. All of the information contained in an EIS can typically be
obtained by a manager through traditional methods. However, the
resources and time required to manually compile information in a
wide variety of formats, and in response to ever changing and ever
more specific questions usually inhibit managers from obtaining this
information. Often, by the time a useful report can be compiled, the
strategic issues facing the manager have changed, and the report is
never fully utilized.
Timely access also influences learning. When a manager obtains the
answer to a question, that answer typically sparks other related
questions in the manager's mind. If those questions can be posed
immediately, and the next answer retrieved, the learning cycle
continues unbroken. Using traditional methods, by the time the
answer is produced, the context of the question may be lost, and the
learning cycle will not continue. An executive in Rockart & Treacy's
1982 study noted that:
Your staff really can't help you think. The problem with giving
a question to the staff is that they provide you with the
answer. You learn the nature of the real question you should
have asked when you muck around in the data (p. 9).
A third purpose of an EIS is commonly misperceived. An EIS has a
powerful ability to direct management attention to specific areas of
the organization or specific business problems. Some managers see
this as an opportunity to discipline subordinates. Some subordinates
fear the directive nature of the system and spend a great deal of time
trying to outwit or discredit it. Neither of these behaviours is
appropriate or productive. Rather, managers and subordinates can
work together to determine the root causes of issues highlighted by
the EIS.
The powerful focus of an EIS is due to the maxim "what gets measured
gets done." Managers are particularly attentive to concrete
information about their performance when it is available to their
superiors. This focus is very valuable to an organization if the
information reported is actually important and represents a balanced
view of the organization's objectives.
Misaligned reporting systems can result in inordinate management
attention to things that are not important or to things which are
important but to the exclusion of other equally important things. For
example, a production reporting system might lead managers to
emphasize volume of work done rather than quality of work. Worse
yet, productivity might have little to do with the organization's
overriding customer service objectives.
Contents of EIS
A general answer to the question of what data is appropriate for
inclusion in an Executive Information System is "whatever is
interesting to executives." While this advice is rather simplistic, it
does reflect the variety of systems currently in use. Executive
Information Systems in government have been constructed to track
data about Ministerial correspondence, case management, worker
productivity, finances, and human resources to name only a few.
Other sectors use EIS implementations to monitor information about
competitors in the news media and databases of public information in
addition to the traditional revenue, cost, volume, sales, market share
and quality applications.
Frequently, EIS implementations begin with just a few measures that
are clearly of interest to senior managers, and then expand in
response to questions asked by those managers as they use the
system. Over time, the presentation of this information becomes
stale, and the information diverges from what is strategically
important for the organization. A "Critical Success Factors" approach is
recommended by many management theorists (Daniel, 1961,
Crockett, 1992, Watson and Frolick, 1992). Practitioners such
as Vandenbosch (1993) found that:
While our efforts usually met with initial success, we often
found that after six months to a year, executives were almost
as bored with the new information as they had been with the
old. A strategy we developed to rectify this problem required
organizations to create a report of the month. That is, in
addition to the regular information provided for management
committee meetings, the CEO was charged with selecting a
different indicator to focus on each month (Vandenbosch,
1993, pp. 8-9).
While the above indicates that selecting data for inclusion in an EIS
is difficult, several guidelines help with that selection. A practical
set of principles to guide the design of measures and indicators to be
included in an EIS is presented below (Kelly, 1992b), followed by a
small illustrative sketch of how such an indicator might be recorded.
For a more detailed discussion of methods for selecting measures that
reflect organizational objectives, see the section "EIS and
Organizational Objectives."
1. EIS measures must be easy to understand and collect.
Wherever possible, data should be collected naturally as
part of the process of work. An EIS should not add
substantially to the workload of managers or staff.
2. EIS measures must be based on a balanced view of
the organization's objectives. Data in the system should
reflect the objectives of the organization in the areas of
productivity, resource management, quality and
customer service.
3. Performance indicators in an EIS must reflect
everyone's contribution in a fair and consistent manner.
Indicators should be as independent as possible from
variables outside the control of managers.
4. EIS measures must encourage management and staff
to share ownership of the organization's objectives.
Performance indicators must promote both team-work
and friendly competition. Measures must be meaningful
for all staff; people must feel that they, as individuals,
can contribute to improving the performance of the
organization.
5. EIS information must be available to everyone in the
organization. The objective is to provide everyone with
useful information about the organization's performance.
Information that must remain confidential should not be
part of the EIS or the management system of the
organization.
6. EIS measures must evolve to meet the changing needs
of the organization.
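The sketch below shows one possible way, assuming a Python environment, of recording an EIS indicator so that several of the principles above (understandability, a balanced objective area, ownership, wide sharing and a review date) are made explicit. The field names and the sample measure are invented for illustration.

# Hypothetical record structure for an EIS indicator, reflecting the
# principles listed above. Field names and sample values are illustrative.
from dataclasses import dataclass
from datetime import date

@dataclass
class EISIndicator:
    name: str              # easy to understand (principle 1)
    objective_area: str    # productivity, resources, quality or service (principle 2)
    owner: str             # who contributes the result (principle 3)
    target: float
    shared_with: str       # visibility across the organization (principle 5)
    review_by: date        # when the measure itself must be re-examined (principle 6)

indicator = EISIndicator(
    name="Average days to settle a claim",
    objective_area="customer service",
    owner="Claims processing team",
    target=5.0,
    shared_with="all staff",
    review_by=date(2019, 6, 30),
)
print(indicator)
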
Barriers to Effectiveness
There are many ways in which an EIS can fail. Dozens of high profile,
high cost EIS projects have been cancelled, implemented and rarely
used, or implemented and used with negative results. An EIS is a high
risk project precisely because it is intended for use by the most
powerful people in an organization. Senior managers can easily misuse
the information in the system with strongly detrimental effects on the
organization. Senior managers can refuse to use a system if it does not
respond to their immediate personal needs or is too difficult to learn
and use.
Unproductive Organizational Behaviour Norms
Issues of organizational behaviour and culture are perhaps the most
deadly barriers to effective Executive Information Systems. Because
an EIS is typically positioned at the top of an organization, it can
create powerful learning experiences and lead to drastic changes in
organizational direction. However, there is also great potential for
misuse of the information. Grant, Higgins and Irving (1988) found that
performance monitoring can promote bureaucratic and unproductive
behaviour, can unduly focus organizational attention to the point
where other important aspects are ignored, and can have a strongly
negative impact on morale.
The key barrier to EIS effectiveness, therefore, is the way in which
the organization uses the information in the system. Managers must be
aware of the dangers of statistical data, and be skilled at interpreting
and using data in an effective way. Even more important is the
manager's ability to communicate with others about statistical data in
a non-defensive, trustworthy, and constructive manner. Argyris
(1991) suggests a universal human tendency towards strategies that
avoid embarrassment or threat and avoid feelings of vulnerability
or incompetence. These strategies include:
· stating criticism of others in a way that you feel is
valid but also in a way that prevents others from
deciding for themselves
· failing to include any data that others could use to
objectively evaluate your criticism
· stating your conclusions in ways that disguise their
logical implications and denying those implications if
they are suggested
To make effective use of an EIS, managers must have the self-
confidence to accept negative results and focus on the resolution of
problems rather than on denial and blame. Since organizations with
limited exposure to planning and targeting, data-based decision-
making, statistical process control, and team-based work models may
not have dealt with these behavioral issues in the past, they are more
likely to react defensively and reject an EIS.
Technical Excellence
An interesting result from the Vandenbosch & Huff (1988) study
was that the technical excellence of an EIS has an inverse relationship
with effectiveness. Systems that are technical masterpieces tend to
be inflexible, and thus discourage innovation, experimentation and
mental model development.
Flexibility is important because an EIS has such a powerful ability to
direct attention to specific issues in an organization. A technical
masterpiece may accurately direct management attention when the
system is first implemented, but on its first anniversary it may still be
directing attention to issues that were important a year earlier. There is
substantial danger that the exploration of issues necessary for
managerial learning will be limited to those subjects that were
important when the EIS was first developed. Managers must
understand that as the organization and its work changes, an EIS must
continually be updated to address the strategic issues of the day.
A number of explanations as to why technical masterpieces tend to be
less flexible are possible. Developers who create a masterpiece EIS
may become attached to the system and consciously or unconsciously
dissuade managers from asking for changes. Managers who are
uncertain that the benefits outweigh the initial cost of a masterpiece
EIS may not want to spend more on system maintenance and
improvements. The time required to create a masterpiece EIS may
mean that it is outdated before it is implemented.
While usability and response time are important factors in
determining whether executives will use a system, cost and flexibility
are paramount. A senior manager will be more accepting of an
inexpensive system that provides 20% of the needed information
within a month or two than of an expensive system that provides
80% of the needed information after a year of development. The
manager may also find that the inexpensive system is easier to change
and adapt to the evolving needs of the business. Changing a large
system would involve throwing away parts of a substantial investment.
Changing the inexpensive system means losing a few weeks of work.
As a result, fast, cheap, incremental approaches to developing an EIS
increase the chance of success.
Technical Problems
Paradoxically, technical problems are also frequently reported as a
significant barrier to EIS success. The most difficult technical problem
-- that of integrating data from a wide range of data sources both
inside and outside the organization -- is also one of the most critical
issues for EIS users. A marketing vice-president, who had spent several
hundred thousand dollars on an EIS, attended a final briefing on the
system. The technical experts demonstrated the many graphs and
charts of sales results, market share and profitability. However, when
the vice-president asked for a graph of market share and advertising
expense over the past ten years, the system was unable to access
historical data. The project was cancelled in that meeting.
The ability to integrate data from many different systems is important
because it allows managerial learning that is unavailable in other
ways. The president of a manufacturing company can easily get
information about sales and manufacturing from the relevant VPs.
Unfortunately, the information the president receives will likely be
incompatible, and learning about the ways in which sales and
manufacturing processes influence each other will not be easy. An EIS
will be particularly effective if it can overcome this challenge,
allowing executives to learn about business processes that cross
organizational boundaries and to compare business results in disparate
functions.
Another technical problem that can kill EIS projects is usability. Senior
managers can simply stop using a system if they find it
too difficult to learn or use. They have very little time to invest in
learning the system, a low tolerance for errors, and initially may have
very little incentive to use it. Even if the information in the system is
useful, a difficult interface will quickly result in the manager assigning
an analyst to manipulate the system and print out the required
reports. This is counter-productive because managerial learning is
enhanced by the immediacy of the question - answer learning cycle
provided by an EIS. If an analyst is interacting with the system, the
analyst will acquire more learning than the manager, but will not be in
a position to put that learning to its most effective use.
Usability of Executive Information Systems can be enhanced through
the use of prototyping and usability evaluation methods. These
methods ensure that clear communication occurs between the
developers of the system and its users. Managers have an opportunity
to interact with systems that closely resemble the functionality of the
final system and thus can offer more constructive criticism than they
might be able to after reading an abstract specification document.
Systems developers also are in a position to listen more openly to
criticisms of a system since a prototype is expected to be disposable.
Several evaluation protocols are available, including observation and
monitoring, software logging, and experiments and benchmarking
(Preece et al., 1994). The most appropriate methods for EIS design are
those with an ethnographic flavour because the experience base of
system developers is typically so different from that of their user
population (senior executives).
Misalignment Between Objectives & EIS
A final barrier to EIS effectiveness was mentioned earlier in the
section on purpose. As noted there, the powerful ability of an EIS to
direct organizational attention can be destructive if the system
directs attention to the wrong variables. There are many examples of
this sort of destructive reporting. Grant, Higgins and Irving (1988)
report the account of an employee working under a misaligned
reporting system.
I like the challenge of solving customer problems, but they get
in the way of hitting my quota. I'd like to get rid of the
telephone work. If (the company) thought dealing with
customers was important, I'd keep it; but if it's just going to
be production that matters, I'd gladly give all the calls to
somebody else.
Traditional cost accounting systems are also often misaligned with
organizational objectives, and placing these measures in an EIS will
continue to draw attention to the wrong things. Cost accounting
allocates overhead costs to direct labour hours. In some cases the
overhead burden on each direct labour hour is as much as 1000%. A
manager operating under this system might decide to sub-contract 100
hours of direct labor at $20 per hour. On the books, this $2,000 saving
is accompanied by $20,000 of savings in overhead. If the sub-
contractor charges $5,000 for the work, the book savings are $2,000 +
$20,000 - $5,000 = $17,000. In reality, however, the overhead costs for
an idle machine in a factory do not go down much at all. The sub-
contract actually ends up costing $5,000 - $2,000 = $3,000. (Peters,
1987)
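The arithmetic of this example can be verified with a short calculation. The figures below are those used in the text; the point is the gap between the overhead "saving" recorded on the books and what actually changes.

# Book savings versus real cost for the sub-contracting example above.
hours_subcontracted = 100
labour_rate = 20.0            # dollars per direct labour hour
overhead_rate = 10.0          # 1000% overhead burden on direct labour
subcontract_charge = 5000.0

labour_saving = hours_subcontracted * labour_rate            # 2,000
allocated_overhead_saving = labour_saving * overhead_rate    # 20,000 (on the books only)

book_saving = labour_saving + allocated_overhead_saving - subcontract_charge
real_saving = labour_saving - subcontract_charge             # overhead does not actually fall

print(f"Book saving: ${book_saving:,.0f}")          # $17,000
print(f"Actual extra cost: ${-real_saving:,.0f}")   # $3,000, since overhead stays
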
Characteristics of Successful EIS Implementations
Find an Appropriate Executive Champion
EIS projects that succeed do so because at least one member of the
senior management team agrees to champion the project. The
executive champion need not fully understand the technical issues,
but must be a person who works closely with all of the senior
management team and understands their needs, work styles and their
current methods of obtaining organizational information. The
champion's commitment must include a willingness to set aside time
for reviewing prototypes and implementation plans, influencing and
coaching other members of the senior management team, and
suggesting modifications and enhancements to the system.
Deliver a Simple Prototype Quickly
Executives judge a new EIS on the basis of how easy it is to use and
how relevant the information in the system is to the current strategic
issues in the organization. As a result, the best EIS projects begin as a
simple prototype, delivered quickly, that provides data about at least
one critical issue. If the information delivered is worth the hassle of
learning the system, a flurry of requirements will shortly be generated
by executives who like what they see, but want more. These requests
are the best way to plan an EIS that truly supports the organization,
and are more valuable than months of planning by a consultant or
analyst.
One caveat concerning the simple prototype approach is that
executive requests will quickly scatter to questions of curiosity rather
than strategy in an organization where strategic direction and
objectives are not clearly defined. A number of methods are available
to support executives in defining business objectives and linking them
to performance monitors in an EIS. These are discussed further in the
section on EIS and Organizational Objectives below.
Involve Your Information Systems Department
In some organizations, the motivation for an EIS project arises in the
business units quite apart from the traditional information systems (IS)
organization. Consultants may be called in, or managers and analysts
in the business units may take the project on without consulting or
involving IS. This is a serious mistake. Executive Information Systems
rely entirely on the information contained in the systems created and
maintained by this department. IS professionals know best what
information is available in an organization's systems and how to get it.
They must be involved in the team. Involvement in such a project can
also be beneficial to IS by giving them a more strategic perspective on
how their work influences the organization.
Communicate & Train to Overcome Resistance
A final characteristic of successful EIS implementations is that of
communication. Executive Information Systems have the potential to
drastically alter the prevailing patterns of organizational
communication and thus will typically be met with resistance. Some
of this resistance is simply a matter of a lack of knowledge. Training
on how to use statistics and performance measures can help.
However, resistance can also be rooted in the feelings of fear,
insecurity and cynicism experienced by individuals throughout the
organization. These attitudes can only be influenced by a strong and
vocal executive champion who consistently reinforces the purpose of
the system and directs the attention of the executive group away
from unproductive and punitive behaviours.
EIS and Organizational Culture
Henry Mintzberg (1972) has argued that impersonal statistical data is
irrelevant to managers. John Dearden (1966) argued that the promise
of real-time management information systems was a myth and would
never be of use to top managers. Grant, Higgins, and Irving (1988)
argue that computerized performance monitors undermine trust,
reduce autonomy and fail to illuminate the most important issues.
Many of these arguments against EISs have objective merit. Managers
really do value the tangible tidbits of detail they encounter in their
daily interactions more highly than abstract numerical reports.
Rumours suggest a future, while numbers describe a past.
Conversations are rich in detail and continuously probe the reasons for
the situation, while statistics are vague approximations of reality.
When these vague approximations are used to intimidate or control
behavior rather than to guide learning, they really do have a negative
impact on the organization.
Yet both of these objections point to a deeper set of problems -- the
assumptions, beliefs, values and behaviors that people in the
organization hold and use to respond to their environment. Perhaps
senior managers find statistical data to be irrelevant because they
have found too many errors in previous reports?  Perhaps people in the
organization prefer to assign blame rather than discover the true root
cause of problems.   The culture of an organization can have a
dramatic influence on the adoption and use of an Executive
Information System.   The following cultural characteristics will
contribute directly to the success or failure of an EIS project.
Learning Vs Blaming
A learning organization is one that seeks first to understand why a
problem occurred, and not who is to blame. It is a common and
natural response for managers to try to deflect responsibility for a
problem on to someone else. An EIS can help to do this by indicating
very specifically who failed to meet a statistical target, and by how
much. A senior manager, armed with EIS data, can intimidate and
blame the appropriate person. The blamed person can respond by
questioning the integrity of the system, blaming someone else, or
even reacting in frustration by slowing work down further.
In a learning organization, any unusual result is seen as an opportunity
to learn more about the business and its processes. Managers who find
an unusual statistic explore it further, breaking it down to understand
its components and comparing it with other numbers to establish
cause and effect relationships. Together as a team, management uses
numerical results to focus learning and improve business processes
across the organization. An EIS facilitates this approach by allowing
instant exploration of a number, its components and its relationship to
other numbers.
Continuous Improvement Vs Crisis Management
Some organizations find themselves constantly reacting to crises, with
little time for any proactive measures. Others have managed to
respond to each individual crisis with an approach that prevents other
similar problems in the future. They are engaged in a continual cycle
of improving business practices and finding ways to avoid crisis.
Crises in government are frequently caused by questions about
organizational performance raised by an auditor, the Minister, or
members of the Opposition. An EIS can be helpful in responding to this
sort of crisis by providing instant data about the actual facts of the
situation. However, this use of the EIS does little to prevent future
crises.
An organizational culture in which continual improvement is the norm
can use the EIS as an early warning system pointing to issues that have
not yet reached the crisis point, but are perhaps the most important
areas on which to focus management attention and learning.
Organizations with a culture of continuous improvement already have
an appetite for the sort of data an EIS can provide, and thus will
exhibit less resistance.
Team Work Vs Hierarchy
An EIS has the potential to substantially disrupt an organization that
relies upon adherence to a strict chain of command. The EIS provides
senior managers with the ability to micro-manage details at the
lowest levels in the organization. A senior manager with an EIS report
who is surprised at the individual results of a front-line worker might
call that person directly to understand why the result is unusual. This
could be very threatening for the managers between the senior
manager and the front-line worker. An EIS can also provide lower level
managers with access to information about peer performance and
even the performance of their superiors.
Organizations that are familiar with work teams, matrix managed
projects and other forms of interaction outside the chain of command
will find an EIS less disruptive. Senior managers in these organizations
have learned when micro-management is appropriate and when it is
not. Middle managers have learned that most interactions between
their superiors and their staff are not threatening to their position.
Workers are more comfortable interacting with senior managers when
the need arises, and know what their supervisor expects from them in
such an interaction.
Data-based Decisions Vs Decisions in a Vacuum
The total quality movement, popular in many organizations today,
emphasizes a set of tools referred to as Statistical Process Control
(SPC). These analytical tools provide managers and workers with
methods of understanding a problem and finding solutions rather than
allocating blame and passing the buck. Organizations with training and
exposure to SPC and analytical tools will be more open to an EIS than
those who are suspicious of numerical measures and the motives of
those who use them.
It should be noted that data-based decision making does not deny the
role of intuition, experience, or negotiation amongst a group. Rather,
it encourages decision-makers to probe the facts of a situation further
before coming to a decision. Even if the final decision contradicts the
data, chances are that an exploration of the data will help the
decision-maker to understand the situation better before a decision is
reached. An EIS can help with this decision-making process.
Information Sharing Vs Information Hoarding
Information is power in many organizations, and managers are
motivated to hoard information rather than to share it widely. For
example, managers may hide information about their own
organizational performance, but jump at any chance to see
information about performance of their peers.
A properly designed EIS promotes information sharing throughout the
organization. Peers have access to information about each other's
domain; junior managers have information about how their
performance contributes to overall organizational performance. An
organization that is comfortable with information sharing will have
developed a set of "good manners" for dealing with this broad access
to information. These behavioral norms are keys to the success of an
EIS.
Specific Objectives Vs Vague Directions
An organization that has experience developing and working toward
Specific, Measurable, Achievable and Consistent (SMAC)
objectives will also find an EIS to be less threatening. Many
organizations are uncomfortable with specific performance measures
and targets because they believe their work to be too specialized or
unpredictable. Managers in these organizations tend to adopt vague
generalizations and statements of the exceedingly obvious in place of
SMAC objectives that actually focus and direct organizational
performance. In a few cases, it may actually be true that numerical
measures are completely inappropriate for a few aspects of the
business. In most cases, managers with this attitude have a poor
understanding of the purpose of objective and target-setting
exercises. Some business processes are more difficult to measure and
set targets for than others. Yet almost all business processes have at
least a few characteristics that can be measured and improved
through conscientious objective setting. (See the following section on
EIS and Organizational Objectives.)
EIS and Organizational Objectives
A number of writers have discovered that one of the major difficulties
with EIS implementations is that the information contained in the EIS
either does not meet executive requirements, or meets executive
requirements, but fails to guide the organization towards its
objectives. As discussed earlier, organizations that are comfortable in
establishing and working towards Specific, Measurable, Achievable,
and Consistent (SMAC) objectives will find it easier to create an EIS
that actually drives organizational performance. Yet even these
organizations may have difficulty because their stated objectives do
not represent all of the things that are important.
Crockett (1992) suggests a four step process for developing EIS
information requirements based on a broader understanding of
organizational objectives. The steps are: (1) identify critical success
factors and stakeholder expectations, (2) document performance
measures that monitor the critical success factors and stakeholder
expectations, (3) determine reporting formats and frequency, and (4)
outline information flows and how information can be used. Crockett
begins with stakeholders to ensure that all relevant objectives and
critical success factors are reflected in the EIS.
Kaplan and Norton (1992) suggest that goals and measures need to be
developed from each of four perspectives: financial, customer,
internal business, and innovation and learning. These perspectives
help managers to achieve a balance in setting objectives, and
presenting them in a unified report exposes the tough tradeoffs in any
management system. An EIS built on this basis will not promote
productivity while ignoring quality, or customer satisfaction while
ignoring cost.
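A balanced set of EIS measures grouped under the four perspectives might be organized as in the sketch below. The individual measures listed are hypothetical examples chosen for illustration, not Kaplan and Norton's.

# Hypothetical grouping of EIS measures by balanced-scorecard perspective.
balanced_scorecard = {
    "financial":               ["operating cost per case", "budget variance"],
    "customer":                ["client satisfaction score", "average response time"],
    "internal business":       ["cases processed per employee", "rework rate"],
    "innovation and learning": ["training days per employee", "new services launched"],
}

# A quick completeness check: is every perspective covered by at least one measure?
for perspective, measures in balanced_scorecard.items():
    assert measures, f"No measures defined for the {perspective} perspective"
    print(f"{perspective}: {', '.join(measures)}")
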
Meyer (1994) raises several questions that should be asked about
measurement systems for teams. Four are appropriate for evaluating
objectives and measures represented in an EIS. They are:
· Are all critical organizational outcomes tracked?
· Are all "out-of-bounds" conditions tracked?
(Conditions that are serious enough to trigger a
management review.)
· Are all the critical variables required to reach each
outcome tracked?
· Is there any measure that would not cause the
organization to change its behavior?
In summary, proper definition of organizational objectives and
measures is a helpful precondition for reducing organizational
resistance to an EIS and is the root of effective EIS use. The benefits
of an EIS will be fully realized only when it helps to focus management
attention on issues of true importance to the organization.
Methodology
Implementation of an effective EIS requires clear consensus on the
objectives and measures to be monitored in the system and a plan for
obtaining the data on which those measures are based. The sections
below outline a methodology for achieving these two results. As noted
earlier, successful EIS implementations generally begin with a simple
prototype rather than a detailed planning process. For that reason,
the proposed planning methodologies are as simple and scope-limited
as possible.
EIS Project Team
The process of establishing organizational objectives and measures is
intimately linked with the task of locating relevant data in existing
computer systems to support those measures. Objectives must be
specific and measurable, and data availability is critical to measuring
progress against objectives.
Since there is little use in defining measures for which data is not
available, it is recommended that an EIS project team including
technical staff be established at the outset. This cross-functional
team can provide early warning if data is not available to support
objectives or if senior managers' expectations for the system are
impractical.
A preliminary EIS project team might consist of as few as three
people. An EIS Project Leader organizes and directs the project. An
Executive Sponsor promotes the project in the organization,
contributes senior management requirements on behalf of the senior
management team, and reviews project progress regularly. A
Technical Leader participates in requirements gathering, reviewing
plans, and ensuring technical feasibility of all proposals during EIS
definition.
As the focus of the project becomes more technical, the EIS project
team may be complemented by additional technical staff who will be
directly involved in extracting data from legacy systems and
constructing the EIS data repository and user interface.
Establishing Measures & EIS Requirements
Most organizations have a number of high-level objectives and
direction statements that help to shape organizational behavior and
priorities. In many cases, however, these direction statements have
not yet been linked to performance measures and targets. As well,
senior managers may have other critical information requirements
that would not be reflected in a simple analysis of existing direction
statements. Therefore it is essential that EIS requirements be derived
directly from interaction with the senior managers who will use the
systems. It is also essential that practical measures of progress
towards organizational objectives be established during these
interactions.
Measures and EIS requirements are best established through a three-
stage process. First, the EIS team solicits the input of the most senior
executives in the organization in order to establish a broad, top-down
perspective on EIS requirements. Second, interviews are conducted
with the managers who will be most directly involved in the
collection, analysis, and monitoring of data in the system to assess
bottom-up requirements. Third, a summary of results and
recommendations is presented to senior executives and operational
managers in a workshop where final decisions are made.
Interview Format
The focus of the interviews would be to establish all of the measures
managers require in the EIS. Questions would include the following:
1. What are the five most important pieces of
information you need to do your job?
2. What expectations does the Board of Directors have
for you?
3. What results do you think the general public expects
you to accomplish?
4. On what basis would consumers and customers judge
your effectiveness?
5. What expectations do other stakeholders impose on
you?
6. What is it that you have to accomplish in your current
position?
Senior Management Workshop
Since considerable variability is expected in the results of these
interviews, analysis and synthesis are required to identify recurring
themes and important differences of opinion. This information is
brought forward to a senior management workshop.
The purpose of the senior management workshop is twofold. First, the
workshop will be an opportunity to educate senior management on
appropriate use of Executive Information Systems, to address some of
the cultural issues raised earlier and to deal directly with resistance
to the system. Second, managers at the workshop will be asked to
reach agreement on an initial set of measures to be included in the
EIS. The education component of the workshop is most effective if
integrated with the work of creating measures.
The initial set of measures will be established within a framework
derived from the interview process. Three to five categories of
measures will be established prior to the workshop, and managers will
be asked to select or create three to five technically feasible
measures for each category. Each of the proposed measures will be
subjected to the questions proposed by Meyer (see EIS and
Organizational Objectives above) to determine if they are
appropriate.
Technical staff will attend to respond to feasibility questions as they
arise, and to improve their understanding of the EIS requirements.
Obtaining Critical Data
Linking EIS Measures to Data Sources
Data to support the information requirements of senior managers will
likely be dispersed across the organization's information systems and
external sources. Some data may not be currently available at all, and
collection mechanisms will have to be constructed.
The EIS project team, augmented by technical experts, and working
from the requirements established in the senior management
workshop will develop a list of required data elements and link them
with appropriate data sources. The team will then establish
requirements for data extraction from each of these systems and spin
off appropriate systems development projects.

40 of 57 22-08-2018, 16:02
Study Material-1: UNIT 2 ERP AND RELATED TECHNOLOGIES https://sol.du.ac.in/mod/book/view.php?id=803&chapterid=449
EIS Design, Prototyping & Evaluation
After information sources have been established, and projects are
underway to permit ongoing extraction of that information, attention
will turn to the design of the EIS itself. There are several components
to consider.
Hardware
First, an inventory of computers used by executives must be taken to
determine what upgrades are necessary and what hardware
limitations will be imposed on the EIS design. Included in this
inventory will be an assessment of network storage and
communication facilities.
Data Repository
The second component is the design of the data repository in which
summary data from all sources will be stored. The design of this
repository is critical because it must allow managers to easily extract
and explore data along numerous dimensions. Standard relational
designs may not be sufficient or practical for this application.
EIS Interface Prototype
A third component is the design of the actual EIS interface that senior
managers will interact with. Screens and commands must be
exceedingly obvious and easy to use so that senior managers can
quickly access the benefits of the system without wasting a lot of time
learning how to use it. Ease of use can be ensured by developing a
prototype system with "sample" data, and watching senior managers
as they interact with the prototype. Two to three iterations of
prototype redesign and testing with four senior managers would be
sufficient to ensure that the system is easy to use.
Advantages of EIS
a. Easy for upper-level executives to use; extensive
computer experience is not required to operate it
b. Provides timely delivery of company summary
information
c. Information that is provided is better understood
d. Filters data for management
e. Improves tracking of information
f. Offers efficiency to decision makers
Disadvantages of EIS
a. Functions are limited, cannot perform complex
calculations
b. Hard to quantify benefits and to justify
implementation of an EIS
c. Executives may encounter information overload
d. System may become slow, large, and hard to manage
e. Difficult to keep data current
f. May lead to less reliable and less secure data
g. Small companies may encounter excessive costs for
implementation
h. Small businesses may not be able to fulfil the
requirement for highly skilled personnel
Future Scope of EIS


The future of executive information systems will not be bound by
mainframe computer systems. This trend frees executives from having to
learn different computer operating systems and substantially decreases
implementation costs for companies. Because the trend builds on
existing software applications, executives will also be spared the need
to learn a new or special language for the EIS package. Future
executive information systems will not only support senior executives
but will also address the information needs of middle managers. They
will become more diverse as potential new applications and technologies
are integrated into the systems, such as incorporating artificial
intelligence (AI) and integrating multimedia characteristics and ISDN
technology into an EIS.
DATA WAREHOUSING
The data warehousing market consists of tools, technologies, and
methodologies that allow for the construction, usage, management,
and maintenance of the hardware and software used for a data
warehouse, as well as the actual data itself. Surveys indicate Data
Warehousing will be the single largest IT initiative after completion of
Y2K efforts. Data warehousing is currently a $28 Billion market
(Source: Data Warehousing Institute) and we estimate 20% growth per
annum through at least 2002. Two of the pioneers in the field were
Ralph Kimball and Bill Inmon. Biographies of these two individuals
have been provided, since many of the terms discussed in this paper
were coined and concepts defined by them.
Data warehousing is combining data from multiple and usually varied
sources into one comprehensive and easily manipulated database.
Common accessing systems of data warehousing include queries,
analysis and reporting. Because data warehousing creates one
database in the end, the number of sources can be anything you want
it to be, provided that the system can handle the volume, of course.
The final result, however, is homogeneous data, which can be more
easily manipulated.
Data warehousing is commonly used by companies to analyze trends
over time. In other words, companies may very well use data
warehousing to view day-to-day operations, but its primary function is
facilitating strategic planning resulting from long-term data
overviews. From such overviews, business models, forecasts, and
other reports and projections can be made. Routinely, because the
data stored in data warehouses is intended to provide more overview-
like reporting, the data is read-only for end users; updates arrive
through the scheduled warehouse load process rather than through ad
hoc edits.
This is not to say that data warehousing involves data that is never
updated. On the contrary, the data stored in data warehouses is
updated all the time. It's the reporting and the analysis that take
more of a long-term view.
Data warehousing is not the be-all and end-all for storing all of a
company's data. Rather, data warehousing is used to house the
necessary data for specific analysis. More comprehensive data storage
requires different capacities that are more static and less easily
manipulated than those used for data warehousing.
Data warehousing is typically used by larger companies analyzing
larger sets of data for enterprise purposes. Smaller companies wishing
to analyze just one subject, for example, usually access data marts,
which are much more specific and targeted in their storage and
reporting. Data warehousing often includes smaller amounts of data
grouped into data marts. In this way, a larger company might have at
its disposal both data warehousing and data marts, allowing users to
choose the source and functionality depending on current needs.
In order to clear up some of the confusion that is rampant in the
market, here are some definitions:
Definition of Data Warehouse:
The term Data Warehouse was coined by Bill Inmon in 1990, which he
defined in the following way: "A warehouse is a subject-oriented,
integrated, time-variant and non-volatile collection of data in support
of management's decision making process". He defined the terms in
the sentence as follows:
Subject Oriented
Data that gives information about a particular subject instead of
about a company's ongoing operations.
Integrated
Data that is gathered into the data warehouse from a variety of
sources and merged into a coherent whole.
Time-variant
All data in the data warehouse is identified with a particular time
period.
Non-volatile
Data is stable in a data warehouse. More data is added but data is
never removed. This enables management to gain a consistent picture
of the business.
(Source: "What is a Data Warehouse?" W.H. Inmon, Prism,
Volume 1, Number 1, 1995).
Data warehousing is essentially what you need to do in order to create
a data warehouse, and what you do with it. It is the process of
creating, populating, and then querying a data warehouse and can
involve a number of discrete technologies such as:
Source System Identification
In order to build the data warehouse,
the appropriate data must be located. Typically, this will involve both
the current OLTP (On-Line Transaction Processing) system where the
"day-to-day" information about the business resides, and historical
data for prior periods, which may be contained in some form of
"legacy" system. Often these legacy systems are not relational
databases, so much effort is required to extract the appropriate data.
Data Warehouse Design and Creation
This describes the process of designing the warehouse, with care
taken to ensure that the design supports the types of queries the
warehouse will be used for. This is an involved effort that requires
both an understanding of the database schema to be created, and a
great deal of interaction with the user community. The design is often
an iterative process and it must be modified a number of times before
the model can be stabilized. Great care must be taken at this stage,
because once the model is populated with large amounts of data,
some of which may be very difficult to recreate, the model can not
easily be changed.
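To make the design step concrete, here is a minimal star-schema sketch using Python's built-in sqlite3 module. The table and column names (fact_sales, dim_date, dim_product, dim_customer) are hypothetical illustrations, not taken from the text above.

```python
import sqlite3

# A minimal star schema: one fact table surrounded by dimension tables.
conn = sqlite3.connect("warehouse.db")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE IF NOT EXISTS dim_date (
    date_key  INTEGER PRIMARY KEY,   -- e.g. 20180131 for 31 Jan 2018
    full_date TEXT,
    month     INTEGER,
    year      INTEGER
);
CREATE TABLE IF NOT EXISTS dim_product (
    product_key INTEGER PRIMARY KEY,
    name        TEXT,
    category    TEXT
);
CREATE TABLE IF NOT EXISTS dim_customer (
    customer_key INTEGER PRIMARY KEY,
    name         TEXT,
    region       TEXT
);
-- The fact table holds the measures at the chosen grain (one row per sale).
CREATE TABLE IF NOT EXISTS fact_sales (
    date_key     INTEGER REFERENCES dim_date(date_key),
    product_key  INTEGER REFERENCES dim_product(product_key),
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    quantity     INTEGER,
    amount       REAL
);
""")
conn.commit()
conn.close()
```

The grain chosen for fact_sales (one row per transaction versus a daily summary) is exactly the kind of decision that becomes hard to reverse once large volumes of data have been loaded.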
Data Acquisition
This is the process of moving company data from the source systems
into the warehouse. It is often the most time-consuming and costly
effort in the data warehousing project, and is performed with
software products known as ETL (Extract/Transform/Load) tools.
There are currently over 50 ETL tools on the market. The data
acquisition phase can cost millions of dollars and take months or even
years to complete. Data acquisition is then an ongoing, scheduled
process, which is executed to keep the warehouse current to a pre-
determined period in time, (i.e. the warehouse is refreshed monthly).
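A heavily simplified extract-transform-load pass over the hypothetical schema sketched above might look like the following; the CSV layout and file name are assumptions for illustration, not a description of any particular ETL product.

```python
import csv
import sqlite3

def run_etl(source_csv: str, warehouse_path: str) -> None:
    """Extract rows from a CSV export of a source system, apply a
    trivial transformation, and load them into the warehouse."""
    conn = sqlite3.connect(warehouse_path)
    cur = conn.cursor()

    rows = []
    with open(source_csv, newline="") as f:
        for rec in csv.DictReader(f):
            # Transform: normalize types and derive the surrogate date key.
            # Cleansing rules (e.g. canonical customer names, see the Data
            # Cleansing step below) would also run here as part of the "T".
            rows.append((
                int(rec["date"].replace("-", "")),   # 2018-01-31 -> 20180131
                int(rec["product_key"]),
                int(rec["customer_key"]),
                int(rec["quantity"]),
                float(rec["amount"]),
            ))

    # Load in one batch; a real ETL tool would also handle rejects,
    # restartability and logging.
    cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?, ?)", rows)
    conn.commit()
    conn.close()

run_etl("daily_sales_extract.csv", "warehouse.db")
```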
Changed Data Capture
The periodic update of the warehouse from the transactional
system(s) is complicated by the difficulty of identifying which records
in the source have changed since the last update. This effort is
referred to as "changed data capture". Changed data capture is a field
of endeavor in itself, and many products are on the market to address
it. Some of the technologies that are used in this area are Replication
servers, Publish/Subscribe, Triggers and Stored Procedures, and
Database Log Analysis.
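One simple changed-data-capture approach, shown here only as a sketch, compares a last-modified timestamp in the source against the time of the previous load; the orders table and its columns are hypothetical.

```python
import sqlite3

def capture_changes(source_db: str, last_run: str):
    """Return only the source rows modified since the previous load.
    Assumes the (hypothetical) orders table keeps a 'last_modified'
    ISO-format timestamp column, so string comparison is safe."""
    conn = sqlite3.connect(source_db)
    cur = conn.cursor()
    cur.execute(
        "SELECT order_id, amount, last_modified "
        "FROM orders WHERE last_modified > ?",
        (last_run,),
    )
    changed = cur.fetchall()
    conn.close()
    return changed

# e.g. everything touched since the previous monthly refresh
delta = capture_changes("oltp.db", "2018-07-31 23:59:59")
```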
Data Cleansing
This is typically performed in conjunction with data acquisition (it can
be part of the "T" in "ETL"). A data warehouse that contains incorrect
data is not only useless, but also very dangerous. The whole idea
behind a data warehouse is to enable decision-making. If a high level
decision is made based on incorrect data in the warehouse, the
company could suffer severe consequences, or even complete failure.
Data cleansing is a complicated process that validates and, if
necessary, corrects the data before it is inserted into the warehouse.
For example, the company could have three "Customer Name" entries
in its various source systems, one entered as "IBM", one as "I.B.M.",
and one as "International Business Machines". Obviously, these are all
the same customer. Someone in the organization must make a decision
as to which is correct, and then the data cleansing tool will change
the others to match the rule. This process is also referred to as "data
scrubbing" or "data quality assurance". It can be an extremely complex
process, especially if some of the warehouse inputs are from older
mainframe file systems (commonly referred to as "flat files" or
"sequential files").
Data Aggregation
This process is often performed during the "T" phase of ETL, if it is
performed at all. Data warehouses can be designed to store data at
the detail level (each individual transaction), at some aggregate level
(summary data), or a combination of both. The advantage of
summarized data is that typical queries against the warehouse run
faster. The disadvantage is that information, which may be needed to
answer a query, is lost during aggregation. The tradeoff must be
carefully weighed, because the decision can not be undone without
rebuilding and repopulating the warehouse. The safest decision is to
build the warehouse with a high level of detail, but the cost in storage
can be extreme.
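Continuing the hypothetical fact_sales/dim_date schema from the earlier sketches, the aggregation below rolls transaction-level rows up to a monthly summary table; once only the summary is kept, transaction-level questions can no longer be answered from it.

```python
import sqlite3

conn = sqlite3.connect("warehouse.db")
cur = conn.cursor()

# Build a summary table at the (year, month, product) grain
# from the transaction-level fact rows.
cur.execute("""
    CREATE TABLE IF NOT EXISTS fact_sales_monthly AS
    SELECT d.year,
           d.month,
           f.product_key,
           SUM(f.quantity) AS total_quantity,
           SUM(f.amount)   AS total_amount
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY d.year, d.month, f.product_key
""")
conn.commit()
conn.close()
```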
Now that the warehouse has been built and populated, it becomes
possible to extract meaningful information from it that will provide a
competitive advantage and a return on investment. This is done with
tools that fall within the general rubric of "Business Intelligence".
Business Intelligence (BI)
A very broad field indeed, it contains technologies such as Decision
Support Systems (DSS), Executive Information Systems (EIS), On-Line
Analytical Processing (OLAP), Relational OLAP (ROLAP), Multi-
Dimensional OLAP (MOLAP), Hybrid OLAP (HOLAP, a combination of
MOLAP and ROLAP), and more. BI can be broken down into four broad
fields:
Multi-dimensional Analysis Tools
Tools that allow the user to look at the data from a number of
different "angles" are called Multi-dimensional Analysis tools. These
tools often use a multi-dimensional database referred to as a "cube".
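A rough feel for a "cube" can be had with pandas (assumed to be available): cross-tabulating one measure over two dimensions. Real MOLAP engines precompute and store such aggregates in their own formats; the data here is invented.

```python
import pandas as pd

sales = pd.DataFrame({
    "region":  ["East", "East", "West", "West", "West"],
    "product": ["Beer", "Diapers", "Beer", "Diapers", "Beer"],
    "amount":  [120.0, 80.0, 200.0, 90.0, 60.0],
})

# Two dimensions (region x product), one measure (amount): a tiny "cube".
cube = sales.pivot_table(index="region", columns="product",
                         values="amount", aggfunc="sum", fill_value=0.0)
print(cube)
```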
Query tools
Tools that allow the user to issue SQL (Structured Query Language)
queries against the warehouse and get a result set back.
Data Mining Tools
Tools that automatically search for patterns in data are called data
mining tools. These tools are usually driven by complex statistical
formulas. The easiest way to distinguish data mining from the various
forms of OLAP is that OLAP can only answer questions you know to
ask, data mining answers questions you didn't necessarily know to ask.
Data Visualization Tools
Tools that show graphical representations of data, including complex
three-dimensional data pictures is called Data Visualization tools. The
theory is that the user can "see" trends more effectively in this
manner than when looking at complex statistical graphs. Some
vendors are making progress in this area using the Virtual Reality
Modeling Language (VRML).
Metadata Management
Throughout the entire process of identifying, acquiring, and querying
the data, metadata management takes place. Metadata is defined as
"data about data". An example is a column in a table. The data type
(for instance a string or integer) of the column is one piece of
metadata. The name of the column is another. The actual value in the
column for a particular row is not metadata - it is data. Metadata is
stored in a Metadata Repository and provides extremely useful
information to all of the tools mentioned previously. Metadata
management has developed into an exacting science that can provide
huge returns to an organization. It can assist companies in analyzing
the impact of changes to database tables, tracking owners of
individual data elements ("data stewards"), and much more. It is also
required to build the warehouse, since the ETL tool needs to know the
metadata attributes of the sources and targets in order to "map" the
data properly. The BI tools need the metadata for similar reasons.
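Column-level metadata of the kind described above can be read programmatically. The sketch below uses SQLite's PRAGMA table_info against the hypothetical warehouse table from the earlier sketches as a stand-in for a metadata repository query.

```python
import sqlite3

def describe_table(db_path: str, table: str):
    """Return (column name, declared type) pairs -- metadata, not data."""
    conn = sqlite3.connect(db_path)
    cur = conn.cursor()
    cur.execute(f"PRAGMA table_info({table})")
    # Each PRAGMA row: (cid, name, type, notnull, default_value, pk)
    meta = [(row[1], row[2]) for row in cur.fetchall()]
    conn.close()
    return meta

print(describe_table("warehouse.db", "fact_sales"))
```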
Data Warehousing is a complex field, with many vendors vying for
market awareness. The complexity of the technology and the
interactions between the various tools, and the high price points for
the products require companies to perform careful technology
evaluation before embarking on a warehousing project. However, the
potential for enormous returns on investment and competitive
advantage make data warehousing difficult to ignore.
History of Data warehousing
Data Warehouses are a distinct type of computer database that were
first developed during the late 1980s and early 1990s. They were
developed to meet a growing demand for management information
and analysis that could not be met by operational systems.
Operational systems were unable to meet this need for a range of
reasons:
a) The processing load of reporting reduced the response time
of the operational systems
b) The database designs of operational systems were not
optimized for information analysis and reporting
c) Most organizations had more than one operational system,
so company-wide reporting could not be supported from a
single system
d) Development of reports in operational systems often
required writing specific computer programs which was slow
and expensive
As a result, separate computer databases began to be built that were
specifically designed to support management information and analysis
purposes. These data warehouses were able to bring in data from a
range of different data sources, such as mainframe computers,
minicomputers, as well as personal computers and office automation
software such as spreadsheet, and integrate this information in a
single place. This capability, coupled with user-friendly reporting tools
and freedom from operational impacts, has led to a growth of this
type of computer system.
As technology improved (lower cost for more performance) and user
requirements increased (faster data load cycle times and more
features), data warehouses have evolved through several fundamental
stages:
Off line Operational Databases
Data warehouses in this initial stage are developed by simply copying
the database of an operational system to an off-line server where the
processing load of reporting does not impact on the operational
system's performance.
Off line Data Warehouse
Data warehouses in this stage of evolution are updated on a regular
time cycle (usually daily, weekly or monthly) from the operational
systems and the data is stored in an integrated reporting-oriented
data structure.
 
 
Real Time Data Warehouse
Data warehouses at this stage are updated on a transaction or event
basis, every time an operational system performs a transaction (e.g.
an order or a delivery or a booking etc.)
Integrated Data Warehouse
Data warehouses at this stage are used to generate activity or
transactions that are passed back into the operational systems for use
in the daily activity of the organization.
The Data Warehouse Architecture
The data warehouse architecture consists of various interconnected
elements which are:
 1) Operational and external database layer: the source data for the
data warehouse.
2) Informational access layer: the tools, the end user access to extract
and analyze the data.
3) Data Access Layer: the interface between the operational and
informational access layer.
4) Metadata Layer: The data directory or repository of metadata
information.
The concept of "data warehousing" dates back at least to the
mid-1980s, and possibly earlier. In essence, it was intended to provide
an architectural model for the flow of data from operational systems
to decision support environments. It attempted to address the various
problems associated with this flow, and the high costs associated with
it. In the absence of such an architecture, there usually existed an
enormous amount of redundancy in the delivery of management
information. In larger corporations it was typical for multiple decision
support projects to operate independently, each serving different
users but often requiring much of the same data. The process of
gathering, cleaning and integrating data from various sources, often
legacy systems, was typically replicated for each project. Moreover,
legacy systems were frequently being revisited as new requirements
emerged, each requiring a subtly different view of the legacy data.
Based on analogies with real-life warehouses, data warehouses were
intended as large-scale collection/storage/staging areas for corporate
data. From here data could be distributed to "retail stores" or "data
marts" which were tailored for access by decision support users (or
"consumers"). While the data warehouse was designed to manage the
bulk supply of data from its suppliers (e.g. operational systems), and
to handle the organization and storage of this data, the "retail stores"
or "data marts" could be focused on packaging and presenting
selections of the data to end-users, to meet specific management
information needs.
Somewhere along the way this analogy and architectural vision was
lost, as some vendors and industry speakers redefined the data
warehouse as simply a management reporting database. This is a
subtle but important deviation from the original vision of the data
warehouse as the hub of a management information architecture,
where the decision support systems were actually the data marts or
"retail stores".
Advantages
There are many advantages to using a data warehouse, some of them
are:
i. Data warehouses enhance end-user access to a wide
variety of data.
ii. Decision support system users can obtain specified
trend reports, e.g. the item with the most sales in a particular
area within the last two years.
iii. Data warehouses can be a significant enabler of
commercial business applications, particularly customer
relationship management (CRM) systems.
Limitations
a) Extracting, transforming and loading data consumes a lot of
time and computational resources.
b) Data warehousing project scope must be actively managed
to deliver a release of defined content and value.
c) Compatibility problems with systems already in place.
d) Security could develop into a serious issue, especially if the
data warehouse is web accessible.
e) Data Storage design controversy warrants careful
consideration and perhaps prototyping of the data warehouse
solution for each project's environments.
DATA MINING
Overview
Generally, data mining (sometimes called data or knowledge
discovery) is the process of analyzing data from different perspectives
and summarizing it into useful information - information that can be
used to increase revenue, cut costs, or both. Data mining software is
one of a number of analytical tools for analyzing data. It allows users
to analyze data from many different dimensions or angles, categorize
it, and summarize the relationships identified. Technically, data
mining is the process of finding correlations or patterns among dozens
of fields in large relational databases.
Data mining parameters include:
a) Association - looking for patterns where one event is
connected to another event
b) Sequence or path analysis - looking for patterns where one
event leads to another later event
c) Classification - looking for new patterns (may result in a
change in the way the data is organized, but that's OK)
d) Clustering - finding and visually documenting groups of facts
not previously known
e) Forecasting - discovering patterns in data that can lead to
reasonable predictions about the future (this area of data
mining is known as predictive analytics)
Data mining techniques are used in many research areas, including
mathematics, cybernetics, and genetics. Web mining, a type of data
mining used in customer relationship management (CRM), takes
advantage of the huge amount of information gathered by a Web site
to look for patterns in user behavior. A data miner is a program that
collects such information, often without the user's knowledge, in the
manner of spyware.
Data mining is a class of database applications that look for hidden
patterns in a group of data that can be used to predict future
behavior. For example, data mining software can help retail
companies find customers with common interests. The term is
commonly misused to describe software that presents data in new
ways. True data mining software doesn't just change the presentation,
but actually discovers previously unknown relationships among the
data.
Continuous Innovation
Although data mining is a relatively new term, the technology is not.
Companies have used powerful computers to sift through volumes of
supermarket scanner data and analyze market research reports for
years. However, continuous innovations in computer processing power,
disk storage, and statistical software are dramatically increasing the
accuracy of analysis while driving down the cost.
Example
For example, one Midwest grocery chain used the data mining
capacity of Oracle software to analyze local buying patterns. They
discovered that when men bought diapers on Thursdays and Saturdays,
they also tended to buy beer. Further analysis showed that these
shoppers typically did their weekly grocery shopping on Saturdays. On
Thursdays, however, they only bought a few items. The retailer
concluded that they purchased the beer to have it available for the
upcoming weekend. The grocery chain could use this newly discovered
information in various ways to increase revenue. For example, they
could move the beer display closer to the diaper display. And, they
could make sure beer and diapers were sold at full price on Thursdays.
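The beer-and-diapers finding is a classic association pattern. The sketch below counts how often two items appear in the same basket and derives simple support and confidence figures; the basket data is invented.

```python
from collections import Counter
from itertools import combinations

baskets = [
    {"diapers", "beer", "chips"},
    {"diapers", "beer"},
    {"diapers", "milk"},
    {"beer", "chips"},
    {"diapers", "beer", "milk"},
]

item_counts = Counter()
pair_counts = Counter()
for basket in baskets:
    item_counts.update(basket)
    pair_counts.update(frozenset(p) for p in combinations(sorted(basket), 2))

pair = frozenset({"diapers", "beer"})
support = pair_counts[pair] / len(baskets)               # P(diapers AND beer)
confidence = pair_counts[pair] / item_counts["diapers"]  # P(beer | diapers)
print(f"support={support:.2f}, confidence={confidence:.2f}")
```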
Before taking up data mining in detail, let's understand some of these
basic terms again:
Data, Information, and Knowledge
Data
Data are any facts, numbers, or text that can be processed by a
computer. Today, organizations are accumulating vast and growing
amounts of data in different formats and different databases. This
includes:
• Operational or transactional data such as, sales, cost, inventory,
payroll, and accounting
• Non-operational data, such as industry sales, forecast data, and
macro economic data
• Meta data - data about the data itself, such as logical database
design or data dictionary definitions
Information
The patterns, associations, or relationships among all this data can
provide information. For example, analysis of retail point of sale
transaction data can yield information on which products are selling
and when.
Knowledge
Information can be converted into knowledge about historical patterns
and future trends. For example, summary information on retail
supermarket sales can be analyzed in light of promotional efforts to
provide knowledge of consumer buying behavior. Thus, a manufacturer
or retailer could determine which items are most susceptible to
promotional efforts.
Data Warehouses
Dramatic advances in data capture, processing power, data
transmission, and storage capabilities are enabling organizations to
integrate their various databases into data warehouses. Data
warehousing is defined as a process of centralized data management
and retrieval. Data warehousing, like data mining, is a relatively new
term although the concept itself has been around for years. Data
warehousing represents an ideal vision of maintaining a central
repository of all organizational data. Centralization of data is needed
to maximize user access and analysis. Dramatic technological
advances are making this vision a reality for many companies. And,
equally dramatic advances in data analysis software are allowing users
to access this data freely. The data analysis software is what supports
data mining.
Application of Data mining
Data mining is primarily used today by companies with a strong
consumer focus - retail, financial, communication, and marketing
organizations. It enables these companies to determine relationships
among "internal" factors such as price, product positioning, or staff
skills, and "external" factors such as economic indicators, competition,
and customer demographics. And, it enables them to determine the
impact on sales, customer satisfaction, and corporate profits. Finally,
it enables them to "drill down" into summary information to view
detail transactional data.
With data mining, a retailer could use point-of-sale records of
customer purchases to send targeted promotions based on an
individual's purchase history. By mining demographic data from
comment or warranty cards, the retailer could develop products and
promotions to appeal to specific customer segments.
For example, Blockbuster Entertainment mines its video rental history
database to recommend rentals to individual customers. American
Express can suggest products to its cardholders based on analysis of
their monthly expenditures.
WalMart is pioneering massive data mining to transform its supplier
relationships. WalMart captures point-of-sale transactions from over
2,900 stores in 6 countries and continuously transmits this data to its
massive 7.5 terabyte Teradata data warehouse. WalMart allows more
than 3,500 suppliers to access data on their products and perform
data analyses. These suppliers use this data to identify customer
buying patterns at the store display level. They use this information to
manage local store inventory and identify new merchandising
opportunities. In 1995, WalMart computers processed over 1 million
complex data queries.
The National Basketball Association (NBA) is exploring a data mining
application that can be used in conjunction with image recordings of
basketball games. The Advanced Scout software analyzes the
movements of players to help coaches orchestrate plays and
strategies. For example, an analysis of the play-by-play sheet of the
game played between the New York Knicks and the Cleveland
Cavaliers on January 6, 1995 reveals that when Mark Price played the
Guard position, John Williams attempted four jump shots and made
each one! Advanced Scout not only finds this pattern, but explains
that it is interesting because it differs considerably from the average
shooting percentage of 49.30% for the Cavaliers during that game.
By using the NBA universal clock, a coach can automatically bring up
the video clips showing each of the jump shots attempted by Williams
with Price on the floor, without needing to comb through hours of
video footage. Those clips show a very successful pick-and-roll play in
which Price draws the Knicks' defense and then finds Williams for an
open jump shot.
Process of data mining
While large-scale information technology has been evolving separate
transaction and analytical systems, data mining provides the link
between the two. Data mining software analyzes relationships and
patterns in stored transaction data based on open-ended user queries.
Several types of analytical software are available: statistical, machine
learning, and neural networks. Generally, any of four types of
relationships are sought:
Classes: Stored data is used to locate data in predetermined groups.
For example, a restaurant chain could mine customer purchase data
to determine when customers visit and what they typically order. This
information could be used to increase traffic by having daily specials.
Clusters: Data items are grouped according to logical relationships
or consumer preferences. For example, data can be mined to identify
market segments or consumer affinities (a short clustering sketch
follows these definitions).
Associations: Data can be mined to identify associations. The beer-
diaper example is an example of associative mining.
Sequential patterns: Data is mined to anticipate behavior
patterns and trends. For example, an outdoor equipment retailer
could predict the likelihood of a backpack being purchased based on a
consumer's purchase of sleeping bags and hiking shoes.
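As a concrete illustration of the "Clusters" relationship described above, here is a minimal sketch using k-means from scikit-learn (assuming scikit-learn and NumPy are installed); the customer figures are invented.

```python
import numpy as np
from sklearn.cluster import KMeans

# Each row describes one customer: [annual visits, average basket value].
customers = np.array([
    [52, 35.0], [48, 40.0], [50, 38.0],   # frequent shoppers, modest baskets
    [4, 210.0], [6, 185.0], [5, 220.0],   # rare visits, large baskets
    [24, 95.0], [26, 105.0],              # somewhere in between
])

# Ask for three market segments; labels_ assigns each customer to one.
model = KMeans(n_clusters=3, n_init=10, random_state=0).fit(customers)
for row, label in zip(customers, model.labels_):
    print(row, "-> segment", label)
```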
Data mining consists of five major elements:
1. Extract, transform, and load transaction data onto the data
warehouse system.
2. Store and manage the data in a multidimensional database
system.
3. Provide data access to business analysts and information
technology professionals.
4. Analyze the data by application software.
5. Present the data in a useful format, such as a graph or
table.
Different levels of analysis are available:
a) Artificial neural networks: Non-linear predictive
models that learn through training and resemble biological
neural networks in structure.
b) Genetic algorithms: Optimization techniques that use
processes such as genetic combination, mutation, and natural
selection in a design based on the concepts of natural
evolution.
c) Decision trees: Tree-shaped structures that represent
sets of decisions. These decisions generate rules for the
classification of a dataset. Specific decision tree methods
include Classification and Regression Trees (CART) and Chi
Square Automatic Interaction Detection (CHAID). CART and
CHAID are decision tree techniques used for classification of a
dataset. They provide a set of rules that you can apply to a new
(unclassified) dataset to predict which records will have a given
outcome. CART segments a dataset by creating 2-way splits
while CHAID segments using chi square tests to create multi-
way splits. CART typically requires less data preparation than
CHAID. A minimal decision-tree sketch appears after this list.
d) Nearest neighbor method: A technique that classifies
each record in a dataset based on a combination of the classes
of the k record(s) most similar to it in a historical dataset
(where k ≥ 1). Sometimes called the k-nearest neighbor
technique.
e) Rule induction: The extraction of useful if-then rules
from data based on statistical significance.
f) Data visualization: The visual interpretation of complex
relationships in multidimensional data. Graphics tools are used
to illustrate data relationships.
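As promised under the decision-tree entry, here is a minimal sketch that fits a small classification tree with scikit-learn (assumed to be installed) and prints the learned if-then rules; the training data and the churn framing are invented for illustration.

```python
from sklearn.tree import DecisionTreeClassifier, export_text

# Features: [age, monthly_spend]; target: 1 = churned, 0 = stayed.
X = [[23, 20.0], [25, 22.0], [40, 80.0], [45, 95.0], [30, 25.0], [52, 88.0]]
y = [1, 1, 0, 0, 1, 0]

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# export_text renders the fitted tree as readable if-then rules.
print(export_text(tree, feature_names=["age", "monthly_spend"]))
print(tree.predict([[28, 30.0]]))   # classify a new, unseen record
```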
Technological infrastructure required for Data Mining
Today, data mining applications are available on all size systems for
mainframe, client/server, and PC platforms. System prices range from
several thousand dollars for the smallest applications up to $1 million
a terabyte for the largest. Enterprise-wide applications generally
range in size from 10 gigabytes to over 11 terabytes. NCR has the
capacity to deliver applications exceeding 100 terabytes. There are
two critical technological drivers:
Size of the database: the more data being processed and
maintained, the more powerful the system required.
Query complexity: the more complex the queries and the greater
the number of queries being processed, the more powerful the system
required.
Relational database storage and management technology is adequate
for many data mining applications less than 50 gigabytes. However,
this infrastructure needs to be significantly enhanced to support
larger applications. Some vendors have added extensive indexing
capabilities to improve query performance. Others use new hardware
architectures such as Massively Parallel Processors (MPP) to achieve
order-of-magnitude improvements in query time. For example, MPP
systems from NCR link hundreds of high-speed Pentium processors to
achieve performance levels exceeding those of the largest
supercomputers.
Applications of Data Mining
Data mining has been cited as the method by which the U.S. Army unit
Able Danger supposedly had identified the September 11, 2001 attacks
leader, Mohamed Atta, and three other 9/11 hijackers as possible
members of an al Qaeda cell operating in the U.S. more than a year
before the attack.
Data Mining is most frequently used for Customer Relationship
Management applications. Common goals are to predict which people
are most likely to: a) Be Acquired b) Be Cross-Sold or Up-Sold c) Leave
/ Churn d) Be Retained, Saved, or Won back
These applications can contribute significantly to the bottom line.
Rather than contacting a prospect or customer through a call center
or sending mail, only prospects that are predicted to have a high
likelihood of responding to an offer are contacted.
More sophisticated methods may be used to optimize across
campaigns so that we can predict which channel and which offer an
individual is most likely to respond to - across all potential offers.
Finally, in cases where many people will take an action without an
offer, uplift modeling can be used to determine which people will
have the greatest increase in responding if given an offer.
Businesses employing data mining quickly see a return on investment,
but they also recognize that the number of predictive models can
quickly become very large. Rather than one model to predict which
customers will churn, we could build a separate model for each region
and customer type. Then, instead of sending an offer to all people that
are likely to churn, we may only want to send offers to customers that
are likely to take the offer. And finally, we may also want to determine
which customers are going to be profitable over a window of time and
only send the offers to those that are likely to be profitable. In order
to maintain this quantity of models, they need to 1) Manage model
versions 2) Move to "Automated Data Mining."
Another example of data mining, often called the Market Basket
Analysis, relates to its use in retail sales. If a clothing store records
the purchases of customers, a data mining system could identify those
customers who favour silk shirts over cotton ones. Although some
relationships may be difficult to explain, taking advantage of them is
easier. The example deals with association rules within transaction-
based data. Not all data are transaction based and logical or inexact
rules may also be present within a database. In a manufacturing
application, an inexact rule may state that 73% of products which
have a specific defect or problem will develop a secondary problem
within the next 6 months.
OLAP
OLAP stands for On-Line Analytical Processing. The first attempt to
provide a definition to OLAP was by Dr. Codd, who proposed 12 rules
for OLAP. Later, it was discovered that this particular white paper was
sponsored by one of the OLAP tool vendors, thus causing it to lose
objectivity. The OLAP Report has proposed the FASMI test, Fast
Analysis of Shared Multidimensional Information. For a more detailed
description of both Dr. Codd's rules and the FASMI test, please visit
The OLAP Report.
For people on the business side, the key feature in the FASMI definition
is "Multidimensional": the ability to analyze metrics across
different dimensions such as time, geography, gender, and product.
For example, sales for the company are up. What region is most
responsible for this increase? Which store in this region is most
responsible for the increase? What particular product category or
categories contributed the most to the increase? Answering these
types of questions in order means that you are performing an OLAP
analysis.
Depending on the underlying technology used, OLAP can be broadly
divided into two camps: Multidimensional OLAP (MOLAP) and Relational
OLAP (ROLAP). Hybrid OLAP (HOLAP) refers to technologies that combine
the two.
MOLAP
This is the more traditional way of OLAP analysis. In MOLAP, data is
stored in a multidimensional cube. The storage is not in the relational
database, but in proprietary formats.
Advantages:
a) Excellent performance: MOLAP cubes are built for fast data
retrieval, and are optimal for slicing and dicing operations.
b) Can perform complex calculations: All calculations have
been pre-generated when the cube is created. Hence, complex
calculations are not only doable, but they return quickly.
Disadvantages:
a) Limited in the amount of data it can handle: Because all
calculations are performed when the cube is built, it is not
possible to include a large amount of data in the cube itself.
This is not to say that the data in the cube cannot be derived
from a large amount of data. Indeed, this is possible. But in this
case, only summary-level information will be included in the
cube itself.
b) Requires additional investment: Cube technologies are often
proprietary and do not already exist in the organization.
Therefore, to adopt MOLAP technology, chances are additional
investments in human and capital resources are needed.
ROLAP
This methodology relies on manipulating the data stored in the
relational database to give the appearance of traditional OLAP's slicing
and dicing functionality. In essence, each action of slicing and dicing is
equivalent to adding a "WHERE" clause in the SQL statement.
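In ROLAP terms, each slice or dice simply adds another condition to the SQL sent to the relational database. A minimal sketch against the hypothetical fact_sales, dim_date and dim_customer tables used in the earlier sketches:

```python
import sqlite3

conn = sqlite3.connect("warehouse.db")
cur = conn.cursor()

# "Slice" to one year and "dice" to one region: both are WHERE conditions.
cur.execute("""
    SELECT c.region, SUM(f.amount) AS total_amount
    FROM fact_sales f
    JOIN dim_date d     ON d.date_key = f.date_key
    JOIN dim_customer c ON c.customer_key = f.customer_key
    WHERE d.year = ? AND c.region = ?
    GROUP BY c.region
""", (2018, "East"))
print(cur.fetchall())
conn.close()
```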
Advantages:
a) Can handle large amounts of data: The data size limitation
of ROLAP technology is the limitation on data size of the
underlying relational database. In other words, ROLAP itself
places no limitation on data amount.
b) Can leverage functionalities inherent in the relational
database: Often, relational databases already come with a host
of functionalities. ROLAP technologies, since they sit on top of
the relational database, can therefore leverage these
functionalities.
Disadvantages:
a) Performance can be slow: Because each ROLAP report is
essentially a SQL query (or multiple SQL queries) in the
relational database, the query time can be long if the
underlying data size is large.
b) Limited by SQL functionalities: Because ROLAP technology
mainly relies on generating SQL statements to query the
relational database, and SQL statements do not fit all needs
(for example, it is difficult to perform complex calculations
using SQL), ROLAP technologies are therefore traditionally
limited by what SQL can do. ROLAP vendors have mitigated this
risk by building into the tool out-of-the-box complex functions
as well as the ability to allow users to define their own
functions.
HOLAP
HOLAP technologies attempt to combine the advantages of MOLAP and
ROLAP. For summary-type information, HOLAP leverages cube
technology for faster performance. When detail information is
needed, HOLAP can "drill through" from the cube into the underlying
relational data.
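A rough sketch of the HOLAP idea with pandas (assumed available; data invented): summary questions are answered from a precomputed aggregate standing in for the cube, while detail questions "drill through" to the underlying rows.

```python
import pandas as pd

detail = pd.DataFrame({
    "region":  ["East", "East", "West", "West"],
    "product": ["Beer", "Diapers", "Beer", "Beer"],
    "amount":  [120.0, 80.0, 200.0, 60.0],
})

# "MOLAP side": a precomputed summary standing in for the cube.
summary = detail.groupby(["region", "product"], as_index=False)["amount"].sum()
print(summary)

# "ROLAP side": drill through from one summary cell to its detail rows.
mask = (detail["region"] == "West") & (detail["product"] == "Beer")
print(detail[mask])
```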
Questions / Answers
1. What is Business Process Reengineering? Explain the Role of
information technology & Impact of BPR on organizational
performance
2. List different Tools to support BPR & Benefits to Business
organization
3. Explain the Meaning of 'Management Information Systems (MIS) &
different Risks Associated With MIS
4. What is Decision Support System (DSS) and Explain few of its
applications
5. Explain the different Taxonomies of DSS
6. Explain the Architecture of DSS and Characteristics & Capabilities
of DSS
7. Explain the Meaning & scope of Executive Information System
8. What are the Characteristics of Successful EIS Implementations?
9. Compare Information Sharing vs Information Hoarding
10. Explain the EIS Design, Prototyping & Evaluation
11. What are the Advantages and disadvantages of EIS
12. Explain the meaning Data warehousing and its applications
13. Explain different Multi-dimensional Analysis Tools OLAP, MOLAP,
HOLAP with their advantages and disadvantages