Data processing is any computer process that converts data into information.
The processing is usually assumed to be automated and running on a mainframe,
minicomputer, microcomputer, or personal computer. Because data are most useful when
well-presented and actually informative, data-processing systems are often referred to as
information systems to emphasize their practicality. Nevertheless, both terms are roughly
synonymous, performing similar conversions; data-processing systems typically
manipulate raw data into information, and likewise information systems typically take
raw data as input to produce information as output. To better market their profession, computer programmers and systems analysts who might once have referred (during the 1970s, say) to the computer systems they produce as data-processing systems nowadays more often refer to them by some other term that includes the word information, such as information systems, information technology systems, or management information systems.
In the context of data processing, data are defined as numbers or characters that
represent measurements from the real world. A single datum is a single measurement
from the real world. Measured information is then algorithmically derived and/or
logically deduced and/or statistically calculated from multiple data. Information is
defined as either a meaningful answer to a query or a meaningful stimulus that can
cascade into further queries.
More generally, the term data processing can apply to any process that converts
data from one format to another, although data conversion would be the more logical and
correct term. From this perspective, data processing becomes the process of converting
information into data and also the converting of data back into information. The
distinction is that conversion doesn't require a question (query) to be answered. For
example, information in the form of a string of characters forming a sentence in English
is converted or encoded from a keyboard's key-presses as represented by
hardware-oriented integer codes into ASCII integer codes, after which it may be more easily processed by a computer (not as merely raw, amorphous integer data, but as meaningful characters in a natural language's set of graphemes) and finally converted or decoded to be displayed as characters, represented by a font on the computer display. In that example we can see the stage-by-stage conversion of the presence and then absence of electrical conductivity in the key-press and subsequent release at the keyboard from raw, substantially meaningless, hardware-oriented integer data to ever-more-meaningful information as the processing proceeds toward the human being.
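The decode path described above can be sketched in a few lines of Python. This is an illustrative sketch, not an actual keyboard driver; the scancode table is invented to stand in for the hardware-oriented integer codes the passage mentions.

```python
# Sketch of the decode path described above: a hardware-oriented
# integer code (a hypothetical keyboard scancode) is mapped to an
# ASCII code, which is then decoded into a displayable character.

# Hypothetical scancode table: the raw integers are meaningless by themselves.
SCANCODE_TO_ASCII = {30: 104, 35: 105}  # two made-up scancodes

def decode_keypress(scancode):
    """Convert a raw scancode into a meaningful grapheme."""
    ascii_code = SCANCODE_TO_ASCII[scancode]  # hardware code -> ASCII integer
    return chr(ascii_code)                    # ASCII integer -> character

# Each stage carries more meaning than the last.
message = decode_keypress(30) + decode_keypress(35)
print(message)  # -> hi
```

Each mapping step adds meaning: from electrical signal, to hardware code, to ASCII code, to a character a human can read.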
A more conventional example of the established practice of using the term data
processing is that a business has collected numerous data concerning an aspect of its
operations and that this multitude of data must be presented in meaningful, easy-to-access
presentations for the managers who must then use that information to increase revenue or
to decrease cost. That conversion and presentation of data as information is typically performed by a data-processing application.
Objectives
After going through this lesson, you will be in a position to:
define the concepts of data, information and data processing
explain various data processing activities
utilise the data processing cycle
explain data elements, records, files and databases.
Data
The word "data" is the plural of datum, which means a fact, observation, assumption or occurrence. More precisely, data are representations of facts pertaining to people, things, ideas and events. Data are represented by symbols such as letters of the alphabet, numerals or other special symbols.
Information
Information, then, can be defined as data that has been transformed into a meaningful and useful form for specific purposes. In some cases data may not require any processing before constituting information. Generally, however, data is not useful unless it is subjected to a process through which it is manipulated and organised and its contents analyzed and evaluated. Only then does data become information.
HISTORY
Although widespread use of the term data processing dates only from the
nineteen-fifties,[3] data processing functions have been performed manually for millennia.
For example, bookkeeping involves functions such as posting transactions and producing
reports like the balance sheet and the cash flow statement. Completely manual methods
were augmented by the application of mechanical or electronic calculators. A person
whose job was to perform calculations manually or using a calculator was called a
"computer."
The 1850 United States Census schedule was the first to gather data by
individual rather than household. A number of questions could be answered by making a
check in the appropriate box on the form. From 1850 through 1880 the Census Bureau
employed "a system of tallying, which, by reason of the increasing number of combinations of classifications required, became increasingly complex. Only a limited number of combinations could be recorded in one tally, so it was necessary to handle the schedules 5 or 6 times, for as many independent tallies." It took over 7 years to publish the results of the 1880 census using manual processing methods.
Other developments
The term data processing has mostly been subsumed by the newer and somewhat more general term information technology (IT). The term "data processing" is sometimes considered to have a negative connotation, suggesting the use of older technologies. For example, in 1996 the Data Processing Management Association (DPMA) changed its name to the Association of Information Technology Professionals. Nevertheless, the terms are approximately synonymous.
DATA PROCESSING ACTIVITIES
Regardless of the type of equipment used, the various functions and activities that need to be performed for data processing can be grouped under five basic categories.
COLLECTION
Data originates in the form of events, transactions or observations. This data is then recorded in some usable form. Data may initially be recorded on paper source documents (2.2) and then converted into a machine-usable form for processing. Alternatively, it may be recorded by a direct input device in a paperless, machine-readable form. Data collection is also termed data capture.
CONVERSION
Once the data is collected, it is converted from its source documents to a form that is more suitable for processing. The data is first codified by assigning identification codes. A code comprises numbers, letters, special characters, or a combination of these. For example, an employee may be allotted the code 52-53-162, the category A class, and so on. It is useful to codify data when the data requires classification. To classify means to categorize, i.e., data with similar characteristics are placed in similar categories or groups. For example, one may like to arrange accounts data according to account number or date, so that a balance sheet can easily be prepared.
After verification, the data is transcribed from one data medium to another. For example, when data processing is done using a computer, the data may be transferred from source documents to a machine-sensible form using magnetic tape or a disk.
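The codification and classification steps above can be sketched in Python. The employee codes, categories, and names below are invented for illustration, echoing the 52-53-162 example; real systems would draw them from source documents.

```python
# Codify: attach identification codes; classify: group similar records;
# sort by code so that summaries (e.g. a balance sheet) are easy to prepare.

employees = [  # invented records for illustration
    {"code": "52-53-162", "category": "A", "name": "R. Rao"},
    {"code": "52-53-007", "category": "B", "name": "S. Iyer"},
    {"code": "52-53-045", "category": "A", "name": "M. Khan"},
]

# Classification: place records with similar characteristics together.
by_category = {}
for emp in employees:
    by_category.setdefault(emp["category"], []).append(emp)

# Sorting by identification code within each class.
for cat in by_category:
    by_category[cat].sort(key=lambda e: e["code"])

print(sorted(by_category))                    # -> ['A', 'B']
print([e["code"] for e in by_category["A"]])  # -> ['52-53-045', '52-53-162']
```

Because every record carries a code, arranging and summarising by account number or date becomes a simple sort rather than a manual search.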
MANIPULATION
Once data is collected and converted, it is ready for the manipulation function, which converts data into information. Manipulation consists of the following activities:
Sorting
Calculating
Summarizing
Comparing
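The four manipulation activities can each be shown on a small set of monthly sales figures; the figures themselves are invented for illustration.

```python
# Sorting, calculating, summarizing and comparing, applied to
# invented monthly sales data (raw data -> information).

sales = [("Mar", 120), ("Jan", 95), ("Feb", 180)]  # invented figures

# Sorting: arrange data in a chosen order.
by_amount = sorted(sales, key=lambda rec: rec[1])

# Calculating: derive new values from the data.
total = sum(amount for _, amount in sales)

# Summarizing: reduce many data to a compact result.
average = total / len(sales)

# Comparing: relate one datum to another.
best_month = max(sales, key=lambda rec: rec[1])[0]

print(by_amount[0][0], total, round(average, 1), best_month)
# -> Jan 395 131.7 Feb
```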
Once data has been captured and manipulated, the following activities may be carried out:
Storing
To store is to hold data for continued or later use. Storage is essential for any organised method of processing and re-using data. The storage mechanisms for data processing systems are file cabinets in a manual system and electronic devices such as magnetic disks or magnetic tapes in a computer-based system. The storing activity involves storing data and information in an organised manner in order to facilitate the retrieval activity. Of course, data should be stored only if the value of having it in future exceeds the storage cost.
Retrieving
COMMUNICATION
1. Collection
Collection is the first stage of the cycle, and is very crucial, since the
quality of data collected will impact heavily on the output. The collection process needs
to ensure that the data gathered are both defined and accurate, so that subsequent
decisions based on the findings are valid. This stage provides both the baseline from
which to measure, and a target on what to improve.
2. Preparation
3. Input
Input is the task where verified data is coded or converted into machine-readable form so that it can be processed by a computer. Data entry is done through the use of a keyboard, digitizer, scanner, or data entry from an existing source. This time-consuming process requires speed and accuracy. Most data need to follow a formal and strict syntax, since a great deal of processing power is required to break down the complex data at this stage. Due to the costs, many businesses are resorting to outsourcing this stage.
4. Processing
6. Storage
Storage is the last stage in the data processing cycle, where data, instructions and information are held for future use. The importance of this stage is that it allows quick access and retrieval of the processed information, which can be passed on to the next stage directly when needed. Every computer uses storage to hold system and application software.
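One pass through the cycle's stages can be strung together in a minimal sketch. The raw records, the validation rule, and the dict-based "store" are all assumptions made for illustration, not part of any real system.

```python
# Minimal sketch of one pass through the data processing cycle:
# collection -> preparation -> input -> processing -> storage.

raw = ["  42 ", "17", "oops", " 8"]  # collection: raw, captured inputs

prepared = [r.strip() for r in raw]               # preparation: clean the data
validated = [r for r in prepared if r.isdigit()]  # drop invalid entries

entered = [int(r) for r in validated]  # input: machine-readable form
result = sum(entered)                  # processing: data -> information

store = {"total": result}  # storage: hold the result for future use
print(store)  # -> {'total': 67}
```

Each stage hands a cleaner, more meaningful form of the data to the next, which is exactly why a weak collection stage degrades everything downstream.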
Data processing is any operation performed upon data: for example, compiling the list of participants of a meeting, recording data collected during entrances to and exits from the premises, collecting information on contractor company directors or persons holding insurance, recording customer contact information in shops, etc.
While processing personal data, the following principles shall be applied:
Fair and Lawful Processing - Personal data shall be processed fairly and lawfully, without violating the dignity of data subjects;
Accuracy of the Data - Personal data must be accurate and kept up to date. The data controller shall take the necessary measures to ensure data accuracy; the credibility of the source from which data is obtained shall be verified, and inaccurate and invalid data shall be erased;
Storage of the Data - Personal data may only be stored for as long as necessary to achieve the purpose for which they were collected or further processed.
Personal data can only be processed if one of the following grounds exists:
Consent of the Data Subject - data may be processed if the data subject has given unambiguous, freely given, explicit and informed consent to the processing of his/her personal data for specific purposes and to a specific extent;
Legal Obligations - when the data processing is necessary for the purposes of fulfilling the obligations of the data controller prescribed by the legislation;
Vital Interest - when data processing is necessary to protect the vital interests of the data subject;
Protection of the Legal Interest - the data may be processed for the protection of the legal interest of the data controller or a third party, except where such interest is overridden by the interest of protecting the fundamental rights and freedoms of the data subject;
Publicity of the Data - covers situations where publicity of the data is prescribed by law or where the data subject makes the data publicly available;
Handling of an Application - covers situations where the data is processed for the purposes of providing a service to the data subject or dealing with his/her application.
Processing of Sensitive Data
It is prohibited to process sensitive personal data except in the following cases:
The Written Consent of the Data Subject - the consent shall be explicitly provided for in a document, and the free will of the data subject to have his/her specific data processed for specific purposes shall be specified;
Publicity of the Data - if the data subject has made the data public without evidently or explicitly restricting their use;
Vital Interest - if the processing is necessary to protect the vital interests of the data subject or of another person where the data subject is physically or legally incapable of giving his or her consent.
INTRODUCTION
Amazon has separate retail websites for the United States, the United Kingdom and Ireland, France, Canada, Germany, the Netherlands, Italy, Spain, Australia, Brazil, Japan,
China, India and Mexico. Amazon also offers international shipping to certain other
countries for some of its products. In 2017, it had professed an intention to launch its
websites in Poland and Sweden.
HISTORY
The company was founded in 1994, spurred by what Bezos called his "regret
minimization framework", which described his efforts to fend off any regrets for not
participating sooner in the Internet business boom during that time. In 1994, Bezos left
his employment as vice-president of D. E. Shaw & Co., a Wall Street firm, and moved to
Seattle. He began to work on a business plan for what would eventually become
Amazon.com.
Jeff Bezos incorporated the company as "Cadabra" on July 5, 1994 and the site
went online as Amazon.com in 1995. Bezos changed the name from cadabra.com to amazon.com because "Cadabra" sounded too much like "cadaver". Additionally, a name beginning
with "A" was preferential due to the probability it would occur at the top of any list that
was alphabetized.
Bezos selected the name Amazon by looking through the dictionary, and settled
on "Amazon" because it was a place that was "exotic and different" just as he planned for
his store to be; the Amazon River, he noted, was by far the "biggest" river in the world, and
he planned to make his store the biggest in the world. Bezos placed a premium on his
head start in building a brand, telling a reporter, "There's nothing about our model that
can't be copied over time. But you know, McDonald's got copied. And it still built a huge,
multibillion-dollar company. A lot of it comes down to the brand name. Brand names are
more important online than they are in the physical world."
After reading a report about the future of the Internet which projected annual
Web commerce growth at 2,300%, Bezos created a list of 20 products which could be
marketed online. He narrowed the list to what he felt were the five most promising
products which included: compact discs, computer hardware, computer software, videos,
and books. Bezos finally decided that his new business would sell books online, due to
the large world-wide demand for literature, the low price points for books, along with the
huge number of titles available in print. Amazon was originally founded in Bezos' garage
in Bellevue, Washington.
Since June 19, 2000, Amazon's logotype has featured a curved arrow leading
from A to Z, representing that the company carries every product from A to Z, with the
arrow shaped like a smile.
Amazon's initial business plan was unusual; it did not expect to make a profit
for four to five years. This "slow" growth caused stockholders to complain about the
company not reaching profitability fast enough to justify investing in, or to even survive
in the long-term. When the dot-com bubble burst at the start of the 21st century,
destroying many e-companies in the process, Amazon survived, and grew on past the
bubble burst to become a huge player in online sales. It finally turned its first profit in the
fourth quarter of 2001: $5 million (i.e., 1¢ per share), on revenues of more than $1
billion. This profit margin, though extremely modest, proved to skeptics that Bezos'
unconventional model could succeed. In 1999, Time magazine named Bezos the Person
of the Year, recognizing the company's success in popularizing online shopping.
Barnes & Noble sued Amazon on May 12, 1997, alleging that Amazon's
claim to be "the world's largest bookstore" was false. Barnes and Noble asserted, "[It]
isn't a bookstore at all. It's a book broker." The suit was later settled out of court, and
Amazon continued to make the same claim. Wal-Mart sued Amazon on October 16,
1998, alleging that Amazon had stolen Wal-Mart's trade secrets by hiring former Wal-Mart executives. Although this suit was also settled out of court, it caused Amazon to implement internal restrictions and reassign the former Wal-Mart executives.
COMPANY VISION & MISSION
Below is the Amazon vision statement, which details the company's outlook for the future. Effective and successful statements are powerful and compelling, conveying confidence and inspiring views of the future. The importance of these types of statements should not be underestimated: one good paragraph can describe an organisation's values, services and vision for the future.
Our vision is to be earth's most customer centric company; to build a place where
people can come to find and discover anything they might want to buy online.
Amazon.com has had a clear focus and a solitary mission since it began. Founder Jeff
Bezos has publicly referred to the Amazon.com mission statement as the guiding force
behind his leadership decisions many times in the company's 18-year history. It can be
concluded that the success of Amazon.com as the top Internet retailing company in the
world is due at least in part to their unwavering commitment to this mission and the daily
execution of it.
STAGES OF DATA PROCESSING IN AMAZON.COM
Amazon EMR securely and reliably handles a broad set of big data use cases,
including log analysis, web indexing, data transformations (ETL), machine learning,
financial analysis, scientific simulation, and bioinformatics.
Athena is easy to use. Simply point to your data in Amazon S3, define the
schema, and start querying using standard SQL. Most results are delivered within
seconds. With Athena, there's no need for complex ETL jobs to prepare your data for
analysis. This makes it easy for anyone with SQL skills to quickly analyze large-scale
datasets.
With Amazon Athena, you pay only for the queries that you run. You are charged
$5 per terabyte scanned by your queries. You can save from 30% to 90% on your per-
query costs and get better performance by compressing, partitioning, and converting your
data into columnar formats. Athena queries data directly in Amazon S3. There are no
additional storage charges beyond S3.
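The per-query pricing above can be turned into a quick cost estimate. The 30% to 90% savings figure corresponds directly to the reduction in bytes scanned; the sample scan sizes below are invented for illustration.

```python
# Athena pricing sketch: $5 per terabyte scanned. Compressing or
# converting data to a columnar format saves money because the same
# query scans fewer bytes. The example scan sizes are invented.

PRICE_PER_TB = 5.00  # USD per TB scanned, per the pricing above

def query_cost(tb_scanned):
    """Cost of one query, given terabytes of data scanned."""
    return tb_scanned * PRICE_PER_TB

raw_cost = query_cost(2.0)       # e.g. 2 TB of raw CSV scanned
columnar_cost = query_cost(0.2)  # same query over columnar data, 90% fewer bytes

savings = 1 - columnar_cost / raw_cost
print(raw_cost, columnar_cost, f"{savings:.0%}")  # -> 10.0 1.0 90%
```

Because billing is purely per byte scanned, partitioning and columnar formats pay off twice: lower cost and faster queries.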
With Amazon Kinesis Firehose, you only pay for the amount of data you
transmit through the service. There is no minimum fee or setup cost.
Object Storage
Amazon Simple Storage Service (Amazon S3) is object storage with a simple web service interface to store and retrieve any amount of data from anywhere on the web. It is designed to deliver 99.999999999% durability and to scale past trillions of objects worldwide.
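The eleven-nines durability figure can be made concrete with a back-of-the-envelope calculation, reading it (as AWS does) as an annual per-object durability; the bucket size below is an assumption for illustration.

```python
# 99.999999999% durability, read as an annual per-object survival
# probability: expected annual object loss for a large bucket.

durability = 0.99999999999
annual_loss_prob = 1 - durability  # about 1e-11 per object per year

objects = 10_000_000  # assumed bucket size for illustration
expected_losses_per_year = objects * annual_loss_prob
years_per_single_loss = 1 / expected_losses_per_year

print(round(years_per_single_loss))  # -> 10000
```

In other words, at this durability level a store of ten million objects would expect to lose a single object roughly once every ten thousand years.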
Graph Databases
NoSQL
Amazon DynamoDB is a fast and flexible NoSQL database service for all applications that need consistent, single-digit-millisecond latency at any scale. It is a fully managed cloud database and supports both document and key-value store models. Its flexible data model and reliable performance make it a great fit for mobile, web, gaming, ad tech, IoT, and many other applications. Start today by downloading the local version of DynamoDB, then read the Getting Started Guide.
Relational Databases
Amazon Aurora
DATA WAREHOUSING
Amazon Redshift
AMAZON EC2
DATA MOVEMENT
Direct Connectivity
Database Migration
CONCLUSION
As more and more data is generated and collected, data processing requires scalable, flexible, and high-performing tools to provide insights in a timely fashion. However, organizations are facing a growing big data ecosystem where new tools emerge and die very quickly. Therefore, it can be very difficult to keep pace and choose the right tools.
This whitepaper offers a first step to help you solve this challenge. With a broad set of managed services to collect, process, and analyze data, the AWS platform makes it easier to build, deploy, and scale big data applications, allowing you to focus on business problems instead of updating and managing these tools.
AWS provides many solutions to address big data processing requirements. Most
big data architecture solutions use multiple AWS tools to build a complete solution: this
can help meet the stringent business requirements in the most cost-optimized, performant,
and resilient way possible. The result is a flexible, big data architecture that is able to
scale along with your business on the AWS global infrastructure.
WEBLOGRAPHY
http://aws.amazon.com/solutions/case-studies/big-data/
https://aws.amazon.com/kinesis/streams
http://docs.aws.amazon.com/kinesis/latest/APIReference/Welcome.html
http://docs.aws.amazon.com/aws-sdk-php/v2/guide/amazon-service-kinesis
https://personaldata.ge/en/for-public-bodies/monatsemta-damushavebis-printsipebi-da-safudzvlebi
www.google.com
www.google.co/data/processing-amazon.co.in