
Measurement is the process of observing and recording the observations that are collected as part of a research effort.

There are two major issues that will be considered here.


First, you have to understand the fundamental ideas involved in measuring. Here we consider two of the major
measurement concepts. In Levels of Measurement, I explain the meaning of the four major levels of measurement:
nominal, ordinal, interval and ratio. Then we move on to the reliability of measurement, including consideration of
true score theory and a variety of reliability estimators.
Second, you have to understand the different types of measures that you might use in social research. We consider
four broad categories of measurement. Survey research includes the design and implementation of interviews and
questionnaires. Scaling involves consideration of the major methods of developing and implementing a scale.
Qualitative research provides an overview of the broad range of non-numerical measurement approaches. And
unobtrusive measures presents a variety of measurement methods that don't intrude on or interfere with the context
of the research.
http://www.socialresearchmethods.net/kb/measure.php

Writing SI units and symbols

This note explains how to write quantities and units of the Système international d'unités (SI), colloquially
known as the metric system. I catalog the power-of-ten prefixes, and I list some important units.

Write a numeric value with units in either the journalistic style, using prefix and unit names (four kilohertz);
or the scientific style, using prefix and unit symbols (4 kHz). Don't mix these styles: Do not mix a prefix name
with a unit symbol (WRONG: kiloHz), or a prefix symbol with a unit name (WRONG: kHertz). Avoid
"abbreviations" for units (WRONG: sec., amp); use the unit names or symbols instead.

If you are writing for an international audience, express values in the metric (SI) system used by the majority
of the world's population. If appropriate, follow an SI value with the equivalent Imperial value in
parentheses. Express the Imperial value with an accuracy comparable to the original: write 5 m (16 feet), not
5 m (16.4042 feet). Spell out inch, foot, pound and so on: Do not abbreviate to in, ft, and lb unless space is an
overriding concern. Do not use " and ' symbols for inch and foot: These symbols are easily lost in
reproduction, and they are unfamiliar to a large fraction of the world's population.
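
The "comparable accuracy" rule can be mechanized by rounding the converted value to a similar
number of significant figures. A minimal sketch under that assumption (metres_to_feet_str is a
hypothetical helper, not from the note; the foot is defined as exactly 0.3048 m):

# Minimal sketch: express an SI length with an Imperial equivalent of comparable accuracy.

def metres_to_feet_str(metres: float, sig_figs: int = 2) -> str:
    """Format e.g. 5 m as '5 m (16 feet)' rather than '5 m (16.4042 feet)'."""
    feet = metres / 0.3048                     # the foot is exactly 0.3048 m
    rounded = float(f"{feet:.{sig_figs}g}")    # keep only a comparable accuracy
    return f"{metres:g} m ({rounded:g} feet)"

print(metres_to_feet_str(5))    # 5 m (16 feet)
print(metres_to_feet_str(100))  # 100 m (330 feet)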
http://www.poynton.com/notes/units/index.html

The word "measurement" is derived from the Greek word "metron" which means a limited proportion.

Measuring devices

• Calendar (by counting days)
• Clock
• Egg timer
• Hourglass
• Pendulum clock
• Stopwatch
• Transit telescope
• Altimeter (height)
• Architect's scale
• Electronic distance meter
• Engineer's scale
• GPS
• Rule
• Surveyor's wheel
• Tape measure
• Taximeter
• Measuring cup
• Gas meter
• Metering pump
• Water meter
• Weighing scales
• Spring scale
• Ballistic pendulum
• Anemometer (wind speed)
• Barometer (atmospheric pressure)
• Protractor

http://en.wikipedia.org/wiki/Measuring_instrument

The grain was the earliest unit of mass and is the smallest unit in the apothecary, avoirdupois, Tower, and
troy systems. The early unit was a grain of wheat or barleycorn used to weigh the precious metals silver
and gold. Larger units, preserved in stone standards, were developed and used both as units of mass
and as monetary currency. The pound was derived from the mina used by ancient civilizations. A smaller
unit was the shekel, and a larger unit was the talent. The magnitude of these units varied from place to
place. The Babylonians and Sumerians had a system in which there were 60 shekels in a mina and 60
minas in a talent. The Roman talent consisted of 100 librae (pounds), each smaller in magnitude than
the mina. The troy pound (~373.2 g), used in England and the United States for monetary purposes, was,
like the Roman pound, divided into 12 ounces, but the Roman uncia (ounce) was smaller. The carat, a
unit for measuring gemstones, had its origin in the carob seed; it was later standardized at 1/144
ounce and then at 0.2 gram.

Units of measurement were among the earliest tools invented by humans. Primitive societies needed
rudimentary measures for many tasks: constructing dwellings of an appropriate size and shape, fashioning
clothing, or bartering food or raw materials.

HISTORY OF MEASUREMENT
Length
Length is the most necessary measurement in everyday life, and units of
length in many countries still reflect humanity's first elementary methods.

The inch is a thumb. The foot speaks for itself. The yard relates closely to a
human pace, but also derives from two cubits (the measure of the forearm).
The mile is in origin the Roman mille passus - a 'thousand paces' -
approximating to the modern mile because the Romans define a pace as two steps,
bringing the walker back to the same foot. With measurements such as these,
it is easy to explain how far away the next village is and to work out whether
an object will get through a doorway.

For the complex measuring problems of civilization - surveying land to
register property rights, or selling a commodity by length - a more precise
unit is required.

The solution is a rod or bar, of an exact length, kept in a central public place.
From this 'standard' other identical rods can be copied and distributed
through the community. In Egypt and Mesopotamia these standards are kept
in temples. The basic unit of length in both civilizations is the cubit, based on
a forearm measured from elbow to tip of middle finger. When a length such
as this is standardized, it is usually the king's dimension which is first taken
as the norm.

Weight
For measurements of weight, the human body provides no such easy
approximations as for length. But nature steps in. Grains of wheat are
reasonably standard in size. Weight can be expressed with some degree of
accuracy in terms of a number of grains - a measure still used by jewellers.

As with measurements of length, a lump of metal can be kept in the temples
as an official standard for a given number of grains. Copies of this can be
cast and weighed in the balance for perfect accuracy. But it is easier to
deceive a customer about weight, and metal can all too easily be removed to
distort the scales. An inspectorate of weights and measures is from the start a
practical necessity, and has remained so.

Volume
Among the requirements of traders or tax collectors, a reliable standard of
volume is the hardest to achieve. Nature provides some very rough averages,
such as goatskins. Baskets, sacks or pottery jars can be made to
approximately consistent sizes, sufficient perhaps for many everyday
transactions.

But where the exact amount of any commodity needs to be known, weight is
the measure more likely to be relied upon than volume.

Time
Time, a central theme in modern life, has for most of human history been
thought of in very imprecise terms.

The day and the week are easily recognized and recorded - though an
accurate calendar for the year is hard to achieve. The forenoon is easily
distinguishable from the afternoon, provided the sun is shining, and the
position of the sun in the landscape can reveal roughly how much of the day
has passed. By contrast the smaller parcels of time - hours, minutes and
seconds - have until recent centuries been both unmeasurable and unneeded.

Sundial and water clock: from the 2nd millennium BC


The movement of the sun through the sky makes possible a simple estimate
of time, from the length and position of a shadow cast by a vertical stick. (It
also makes possible more elaborate calculations, as in the attempt of
Eratosthenes to measure the world). If marks are made where the sun's
shadow falls, the time of day can be recorded in a consistent manner.

The result is the sundial. An Egyptian example survives from about 800 BC,
but the principle is certainly familiar to astronomers very much earlier.
However it is difficult to measure time precisely on a sundial, because the
sun's path through the sky changes with the seasons. Early attempts at
precision in time-keeping rely on a different principle.

The water clock, known from a Greek word as the clepsydra, attempts to
measure time by the amount of water which drips from a tank. This would be
a reliable form of clock if the flow of water could be perfectly controlled. In
practice it cannot. The clepsydra has an honourable history from perhaps
1400 BC in Egypt, through Greece and Rome and the Arab civilizations and
China, and even up to the 16th century in Europe. But it is more of a toy than
a timepiece.

The hourglass, using sand on the same principle, has an even longer career. It
is a standard feature on 18th-century pulpits in Britain, ensuring a sermon of
sufficient length. In a reduced form it can still be found timing an egg.

Hero's dioptra: 1st century AD


One of the surviving books of Hero of Alexandria, entitled On the Dioptra,
describes a sophisticated technique which he has developed for the surveying
of land. Plotting the relative position of features in a landscape, essential for
any accurate map, is a more complex task than simply measuring distances.

It is necessary to discover accurate angles in both the horizontal and vertical
planes. To make this possible a surveying instrument must somehow
maintain both planes consistently in different places, so as to take readings of
the deviation in each plane between one location and another.

This is what Hero achieves with the instrument mentioned in his title, the
dioptra - meaning, approximately, the 'spyhole' through which the surveyor
looks when pinpointing the target in order to read the angles.

Hero adapts, for this new and difficult task, an instrument long used by Greek
astronomers (such as Hipparchus) for measuring the angle of stars in the sky.
It is evident from his description that the dioptra differs from the modern
theodolite in only two important respects. It lacks the added convenience of
two inventions not available to Hero - the compass and the telescope.

The hour: 14th century AD
Until the arrival of clockwork, in the 14th century AD, an hour is a variable
concept. It is a practical division of the day into 12 segments (12 being the
most convenient number for dividing into fractions, since it is divisible by 2,
3 and 4). For the same reason 60, divisible by 2, 3, 4 and 5, has been a larger
framework of measurement ever since Babylonian times.

The traditional concept of the hour, as one twelfth of the time between dawn
and dusk, is useful in terms of everyday timekeeping. Approximate
appointments are easily made, at times which are easily sensed. Noon is
always the sixth hour. Halfway through the afternoon is the ninth hour -
famous to Christians as the time of the death of Jesus on the Cross.

The trouble with the traditional hour is that it differs in length from day to
day. And a daytime hour is different from one in the night (also divided into
twelve equal hours). A clock cannot reflect this variation, but it can offer
something more useful. It can provide every day something which occurs
naturally only twice a year, at the spring and autumn equinox, when the 12
hours of day and the 12 hours of night are the same length.

In the 14th century, coinciding with the first practical clocks, the meaning of
an hour gradually changes. It becomes a specific amount of time, one
twenty-fourth of a full solar cycle from dawn to dawn. And the day is now
thought of as 24 hours, though it still features on clock faces as two twelves.

Minutes and seconds: 14th - 16th century AD


Even the first clocks can measure periods less than an hour, but soon striking
the quarter-hours seems insufficient. With the arrival of dials for the faces of
clocks, in the 14th century, something like a minute is required. The Middle
Ages, by a tortuous route from Babylon, inherit a scale of scientific
measurement based on 60. In medieval Latin the unit of one sixtieth is pars
minuta prima ('first very small part'), and a sixtieth of that is pars minuta
secunda ('second very small part'). Thus, on a principle 3000 years old,
minutes and seconds find their way into time.
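
In modern terms (a small arithmetical gloss, not part of the original account), the inherited
sexagesimal scheme gives:

1 hour = 60 minutes = 60 × 60 seconds = 3600 seconds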

Minutes are mentioned from the 14th century, but clocks are not precise
enough for anyone to bother about seconds until two centuries later.

Barometer and atmospheric pressure: AD 1643-1646


Like many significant discoveries, the principle of the barometer is observed
by accident. Evangelista Torricelli, assistant to Galileo in the last years of Galileo's life,
is interested in why it is more difficult to pump water from a well in which
the water lies far below ground level. He suspects that the reason may be the
weight of the extra column of air above the water, and he devises a way of
testing this theory.

He fills a glass tube, open at only one end, with mercury. Submerging the
open end in a bath of mercury, and raising the tube to a vertical position, he
finds that the mercury slips a little way down the tube. He reasons that the
weight of air on the mercury in the bath is supporting the weight of the
column of mercury in the tube.

If this is true, then the space in the glass tube above the mercury column
must be a vacuum. This plunges him into instant controversy with
traditionalists, wedded to the ancient theory - going as far back as Aristotle -
that 'nature abhors a vacuum'. But it also encourages von Guericke, in the
next decade, to develop the vacuum pump.

The concept of variable atmospheric pressure occurs to Torricelli when he
notices, in 1643, that the height of his column of mercury sometimes varies
slightly from its normal level, which is 760 mm above the mercury level in
the bath. Observation suggests that these variations relate closely to changes
in the weather. The barometer is born.
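
As a quick consistency check (a modern addition, not part of Torricelli's account), the pressure
exerted by a 760 mm mercury column follows from the hydrostatic relation, taking mercury's density
as roughly 13,595 kg/m³ and g as 9.81 m/s²:

P = ρgh ≈ 13,595 kg/m³ × 9.81 m/s² × 0.760 m ≈ 1.013 × 10⁵ Pa ≈ 101.3 kPa

which matches the modern value of one standard atmosphere.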

With the weight of air thus established, Torricelli is able to predict that there
must be less atmospheric pressure at higher altitudes. It is not hard to
imagine an experiment which would test this, but the fame for proving the
point in 1646 attaches to Blaise Pascal - though it is not even he who carries
out the research.

Having a weak constitution, Pascal persuades his more robust brother-in-law
to carry a barometer to different levels of the 4000-foot Puy de Dôme, near
Clermont, and to take readings. The brother-in-law descends from the
mountain with the welcome news that the readings were indeed different.
Atmospheric pressure varies with altitude.

Mercury thermometer: AD 1714-1742


Gabriel Daniel Fahrenheit, a German glass-blower and instrument-maker
working in Holland, is interested in improving the design of the thermometer
which has been in use for half a century. Known as the Florentine
thermometer, because developed in the 1650s in Florence's Accademia del
Cimento, this pioneering instrument depends on the expansion and
contraction of alcohol within a glass tube.

Alcohol expands rapidly with a rise in temperature, but not at an entirely
regular speed of expansion. This makes accurate readings difficult, as also
does the sheer technical problem of blowing glass tubes with very narrow
and entirely consistent bores.

By 1714 Fahrenheit has made great progress on the technical front, creating
two separate alcohol thermometers which agree precisely in their reading of
temperature. In that year he hears of the researches of a French physicist,
Guillaume Amontons, into the thermal properties of mercury.

Mercury expands less than alcohol (about one seventh as much for the same rise
in temperature), but it does so in a more regular manner. Fahrenheit sees the
advantage of this regularity, and he has the glass-making skills to
accommodate the smaller rate of expansion. He constructs the first mercury
thermometer, of a kind which subsequently becomes standard.

There remains the problem of how to calibrate the thermometer to show
degrees of temperature. The only practical method is to choose two
temperatures which can be independently established, mark them on the
thermometer and divide the intervening length of tube into a number of equal
degrees.

In 1701 Newton has proposed the freezing point of water for the bottom of
the scale and the temperature of the human body for the top end. Fahrenheit,
accustomed to Holland's cold winters, wants to include temperatures below
the freezing point of water. He therefore accepts blood temperature for the
top of his scale but adopts the freezing point of salt water for the lower
extreme.

Measurement is conventionally done in multiples of 2, 3 and 4, so Fahrenheit
splits his scale into 12 sections, each of them divided into 8 equal parts. This
gives him a total of 96 degrees, zero being the freezing point of brine and 96°
(in his somewhat inaccurate reading) the average temperature of human
blood. With his thermometer calibrated on these two points, Fahrenheit can
take a reading for the freezing point (32°) and boiling point (212°) of water.

A more logical Swede, Anders Celsius, proposes in 1742 an early example of
decimalization. His centigrade scale takes the freezing and boiling
temperatures of water as 0° and 100°. In English-speaking countries this less
complicated system takes more than two centuries to prevail.
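
For reference (a modern addition), the two scales are related by a fixed linear conversion,
pinned down by the freezing and boiling points of water (32 °F = 0 °C and 212 °F = 100 °C):

C = (F − 32) × 5/9    and    F = C × 9/5 + 32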

Read more: http://www.historyworld.net/wrldhis/PlainTextHistories.asp?historyid=ac07
