COATING & PAINTING INSPECTION

AMBIENT CONDITION
Before initiating surface preparation or coating operations, the temperatures (air and surface),
dew point, relative humidity, and wind velocity must be checked to ensure that they conform to
specification requirements. SSPC-PA 1 provides information on proper conditions for shop and
field painting. Since ambient and steel temperatures may change quickly, they should be
measured periodically throughout the day. Per ASTM E 337, ambient (environmental) condition tests
should be performed before, during, and after application, and conditions must be monitored at
least every four hours, or more frequently when conditions are unstable.

TEMPERATURE
The application of a coating system shall occur only when the air and substrate temperatures are
within the range indicated by the manufacturer's written instructions for both application and
curing. As a rule of thumb, no work shall be done when the air temperature is below 5°C or the
surface temperature is less than 3°C above the dew point temperature.

DEWPOINT
Dew point is defined as the temperature at which moisture will condense. Dew point is important
in coating work because moisture condensing on the steel surface will cause freshly blast-cleaned
steel to rust, and a thin, often invisible film of moisture trapped between coats may cause
premature coating failure.
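Dew point can be estimated from air temperature and relative humidity. The sketch below uses the Magnus approximation (the formula and its coefficients are standard meteorological values, assumed here, not taken from this text):

```python
import math

def dew_point_c(air_temp_c, relative_humidity_pct):
    """Approximate dew point (deg C) from air temperature and relative
    humidity using the Magnus formula (b = 17.62, c = 243.12 deg C)."""
    b, c = 17.62, 243.12
    gamma = (b * air_temp_c) / (c + air_temp_c) + math.log(relative_humidity_pct / 100.0)
    return (c * gamma) / (b - gamma)

# Example: 25 C air at 60% RH gives a dew point near 16.7 C, so a steel
# surface cooler than about 19.7 C (dew point + 3 C) should not be coated.
print(round(dew_point_c(25.0, 60.0), 1))  # prints 16.7
```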

RELATIVE HUMIDITY
Because the curing of coatings may be adversely affected by humidity that is too low or too high,
no coating shall be applied unless the supplier's or manufacturer's written technical requirements
for humidity are met. High humidity may cause moisture to condense on or react with uncured
coating films, causing blushing or other adverse effects. Certain coatings, however, such as
inorganic zinc and one-package moisture-curing polyurethanes, require a minimum humidity for
curing. For most organic coatings, the rule of thumb is that no work shall be carried out when
the relative humidity is above 85%.

WIND VELOCITY
In field or open-air application, wind may blow airborne contaminants onto work surfaces and into
coating materials. It also contributes to dry spray, producing a dusty, spotted effect on the
coated surface, and accelerates solvent evaporation, which may cause premature drying. No work
shall be done in the open air when the wind velocity is above 24 km/hour.
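The rule-of-thumb limits above (air temperature, dew point spread, relative humidity, and wind) can be combined into a simple go/no-go check. The function below is illustrative only; project specifications and the manufacturer's data sheet govern the actual limits:

```python
def ambient_ok(air_c, surface_c, dew_point_c, rh_pct, wind_kmh):
    """Apply the rule-of-thumb ambient limits from this section and return
    a list of violations; an empty list means conditions permit coating."""
    problems = []
    if air_c < 5.0:
        problems.append("air temperature below 5 C")
    if surface_c < dew_point_c + 3.0:
        problems.append("surface less than 3 C above dew point")
    if rh_pct > 85.0:
        problems.append("relative humidity above 85%")
    if wind_kmh > 24.0:
        problems.append("wind velocity above 24 km/h")
    return problems

# Surface only 1.5 C above dew point -> one violation reported
print(ambient_ok(22.0, 20.0, 18.5, 80.0, 10.0))
```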

AMBIENT TEST INSTRUMENTS
1. A surface magnetic thermometer is used to measure steel substrate temperature. It must be
allowed to stabilize on the surface to be measured for at least 5 minutes, must be used at the
actual work location, should be kept out of direct sunlight, and must be calibrated often.
2. A sling psychrometer is used to measure wet- and dry-bulb temperatures. These readings are
then used to calculate dew point and relative humidity (some newer instruments include
dew point and relative humidity scales).
3. A dew point calculator is used to determine the dew point temperature and relative humidity.
Before using this instrument, wet- and dry-bulb data must first be obtained from the sling
psychrometer.
4. An anemometer is used to measure wind velocity.
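As a rough illustration of what the sling psychrometer and dew point calculator do, relative humidity can be estimated from wet- and dry-bulb readings with the standard psychrometric equation. The psychrometer coefficient and Magnus constants below are textbook values, assumed here rather than taken from this document:

```python
import math

def saturation_vp_hpa(t_c):
    """Saturation vapour pressure (hPa), Magnus approximation."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def relative_humidity_pct(dry_c, wet_c, pressure_hpa=1013.25):
    """Relative humidity from sling-psychrometer readings via the
    psychrometric equation, A = 6.62e-4 per C (ventilated bulb)."""
    a = 6.62e-4
    vapour_pressure = saturation_vp_hpa(wet_c) - a * pressure_hpa * (dry_c - wet_c)
    return 100.0 * vapour_pressure / saturation_vp_hpa(dry_c)

# Dry bulb 25 C, wet bulb 20 C -> roughly 63% relative humidity
print(round(relative_humidity_pct(25.0, 20.0), 1))
```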
PRE-SURFACE PREP. INSPECTION
Before the start of surface preparation for coating, all necessary construction or modification of
items requiring coating should have been completed. This includes grinding of welds and sharp
edges and filling of pits. Likewise, the surface must be free from all contaminants. Also, the job
site must then be inspected for complete readiness (i.e., all required operational and support
equipment is present, and access for inspection of work is available). This includes safety aspects
such as ladders and scaffolding, power, and traffic control, so that the inspector can safely
perform his duties.

ABRASIVE CHECK
All new mineral and slag abrasives must be inspected for physical and chemical properties as
described in SSPC AB 1. Recycled ferrous metal abrasives must be checked for cleanliness and
fines as described in SSPC AB 2. The abrasives should be properly labeled for identification.
Even if a sieve analysis (ASTM C 136) is provided by the supplier, it is prudent to run a check at
the job site or retain a sample for later analysis should cleaning rates be lower or profile heights
other than anticipated.
A simple test can be conducted for contaminants or fines in the abrasive. A spoonful of abrasive
is placed in a vial of distilled water and shaken vigorously. It is then checked for:
Oil or grease that forms a surface sheen
Fines suspended in or at the surface of the water
Color or turbidity from dirt
Soluble salts by conductivity or deposition upon evaporation
Acidity or alkalinity with pH paper

BLASTING EQUIPMENT CHECK
All air compressors and blasting equipment should be checked for proper size, cleanliness,
operation, and safety. Hand or power tools should also be checked for operation and safety, and
should be used only as specified in their standard operating procedures. These checks should be
made before the start of abrasive blasting and periodically thereafter, especially after a change of
abrasive. Air and blast hoses should be checked for damage and constrictions and should be as
short and of as large a diameter as practical to reduce frictional losses of air pressure. The blast
hose should have a static grounding system. Couplings should be of the external fit type, secured
well, and safety-wired.
Blast nozzles should be of the venturi type, with a flared exit to allow more rapid and uniform
cleaning. An orifice gauge should be used to check the nozzle size (inches) and air flow (cfm at
100 psi). This wedge-shaped instrument, inserted into the rear of the nozzle, has a
measuring range of 1/4 to 5/8 inch and an air flow range of 81 to 548 cfm. Nozzles should be
discarded after an increase of one size (e.g., 1/16 inch is the difference between a #6 and a #7
nozzle). All nozzles must have a deadman control that will automatically shut off the flow of air
and abrasive when released.
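The numbering convention for blast nozzles (the number gives the orifice diameter in sixteenths of an inch) and the one-size wear limit above can be sketched as follows; the function names are illustrative:

```python
def nozzle_orifice_in(nozzle_number):
    """Blast nozzle numbers give the orifice diameter in 1/16-inch units,
    e.g. a #6 nozzle has a 6/16 = 3/8-inch orifice."""
    return nozzle_number / 16.0

def nozzle_worn_out(original_number, measured_orifice_in):
    """A nozzle should be discarded once the orifice has grown by one full
    size (1/16 inch) over its original diameter."""
    return measured_orifice_in >= nozzle_orifice_in(original_number) + 1.0 / 16.0

print(nozzle_orifice_in(6))        # prints 0.375 (3/8 inch)
print(nozzle_worn_out(6, 0.44))    # worn past a #7 orifice: prints True
```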
The compressed air used in abrasive blasting must be checked to determine whether oil and
water traps have completely removed contaminants. This is done by the blotter test described in
ASTM D 4285. A clean, dry, white blotter or cloth is held about 18 inches (450 mm) in front of
the blast nozzle with the air flowing for one to two minutes. Oil and water contaminants are
detected visually on the blotter or cloth surface.
Abrasive blasting is usually done at pressures between 90 and 100 psi for efficient blasting.
Higher blasting pressures may produce even higher blasting rates. A pocket-sized air pressure
gauge with a hypodermic needle can be used for determining cleaning pressure at the nozzle. The
gauge is inserted in the blasting hose just before the nozzle in the direction of the flow. Instant
readings can be made up to 160 psi.
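A widely quoted industry rule of thumb, not stated in the text above, is that each 1 psi of nozzle pressure lost below the target costs roughly 1.5% of blasting production. A sketch under that assumption:

```python
def blasting_efficiency_pct(nozzle_pressure_psi, target_psi=100.0,
                            loss_per_psi_pct=1.5):
    """Estimate relative cleaning efficiency from measured nozzle pressure.
    The 1.5% loss per psi is an assumed rule-of-thumb figure, not a value
    from this text."""
    shortfall = max(0.0, target_psi - nozzle_pressure_psi)
    return max(0.0, 100.0 - loss_per_psi_pct * shortfall)

# A 10 psi drop at the nozzle costs roughly 15% of production
print(blasting_efficiency_pct(90.0))  # prints 85.0
```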

POST-SURFACE PREP. INSPECTION
Steel surface cleanliness requirements for abrasive blast cleaned steel (i.e., SSPC levels of
surface preparation) can readily be determined using SSPC-VIS 1 photographic standards. SSPC
surface preparation standards define cleanliness in terms of visible contaminants such as rust,
mill scale, paint, and staining.
Two commonly used methods for determining the profile (average peak-to-valley depth) of
blasted steel surfaces are described in ASTM D 4417. The Testex Press-O-Film Replica Tape
method is preferred, because it is easy to conduct, accurate, and produces a permanent record.
The tape consists of a layer of deformable plastic foam bonded to a Mylar backing. The tape is
rubbed onto the blast-cleaned surface with a plastic swizzle stick to produce a reverse replicate
of the profile. The tape profile is then measured with a spring micrometer. The micrometer can
be set to automatically subtract the two-mil (50 µm) thickness of the non-deformable Mylar
backing.
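Reading replica tape therefore amounts to subtracting the Mylar backing from the spring micrometer reading (when the micrometer is not already zeroed to do so) and averaging several measurements; a minimal sketch:

```python
def profile_from_replica_tape(micrometer_reading_mils, mylar_mils=2.0):
    """Surface profile from a Press-O-Film reading: subtract the 2-mil
    (50 micrometre) non-deformable Mylar backing."""
    return micrometer_reading_mils - mylar_mils

# Average several raw micrometer readings (mils), as readings are
# normally taken in multiples at each location
readings = [4.4, 4.6, 4.5]
profiles = [profile_from_replica_tape(r) for r in readings]
print(round(sum(profiles) / len(profiles), 2))  # prints 2.5 (mils)
```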
An alternate procedure, in which a surface profile comparator is used, is available for
determining surface profile. Comparators include ISO, Clemtex, and Keane-Tator instruments.
Basically, they use a five-power illuminated magnifier to permit visual comparison of the blast-
cleaned surface to standard profile depths. Standards are available for sand, grit, and shot-blast
cleaned steel.
Another concern is non-visible contaminants such as soluble salts (e.g., chlorides and
sulfates). These salts are deposited from the environment, e.g., marine air and industrial
pollutants. They can cause problems such as flash rusting of steel or blistering of applied paint
films. These contaminants are not removed by abrasive blast cleaning (or other mechanical
methods). A good indication of salt contamination on blast-cleaned steel is the rapid rerusting of
the steel in the absence of condensing moisture.
ASTM D 4940 provides a water extraction test procedure for determining salt concentration.
Extraction methods include swabbing, rigid limpet cell, and Bresle cell procedures. After
extraction, the water is tested for conductivity and/or specific salt ions. Test kits for analysis of
chloride, sulfate, and ferrous ions, as well as pH, are commercially available from suppliers of
coating instruments. They contain strips, swabs, papers, and operating instructions for simple
chemical testing.
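As an illustration of how a Bresle-cell conductivity reading is converted to a surface salt density, the sketch below assumes the common NaCl conversion factor and a standard patch geometry; these values are assumptions, and the test kit's own conversion table governs real work:

```python
def salt_density_mg_per_m2(delta_conductivity_us_cm,
                           water_volume_ml=3.0,
                           patch_area_mm2=1250.0,
                           mg_per_l_per_us_cm=0.5):
    """NaCl-equivalent surface density from the rise in conductivity of the
    extraction water. Defaults assume a standard Bresle patch (1250 mm^2,
    3 ml) and ~0.5 mg/l of NaCl per uS/cm of conductivity."""
    concentration_mg_l = delta_conductivity_us_cm * mg_per_l_per_us_cm
    volume_l = water_volume_ml / 1000.0
    area_m2 = patch_area_mm2 / 1_000_000.0
    return concentration_mg_l * volume_l / area_m2

# A 10 uS/cm conductivity rise corresponds to about 12 mg/m^2 of salt
print(salt_density_mg_per_m2(10.0))
```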
Abrasive blast cleaned steel surfaces should be checked to determine if all the residual abrasive
has been removed by vacuuming, brushing, or blowing. Detection of residual abrasive can be
done by pressing a piece of transparent cellophane (Scotch) tape onto the cleaned steel and then
pulling it off. If any abrasive is visually detected on the piece of tape, further removal of abrasive
is required.
All blasted steel surfaces should be primed as soon as possible after cleaning, and always on the
same day except in dehumidified spaces. If not primed soon enough, particularly on humid days,
flash rusting of the steel may occur. If any flash rusting is observed, the steel must be reblasted.

PRE-COATING INSPECTION
Inspection before coating application consists chiefly of checking:
Coating storage conditions
Mixing procedures
Thinning materials and amounts
Tinting, or color verification
Straining of coatings to remove large particles
Viscosity
Spray equipment check

INSPECTION OF COATING APPLICATION
Inspection during and after coating application consists chiefly of checking for:
Induction time and pot life
Wet and dry film thicknesses
Holidays
Adhesion
Curing
Cosmetic and film defects

INDUCTION TIME AND POT LIFE
For coatings that cure by chemical reaction (thermosetting), the inspector should check to see
that the manufacturer's induction time and pot life requirements are met.

WET FILM THICKNESS
Wet film thickness (WFT) measurements should be made immediately after paint application to
determine if the coating is sufficiently thick to obtain the desired dry film thickness (DFT).
Measurement is less accurate on highly pigmented (e.g., zinc-rich) and quick-dry coatings. Since
measurement of WFT destroys the film integrity, the coating must be repaired after the
measurements have been completed. The most widely used type of WFT gauge, described in
ASTM D 4414, consists of a thin rigid metal notched gauge, usually with four working faces.
Each of the notches in each face is cut progressively deeper in graduated steps. The face with the
scale that encompasses the specified thickness is selected for use.
To conduct the measurement, the face is pressed firmly and squarely into the wet paint
immediately after its application. The face is then carefully removed and examined visually. The
WFT is the highest scale reading of the notches with paint adhering to it. Measurements should
be made in triplicate. Faces of gauges should be kept clean by removing the wet paint
immediately after each measurement.
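The WFT needed to reach a specified DFT follows from the coating's volume solids and any thinner added, using the standard relation WFT = DFT × (100 + % thinner) / % volume solids:

```python
def target_wft_mils(specified_dft_mils, volume_solids_pct, thinner_added_pct=0.0):
    """Wet film thickness required to achieve a specified dry film
    thickness: WFT = DFT x (100 + % thinner added) / % volume solids."""
    return specified_dft_mils * (100.0 + thinner_added_pct) / volume_solids_pct

# A 3-mil DFT with a 60%-volume-solids coating needs 5 mils wet;
# adding 10% thinner raises the target to 5.5 mils
print(target_wft_mils(3.0, 60.0))               # prints 5.0
print(round(target_wft_mils(3.0, 60.0, 10.0), 1))  # prints 5.5
```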

DRY FILM THICKNESS
DFT measurements are made after complete curing of coatings to determine if specified
thicknesses have been met. Calibration of gauges and measurement of DFT by magnetic gauge
are described in detail in SSPC-PA2. Magnetic gauges are normally used for determining coating
DFT on steel surfaces. They rely on the fact that the thicker the coating, the smaller the magnetic
field above the coating. Typical measurement error may be 3 to 10 percent.
There are several factors that adversely affect DFT measurements with magnetic gauges. These
include:
Roughness of steel surface (deeper blasted surfaces result in higher measurements)
Steel composition (high alloy steels may have erroneous measurements)
Thickness of steel (there is a minimum thickness for gauge accuracy)
Curvature of steel surface (measurements may be erroneous)
Surface condition (contaminated coating surfaces may cause high readings; pull-off
magnets may adhere to tacky surfaces; probes may indent soft paints)
Orientation of gauge (must be held perpendicular to surface)
Other magnetic fields (strong magnetic fields from direct current welding or railway
systems may interfere)
All magnetic thickness gauges should be calibrated before use. It is also good practice to check
the calibration during and after use. Gauge suppliers provide a set of standard-thickness,
nonmagnetic (plastic or nonferrous metal) shims to cover their working ranges. The shim for
instrument calibration should be selected to match the desired coating thickness. It is placed on a
bare steel surface with the same profile that will be used for the coating application, and the
gauge probe is placed on it for calibration. If the instrument does not agree with the shim
measure, it should be properly adjusted. If adjustment is difficult, the reading for bare steel can
be added or subtracted from field readings to determine actual thicknesses. The steel surface
used for calibration should be a masked-off area of the steel being painted or an unpainted
reference panel of similar steel, if possible.
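The add-or-subtract correction described above can be sketched as a simple base metal reading (BMR) adjustment; the names here are illustrative:

```python
def corrected_dft(gauge_reading_mils, base_metal_reading_mils):
    """Correct a magnetic-gauge DFT reading by subtracting the base metal
    reading taken over the bare, blast-cleaned profile, for gauges that
    cannot be adjusted directly against the calibration shim."""
    return gauge_reading_mils - base_metal_reading_mils

# A 0.5-mil BMR over the bare profile turns a 5.7-mil gauge reading
# into a corrected coating thickness of 5.2 mils
print(round(corrected_dft(5.7, 0.5), 1))  # prints 5.2
```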
Another calibration system utilizes a set of small, chrome-plated steel panels of precise
thickness, available from the National Institute of Standards and Technology (formerly the
National Bureau of Standards). These standards are expensive but very accurate. SSPC-PA 2
presents detailed information on the calibration and use of both pull-off and fixed probe gauges.

HOLIDAY DETECTION
Newly coated structures on which the coating integrity is important (particularly linings or
coatings in immersion conditions) should be tested with a holiday detector to ensure coating film
continuity. A holiday (sometimes called discontinuity) is a pinhole or other break in the film that
permits the passage of moisture to the substrate. This allows substrate deterioration to begin.
Holidays are not easily detected visually, and must be located with electrical instruments called
holiday detectors. Holiday detectors are available in two types, low and high voltage, as
described in ASTM D 5162.
Low-voltage (30 to 70 volts) holiday detectors are used on coatings up to 20 mils (500 µm) in
thickness. These portable devices have a power source (a battery), an exploring electrode (a
dampened cellulose sponge), an alarm, and a lead wire with connections to join the instrument to
bare metal on the coated structure. A wetting agent that evaporates on drying should be used to
wet the sponge for coatings greater than 10 mils (250 µm) in thickness. The wetted sponge is
slowly moved across the coated surface so that the response time is not exceeded. When a
holiday is touched, an electric circuit is completed through the coated metal and connected wire
back to the instrument to sound the alarm. Holidays should be marked after detection for repair
and subsequent retesting.
High-voltage (above 800 volts) holiday detectors are used on coatings greater than 20 mils
(500 µm) in thickness. The exploring electrode may consist of a conductive brush or coil spring.
detector may be a pulse or direct current type. It should be moved at a rate not to exceed the
pulse rate. If a holiday or thin spot in the coating is detected, a spark will jump from the
electrode through the air space to the metal.
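Specifications often set the high-voltage test level from the coating thickness. A commonly used rule, assumed here rather than taken from this text (it appears, for example, in NACE SP0188), is V = 525 × √(thickness in mils):

```python
import math

def suggested_test_voltage(thickness_mils):
    """Suggested high-voltage spark-test setting from coating thickness
    using V = 525 x sqrt(thickness in mils). This rule is an assumption
    here; the project specification or detector manual governs."""
    return 525.0 * math.sqrt(thickness_mils)

# A 25-mil lining would be tested at about 2625 volts under this rule
print(round(suggested_test_voltage(25.0)))  # prints 2625
```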
