The Psychrometer Duel: Old School vs. New School
Why do we need to measure ambient conditions?
Ambient conditions are the prevailing conditions of air temperature, the moisture content of the air (relative humidity), and the temperature at which condensation will occur (dew point). Most coating specifications have set requirements for monitoring and documenting results for surface and air temperature, relative humidity and dew point. These conditions are to be measured and recorded in the specific areas where surface preparation and coating application will occur, then compared to the specified ranges and/or the coating manufacturer’s restrictions listed on the product data sheet.
Most coating specifications require that the surface temperature be at least 5°F above the dew point temperature before surface preparation or coating application proceeds. While theoretically a surface temperature only slightly above the dew point temperature would preclude condensation, the 5°F safety factor accounts for instrument inaccuracies and changing or varying conditions.
Do not rely on prevailing conditions reported by a local weather service or the internet, as conditions at the project site, and in the specific work area, can vary considerably; moreover, surface temperature is not reported. Ambient conditions should be measured where the work will occur and recorded prior to start-up of operations and at 4-hour intervals thereafter, unless conditions appear to be changing, in which case more frequent checks may be required.
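The comparison step itself is simple arithmetic. The Python sketch below illustrates it; the limit values are placeholders, not from any particular specification, so the governing specification and the coating manufacturer's product data sheet must always supply the actual ranges.

```python
def conditions_conform(surface_f, air_f, dew_point_f, rh_percent,
                       min_air_f=50.0, max_air_f=100.0,
                       max_rh=85.0, min_spread_f=5.0):
    """Compare recorded ambient readings to specified limits.

    The default limits are illustrative placeholders; use the values
    from the governing specification and the product data sheet.
    """
    checks = {
        "surface temperature >= dew point + spread":
            surface_f - dew_point_f >= min_spread_f,
        "air temperature within range":
            min_air_f <= air_f <= max_air_f,
        "relative humidity at or below maximum":
            rh_percent <= max_rh,
    }
    return all(checks.values()), checks

# Example: 72°F surface, 70°F air, 53°F dew point, 56% RH
ok, detail = conditions_conform(72, 70, 53, 56)
print(ok)  # True: the surface is well above the dew point
```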
Using Instruments for Assessing Prevailing Conditions
Whirling (Sling) Psychrometer: When discussing the measurement of ambient conditions using a whirling psychrometer (ASTM E337, Standard Test Method for Measuring Humidity with a Psychrometer (the Measurement of Wet- and Dry-Bulb Temperatures)), the terms wet bulb temperature and dry bulb temperature come up regularly, but how are these terms defined? Wet bulb temperature is an indication of the latent heat loss caused by water evaporating from a wetted sock, or wick, on the end of a bulb thermometer mounted in the psychrometer housing. As the instrument is whirled away from the body in 20-30 second increments, water evaporates from the wetted sock into the air, cooling the thermometer and lowering its reading. The process is repeated until two consecutive readings from the wet bulb thermometer are within 0.5°F of one another. The depression of the wet bulb is the calculated difference between the dry bulb (air) temperature and the stable wet bulb temperature. For example, a dry-bulb temperature of 70°F and a wet-bulb temperature of 60°F nets a wet-bulb depression of 10°F.
Psychrometric tables are used to look up the relative humidity and dew point temperature. First, choose the table of interest (relative humidity or dew point temperature), then select the table corresponding to the prevailing barometric pressure for the project's geographical location. Intersect the dry bulb (air) temperature with the wet-bulb depression to determine the relative humidity or dew point temperature. A separate thermometer is used to measure the temperature of the surfaces to be prepared and/or coated. The temperatures and the relative humidity can then be compared to the requirements listed in the specification to determine conformance.
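For readers curious about what the tables encode, the Python sketch below approximates the same lookup using the Magnus formula and a standard psychrometric constant. It is an illustration only; the psychrometric tables referenced by ASTM E337 remain the governing source.

```python
import math

def rh_and_dew_point(dry_bulb_f, wet_bulb_f, pressure_hpa=1013.25):
    """Approximate relative humidity (%) and dew point (°F) from
    dry- and wet-bulb temperatures. Illustrative only; the published
    psychrometric tables govern."""
    def sat_vp(temp_c):
        # Saturation vapor pressure in hPa (Magnus approximation)
        return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

    t_db = (dry_bulb_f - 32) / 1.8
    t_wb = (wet_bulb_f - 32) / 1.8
    # Actual vapor pressure from the wet-bulb depression
    # (0.000662 per °C is a standard psychrometric coefficient)
    e = sat_vp(t_wb) - 0.000662 * pressure_hpa * (t_db - t_wb)
    rh = 100 * e / sat_vp(t_db)
    gamma = math.log(e / 6.112)  # invert the Magnus formula
    dew_point_f = (243.12 * gamma / (17.62 - gamma)) * 1.8 + 32
    return round(rh, 1), round(dew_point_f, 1)

# The example above: 70°F dry bulb, 60°F wet bulb
print(rh_and_dew_point(70, 60))  # roughly 56% RH, dew point near 53°F
```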
Digital Psychrometer: Using a digital psychrometer to assess prevailing ambient conditions and surface temperature is a much simpler process than using a whirling psychrometer, psychrometric charts and a surface temperature thermometer. Most digital psychrometers will display the relative humidity, air temperature, surface temperature, dew point temperature and the difference (spread) between surface temperature and dew point temperature. Data are constantly updated and displayed simultaneously for easy recognition. This eliminates the need to use psychrometric tables to determine the relative humidity and dew point temperature, as well as any need for a separate surface temperature thermometer. The data can be auto-logged and uploaded to cloud-based software or downloaded to a device using USB or Bluetooth®.
Which Method Wins the Duel?
Whirling psychrometers were first invented in the 1600s, and the US Weather Bureau Psychrometric Tables were first published in 1941. So, one may conclude that the newer technology wins the duel. Not so fast! Digital psychrometers also have limitations, and without user knowledge they too can produce erroneous data.
While having all the ambient conditions and the temperature of the surface readily displayed is a great benefit, there are important steps that must be followed when using these electronic instruments. It is very important that the digital psychrometer be allowed to 'stabilize' to the atmospheric conditions where the work is occurring; this can take anywhere from 20 to 30 minutes. That is, accurate readings are not possible immediately after departing an air-conditioned vehicle and walking onto the jobsite. Additionally, the humidity sensor used by most instrument manufacturers has a tendency to dry out during periods of inactivity, resulting in falsely low humidity readings. To re-saturate the sensor, manufacturers recommend placing the probe of the digital psychrometer in a re-sealable plastic bag or sealed container with a damp (not wet) cotton cloth for 24 hours. This will extend the life of the sensor and help ensure representative readings. Most instrument manufacturers also recommend annual calibration.
Whirling psychrometers also have limitations and the potential for misuse. These instruments cannot be used in freezing temperatures, and proper use (thorough saturation of the wick with deionized water, and reading the wet-bulb temperature only after several 20-30 second increments of whirling, once the wet bulb temperature stabilizes) is very important.
Despite the availability and apparent convenience of the digital psychrometers, many quality control and quality assurance personnel still rely on older “tried and true” technology. Both will work well when used properly.
Calculating and Measuring Wet Film Thickness
Wet film thickness, or WFT, is the measured thickness of an applied liquid coating while it is still wet. A wet film thickness gage should be used by the applicator as the coating is being applied, to ensure that the measurement is representative of the calculated wet film before significant solvent evaporation occurs. Even slight delays in taking wet film thickness measurements can result in falsely low readings, since solvents may have evaporated from the film before the measurements are acquired; this is why a WFT gage is largely regarded as an applicator's tool rather than an inspector's gage.
Why is WFT important? Measuring the WFT of a coating enables the applicator to adjust the spray gun speed and the number of spray passes, to make spray gun adjustments (when possible), or to select other spray tips in order to apply the correct amount of coating to achieve the specified dry film thickness.
What is the relationship between WFT and production? Time is money. The need to apply a build-up coat, or even worse the need to reduce thickness (by sanding), can impede production and reduce profitability. While under-thickness can frequently be corrected by adding more coating, excessive thickness can cause solvent entrapment, runs and sags and, if uncorrected, can lead to adhesion problems. Each of these events can negatively impact a project schedule. Perhaps more importantly, the performance properties of most coatings are based on achieving the specified dry coating thickness, and applying the correct wet film thickness helps meet this requirement. Measuring wet film thickness during application immediately identifies the need for in-process adjustments by the applicator.
How is the WFT calculated? The coating manufacturer may indicate on the product data sheet (PDS) the range of wet film thickness to be applied to achieve the desired dry film. However, many manufacturers only list the recommended DFT, since the amount of thinner that will be added by the contractor is unknown and that amount affects the target WFT. Specifications typically list the desired end result (the DFT) and not the means and methods of achieving it (the WFT). The wet film thickness target (or range) can be calculated. The equations for calculating the WFT, both with and without thinner addition, are shown below. The dry film thickness range (per coat) is extracted from the specification or the coating manufacturer's PDS (whichever is the governing document), and the volume solids content is listed on the PDS.
What are volume solids in paint? The volume solids content of a coating is an expression of the film-forming ingredients, or the material left behind after the solvents have evaporated from the applied coating. On a very basic level, paint contains solvents, resins, pigments, and additives. The volume solids content is the percentage of the formulation that is non-volatile and will remain on the surface after the coating dries and cures.
Without thinner:
Wet Film Thickness (WFT) = Dry Film Thickness (DFT) ÷ Percent Solids by Volume
Example:
Specified Dry Film Thickness = 3 – 5 mils
Volume Solids Content = 65% (0.65)
WFT = 3 ÷ 0.65 = 4.6 mils; 5 ÷ 0.65 = 7.7 mils
Based on this example, provided the applicator applies between 5 and 8 mils WFT, the specified DFT of 3-5 mils should be achieved.
The equation for calculating the target WFT with thinner added in the shop or field requires that the volume solids content of the coating (as manufactured) be adjusted based on the volume of thinner added, as a percentage of the total volume of coating.
With thinner:
WFT = DFT ÷ [Volume Solids ÷ (100% + % Thinner Added)]
Example:
Specified Dry Film Thickness = 3 – 5 mils
Volume Solids = 65%
Thinner by Volume = 20% (e.g. added 2 gallons of thinner to 10 gallons of mixed coating)
Step 1: (0.65 ÷ 1.20) = 0.54, or 54% adjusted volume solids based on 20% thinner addition
Step 2: 3 ÷ 0.54 = 5.6 mils; 5 ÷ 0.54 = 9.3 mils
Based on this example, provided the applicator applies between 6 and 10 mils WFT, the specified DFT of 3-5 mils should be achieved, even with 20% thinner, which is part of the wet film but not the dry film.
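Both equations can be rolled into one small helper. The Python sketch below reproduces the two worked examples above; it is illustrative only.

```python
def target_wft(dft_mils, volume_solids_pct, thinner_pct=0.0):
    """Target wet film thickness for a given dry film thickness.

    thinner_pct is the volume of thinner added as a percentage of the
    mixed coating volume (e.g., 2 gallons into 10 gallons = 20).
    """
    # Adjust the volume solids for dilution, then divide into the DFT
    adjusted_solids = (volume_solids_pct / 100.0) / (1.0 + thinner_pct / 100.0)
    return dft_mils / adjusted_solids

# Without thinner: 3 / 0.65 = 4.6 mils; 5 / 0.65 = 7.7 mils
print(round(target_wft(3, 65), 1), round(target_wft(5, 65), 1))
# With 20% thinner (the article rounds the adjusted solids to 0.54
# before dividing, so it reports 5.6 and 9.3 mils)
print(round(target_wft(3, 65, 20), 1), round(target_wft(5, 65, 20), 1))
```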
How is WFT measured? Wet film thickness gages are used to monitor the thickness of the applied wet coating in order to achieve a specified dry film thickness. They can measure all types of wet organic coatings, such as paint, varnish, and lacquer, on flat or curved, smooth surfaces. The units of measure for these gages are typically micrometers (microns) or mils. Wet film thickness is measured according to ASTM D4414, Standard Practice for Measurement of Wet Film Thickness by Notch Gages. Pictured are gages commonly used to measure WFT.
How is a WFT gage used? Using a WFT gage is quite simple. First, verify that the notches (teeth) are clean and free of any dry paint. Immediately insert the end of the gage into the wet coating, perpendicular to the surface. The two end teeth will penetrate down to contact the underlying surface and will be wetted with coating. Withdraw the gage and read the highest wetted step. If none of the numbered notches contain wet coating, rotate the head of the gage to a lower WFT range and remeasure. If all of the numbered notches contain wet coating, rotate the head of the gage to a higher WFT range and remeasure. A diagram describing the proper use of a WFT gage is shown. The teeth of the gage must be wiped off after every individual reading.
What units are used when measuring WFT? Typical units of measure are mils and microns. A mil is a unit of length equal to one thousandth (10−3) of an inch (0.0254 millimeter). A micron is a metric unit of measure for length equal to 0.001 mm, or about 0.000039 inch. Its symbol is µm. 25.4 microns is equal to 1 mil.
Conclusion: Calculating and properly measuring wet film thickness can reduce rework, improve productivity, and help ensure a properly applied coating. The proper use of a WFT gage by the applicator is critical to achieving the desired dry film thickness.
Using Scanning Probe Technology to Measure Coating Thickness
Introduction
Coating thickness measurement is one of the most common quality assessments made during industrial coating applications. SSPC-PA 2, Procedure for Determining Conformance to Dry Coating Thickness Requirements is frequently referenced in coating specifications. As SSPC-PA 2 has evolved over the past four decades, a number of procedures and measurement frequencies are referenced in both the mandatory portions of the standard and in the non-mandatory appendices. While the measurement frequencies were never intended to be a statistical process, it is helpful to understand the statistical implications of the measurement process. And it is helpful to know what coating thickness variability is reasonable. This brief article explores how scanning probe technology can help to acquire a larger number of measurements (in a relatively short period of time) to better assess the consistency of the applied coating thickness, particularly on larger, more complex structures.
Background
Scanning Illustration, courtesy of Elcometer Ltd.
There are two industry standards that are widely specified for measurement of coating thickness. These include ASTM D7091, Standard Practice for Nondestructive Measurement of Dry Film Thickness of Nonmagnetic Coatings Applied to Ferrous Metals and Nonmagnetic, Nonconductive Coatings Applied to Non-Ferrous Metals and SSPC-PA 2, Procedure for Determining Conformance to Dry Coating Thickness Requirements. The ASTM standard focuses on gage use, while the SSPC standard focuses on the frequency and acceptability of coating thickness measurements. The standards are designed to be used in conjunction with one another. In 2012, all references to measurement frequency were removed from the ASTM standard so that it did not conflict with SSPC-PA 2.
FXS Probe designed to withstand rough surfaces, courtesy of DeFelsko Corporation
The frequency of coating thickness measurements is defined by gage readings, spot measurements and area measurements. A minimum of three (3) gage readings is obtained within a 1.5-inch diameter circle and averaged to create a spot measurement. Five spot measurements are obtained in each 100-square foot area. The number of areas to be measured is determined by the size of the coated area. If less than 300 square feet is coated (e.g., during a work shift), then each 100-square foot area is measured (a maximum of three areas, each composed of five spot measurements with a minimum of three gage readings per spot). If the size of the coated area is between 300 and 1,000 square feet, three 100-square foot areas are selected and measured. If the size of the coated area exceeds 1,000 square feet, three areas are measured in the first 1,000 square feet, with one additional area measured in each additional 1,000 square feet or portion thereof. For example, if the size of the coated area is 4,500 square feet, seven 100-square foot areas are measured (a total of 35 spot measurements and a minimum of 105 gage readings).
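This selection logic reduces to a bit of arithmetic. The Python sketch below is written from the description above; the SSPC-PA 2 text itself governs.

```python
import math

def pa2_measurement_counts(coated_area_sqft):
    """Number of 100-sq-ft areas, spot measurements, and minimum gage
    readings per SSPC-PA 2, as described above. Illustrative sketch;
    consult the standard for the authoritative procedure."""
    if coated_area_sqft <= 300:
        areas = min(3, math.ceil(coated_area_sqft / 100))
    elif coated_area_sqft <= 1000:
        areas = 3
    else:
        # Three areas in the first 1,000 sq ft, plus one per
        # additional 1,000 sq ft or portion thereof
        areas = 3 + math.ceil((coated_area_sqft - 1000) / 1000)
    spots = areas * 5       # five spot measurements per area
    readings = spots * 3    # minimum of three gage readings per spot
    return areas, spots, readings

print(pa2_measurement_counts(4500))  # (7, 35, 105), matching the example
```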
Other measurement frequencies are included in non-mandatory appendices to SSPC-PA 2, including Appendices 2 and 3 for steel beams, Appendices 4 and 5 for test panels, Appendix 6 for measurement of coating thickness along edges, and Appendix 7 for pipe exteriors.
Gauge display containing scanned data, courtesy of Elcometer Ltd.
The number of gage readings, spot measurements and area measurements prescribed by SSPC-PA 2 was never intended to be based on a statistical process. Rather, the frequency of measurement was based on what was reasonable in the shop or field to adequately characterize the thickness of the coating without unduly impeding production. Consider the impact of checking the thickness of a previous day's application to 4,000 square feet of steel if every 100 square feet needed to be measured: that's 40 areas, 200 spot measurements and a minimum of 600 gage readings. And even that frequency may not be considered a statistically significant sampling. Further, obtaining additional measurements above the number prescribed by SSPC-PA 2 (when invoked by contract) may be considered "over inspection."
Using Scanning Technology to Acquire Higher Volumes of Data
Several manufacturers of electronic coating thickness gages have incorporated "scanning probe" technology and the associated support software into the data acquisition process. This newer technology enables the gage operator to obtain large sets of coating thickness data in a relatively short time frame. For example, on an actual bridge recoating project, a certified coatings inspector obtained 12 batches of readings (nearly 600 readings) in just under 8 minutes (measurement time only) on bridge girders across four panel points. So it may be possible to obtain a more representative sampling of the coated area without impeding production. However, there are concerns with acquiring such large data sets, such as management of the data, handling of outliers, determining the statistical significance of the data (i.e., what is an acceptable standard deviation or coefficient of variation), applicability of the Coating Thickness Restriction Levels 1-5 in SSPC-PA 2, etc. The scanning probe set-up on the gage itself is relatively easy to perform, and the software is capable of handling the large volume of data coming into the gages.
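As a sketch of how such a data set might be summarized, the following computes the mean, standard deviation, and coefficient of variation for one batch of scanned readings (hypothetical values); as noted above, what counts as an acceptable spread is still an open question.

```python
import statistics

def scan_summary(readings_mils):
    """Summarize one batch of scanned coating thickness readings."""
    mean = statistics.mean(readings_mils)
    stdev = statistics.stdev(readings_mils)
    return {
        "n": len(readings_mils),
        "mean_mils": round(mean, 2),
        "stdev_mils": round(stdev, 2),
        # Coefficient of variation: spread relative to the mean
        "cv_percent": round(100 * stdev / mean, 1),
    }

# Hypothetical batch of scanned readings (mils)
print(scan_summary([5.1, 4.8, 5.4, 5.0, 4.9, 5.3, 5.6, 4.7, 5.2, 5.0]))
```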
The SSPC Committee on Dry Film Thickness Measurement may consider adding a 10th non-mandatory appendix to SSPC-PA 2 to give the specifier the option of acquiring a much larger data set of coating thickness measurements without impeding production. In this manner, an owner may gain greater confidence regarding the uniformity and consistency of the applied coating film.
Surface Profile Measurement Options on Rough/Pitted Steel Surfaces
Introduction
Surface profile is defined as a measurement of the maximum peak-to-valley depth generated by abrasive blast cleaning and impact-type power tools. These operations effectively increase the surface area and provide an “anchor” for the applied coating system. The surface profile depth must be compatible with the total coating system thickness; typically, the thicker the coating system, the deeper the surface profile. For example, a 3-coat 15 mil system may require a 2-3 mil surface profile, while a 40-mil coating system may require a 4-5 mil surface profile. The maximum achievable surface profile is generally 6-7 mils (in steel) using a G10 or G12 abrasive.
Abrasive blast cleaned and power tool cleaned steel surfaces are routinely checked to verify the specified surface profile has been achieved. Industry standards such as ASTM D4417, Standard Test Methods for Field Measurement of Surface Profile of Blast Cleaned Steel[1], NACE International SP0287, Standard Practice for Field Measurement of Surface Profile of Abrasive Blast-Cleaned Steel Surfaces Using Replica Tape, and SSPC: The Society for Protective Coatings PA 17, Procedure for Determining Conformance to Steel Profile/Surface Roughness/Peak Count Requirements describe the procedures for performing these measurements, as well as the recommended frequency of measurements and acceptability of the values. However, the standards assume smooth steel was prepared; little is written about measurement of surface profile on rough or irregular surfaces such as pitted steel[2], weathering steel, or cast iron surfaces. This brief article describes a few methods that may be considered for measuring surface profile on these types of irregular surfaces.
Background
Many steel structures that have been in service for relatively long periods of time may have irregular, rough surfaces due to corrosion. Often this results in steel thickness loss (section loss) and may even require modification or replacement. But when it is determined that not enough metal loss has occurred to warrant repairs to the steel substrate, the applicator is faced with complying with contract requirements for cleanliness and profile generation on surfaces whose roughness often exceeds the surface profile requirements of the contract. Likewise, other steel surfaces such as cast iron and weathering steel (ASTM A588, A242, A606-4, A847, and A709-50W) typically have a rougher surface than abrasive blast cleaned ASTM A36 steel after they have weathered in atmospheric conditions, and may yield a higher surface profile than the specification allows, resulting in a nonconformance.
Measuring surface profile on rough or pitted surfaces can often lead to falsely high readings, since the measurements reflect the depth of the pits or the inherent roughness of the steel rather than the surface profile generated by the abrasive or impact-type power tool itself. This begs the question, "how do you verify the surface profile on these types of surfaces with any degree of accuracy?" There are a few alternatives that can be considered; however, they should be discussed, and an approach negotiated, during the preconstruction meeting rather than during in-process measurement, when possible.
Alternative Methods
The first alternative is to obtain measurements in an adjacent area (that is not rough) using whichever method has been selected/specified (depth micrometer or replica tape). However, this may not be feasible when the pitting or rough steel is uniform. Of the three methods listed in the referenced standards, the depth micrometer (Method B in ASTM D4417) is generally considered optimum in these situations because a measurement of a single valley can be obtained, and the upper range of the instrument is higher (20 mils) than the maximum value that can be reasonably measured using replica tape (5 mils). Multiple measurements (a minimum of ten) are made in an area and the average surface profile is calculated[3].
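A minimal sketch of the averaging step described above, assuming readings in mils:

```python
def average_surface_profile(readings_mils):
    """Average a set of depth micrometer readings (ASTM D4417 Method B).

    The text above calls for a minimum of ten readings per area; on
    rough/pitted steel, averaging (rather than reporting the maximum)
    is the approach recommended here."""
    if len(readings_mils) < 10:
        raise ValueError("obtain at least ten readings per area")
    return sum(readings_mils) / len(readings_mils)

# Hypothetical readings taken on a pitted surface
print(round(average_surface_profile(
    [3.2, 2.8, 3.5, 3.0, 2.9, 3.3, 3.6, 2.7, 3.1, 3.0]), 1))  # 3.1 mils
```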
Another option is to rely on a visual comparator and a reference disc. The comparator is a lighted magnifier (typically 5-10x power) that enables the user to closely examine the surface roughness and compare it to replica discs containing varying degrees of roughness (5 segments per disc). The appropriate reference disc that represents the abrasive employed (grit/slag or shot) is placed on the prepared steel and the user selects the segment that most closely matches the surface profile of the steel.
A third option is to measure the surface profile on a companion piece of steel such as a test plate that is abrasive blast cleaned with the same abrasive and pressure being used on the rough steel. This procedure has been accepted in the nuclear power industry for many years when painting cast iron motor housings.
Lastly, the abrasive manufacturer can be consulted regarding the typical surface profile values produced by the type and size abrasive being used. Some abrasive manufacturers can provide a Certificate of Conformance that states the measured range for a given lot under laboratory conditions. Note that surface hardness greatly influences surface profile depth, so the abrasive manufacturer’s data may be misleading.
Conclusion
The important point to remember is that when the surface is rough or irregular, one or more of these methods can be used to more accurately determine the surface profile depth. Further, rough surfaces may require the application of a thicker coat, or additional coating layers to help ensure corrosion protection. The coating manufacturer should be engaged when making these decisions.
[1] According to SSPC-SP 11 and SP 15, verification of minimum 1-mil surface profile created by power tools can only be measured using Method B (depth micrometer) described in the ASTM D4417 standard.
[2] SSPC-PA 17-2012 addresses measurement of surface profile on pitted steel in Appendix C (Section C2.5.3).
[3] ASTM D4417 instructs the operator to report the maximum of the ten measurements; however, this is not recommended on rough/pitted surfaces. The standard does allow averaging of the readings.
Coating Adhesion Testing using Knife/Tape Methods
Introduction
Adhesion is an important physical characteristic of applied coating films and systems, and testing is frequently used as an indicator of whether an adequate bond exists between the substrate and primer (first coat) and/or between coats in multiple-coat applications. Adhesion testing may be a requirement of a coating specification, or may be used for coating system performance qualification in a laboratory. Adhesion testing is also a valuable indicator of the integrity of coating systems that have been in service for extended periods of time, may require maintenance, and for which overcoating is a strategy being considered; it is also frequently used during coating failure investigations. Irrespective of the application of the test, there are standard test methods (procedures) for conducting adhesion testing that should be followed to ensure consistency, especially when performing comparative analyses. This article discusses tape and knife adhesion test methods performed according to standardized ASTM International test methods. Tensile (pull-off) adhesion is the subject of an article by Melissa Swogger that is also available on the KTA University site.
Test Methods
ASTM D3359, Standard Test Methods for Measuring Adhesion by Tape Test (Tape Test) and ASTM D6677, Standard Test Method for Evaluating Adhesion by Knife (Knife Test) are perhaps the most widely used tests to evaluate a coating's adhesion to the substrate and to other coats in a multi-coat system. While tape and knife adhesion tests are generally regarded as more subjective than their tensile (pull-off) adhesion test counterparts, they can be much more revealing of the true adhesion properties of a coating system. Experience has shown that high pull-off adhesion values can be achieved on a coating system that is easily lifted with a knife, tape or, in some cases, one's fingers. This is primarily due to the directional forces applied to the coating system during the tests.
Adhesion testing performed according to ASTM D3359 or ASTM D6677 apply shear forces to the coating, while the pull-off adhesion tests (performed according to ASTM D4541 or ASTM D7234)[1] use tensile [perpendicular] forces. The shear tests are oftentimes more definitive because they better replicate the way in which coatings fail. That is, coatings generally do not disbond from a substrate or other coating as a result of forces that are exerted perpendicular to the surface.[2] Rather, the coating peels off of the substrate or another coat because of shear (non-perpendicular) forces exerted on the coating system. Undercutting and peeling can occur as a result of shear forces.
Tape Adhesion Tests
Two test methods are described in ASTM D3359: Method A (X-cut) and Method B (cross-cut). Test Method A is primarily intended for use on coatings/coating systems over 5 mils (125 µm) thick, while Method B is generally used on coatings/coating systems less than 5 mils thick. Either method can be performed in the shop, field or laboratory.
The test was developed for assessing the adhesion of coating to steel, but can be used on other hard substrates. The test has also been used successfully on softer substrates (e.g., wood and plaster).
Both tests are performed by scribing the coating to the substrate with a sharp knife blade in a specific pattern, applying a pressure sensitive tape and then rapidly pulling the tape from the surface. When the coating is greater than 5-mils thick an X-cut (with each leg approximately 1.5-inches long) is made in the film. When the coating is less than 5-mils thick, a cross-cut lattice pattern is created with either six or eleven cuts in each direction. For coatings up to 2.0 mils thick, eleven incisions are made that are spaced 1 mm apart. For coatings between 2.0 mils and 5.0 mils thick, six incisions are spaced 2 mm apart. For both methods, a steel or other hard metal straightedge or template is recommended to ensure straight cuts and, in the case of the X-cut, the correct angle at the intersection (30-45°).
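The selection rules above reduce to a simple decision, sketched here (thresholds in mils, taken from the description above; illustrative only):

```python
def d3359_pattern(dft_mils):
    """Select the ASTM D3359 cut pattern from the dry film thickness."""
    if dft_mils > 5.0:
        return "Method A: X-cut, legs ~1.5 in, intersecting at 30-45 degrees"
    if dft_mils <= 2.0:
        return "Method B: cross-cut, 11 cuts per direction, 1 mm spacing"
    return "Method B: cross-cut, 6 cuts per direction, 2 mm spacing"

print(d3359_pattern(1.5))  # 11 cuts, 1 mm spacing
print(d3359_pattern(3.0))  # 6 cuts, 2 mm spacing
print(d3359_pattern(8.0))  # X-cut
```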
Once the incisions are made, a pressure sensitive tape (with adhesive properties conforming to the requirements of the standard; Figure 1) is applied over the incisions and pressed in place using a pencil eraser. Following a brief “recovery” period of about 60 seconds the tape is removed by grasping the free end of the tape and pulling it off rapidly (not jerked) back upon itself at as close to an angle of 180° as possible. After removal of the tape, the amount of coating removed from the substrate or underlying coating is rated. It is important to evaluate the coated surface and not the back of the tape, since coating debris from the incisions is often removed by the tape.
Adhesion is rated based on the scale provided in the ASTM standard. The scale ranges from 0 “Removal beyond the area of the incisions” to 5 “No peeling or removal.” When Method A is used an “A” is included after the numerical adhesion value (e.g., 3A). Similarly, a “B” is added after the numerical value when Method B is used (e.g., 3B). Table 1 provides the evaluation criteria for Method A; Table 2 provides the evaluation criteria for Method B. The standard also contains a pictorial guide to aid in the rating of the cross-cut (Method B).
[1] ASTM D4541, Standard Test Method for Pull-Off Strength of Coatings Using Portable Adhesion Testers and ASTM D7234, Standard Test Method for Pull-Off Strength of Coatings on Concrete Using Portable Adhesion Testers
[2] The exception to this is osmotic blistering, where the coating is pushed off the surface as a result of vapor pressure from below the coating. However, the subsequent delamination is a result of shear forces.
When appropriate, the nature and location of the separation is documented. A cohesive separation is one that occurs within a coating layer; an adhesive separation is one that occurs between coating layers or between the coating and the substrate. Generally, adhesion ratings of 4 and 5 are considered good, adhesion values of 2 and 3 are considered marginal and adhesion values of 0 and 1 are considered poor.
Knife Adhesion Tests
Similar to the tape adhesion tests, the Standard Test Method for Evaluating Adhesion by Knife (ASTM D6677) can be used to evaluate coating adhesion to steel and other hard substrates. Precautions are included regarding the use of the test on coatings with a high cohesive strength that may appear to have worse adhesion than one that is brittle and fractures easily. In addition, the method is not to be used on overly thick coatings that cannot be cut to the substrate with a utility knife blade in one stroke.
The knife adhesion test is conducted similarly to Method A of the tape adhesion tests in that incisions are made in the shape of an "X" (each leg 1.5 inches in length, with an angle of 30-45°) through the coating film down to the substrate. The tip of a knife blade is then inserted into the intersection of the two incisions and used to attempt to lift the coating from the substrate or an underlying coating.
Adhesion is rated on an even number scale between 0 and 10, with 10 having the best adhesion and 0 the worst. A description of the adhesion criteria is included in Table 3.
Generally, adhesion ratings of 8 and 10 are considered good, adhesion values of 4 and 6 are considered marginal, and adhesion values of 2 and 0 are considered poor.
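These informal groupings can be captured in a small helper (hypothetical; the ASTM standards define only the numeric scales, not the good/marginal/poor labels):

```python
def classify_adhesion(rating, method):
    """Map a numeric adhesion rating to the informal categories above.

    method: "tape" for ASTM D3359 (0-5 scale) or "knife" for
    ASTM D6677 (even-number 0-10 scale)."""
    if method == "tape":
        return "good" if rating >= 4 else "marginal" if rating >= 2 else "poor"
    if method == "knife":
        return "good" if rating >= 8 else "marginal" if rating >= 4 else "poor"
    raise ValueError("method must be 'tape' or 'knife'")

print(classify_adhesion(3, "tape"))   # marginal
print(classify_adhesion(8, "knife"))  # good
```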
Interpreting Adhesion Test Results
The Tape and Knife adhesion test procedures described herein include specific language to address “gray areas,” which require agreement between parties that are either requiring or performing the tests. There are circumstances and situations that do not allow standard procedures and methods to provide an accurate representation of coating adhesion. For example, laboratory testing is typically conducted under “standard” laboratory conditions of temperature and humidity; however, field testing conditions vary with the prevailing weather and are largely uncontrolled. Variations in temperature and humidity can affect the efficacy of the method employed.
Heavily chalked paints typically show very good tape adhesion properties, since only the friable chalk layer is removed by the tape (the weakest plane), leaving the coating system intact. The adhesion by knife test may provide a more accurate picture of the actual adhesion characteristics. If the tape adhesion test is required, the chalking should be removed from the area prior to performing the test.
Adhesion testing conducted on acrylic elastomeric coatings applied to cement or stucco cannot be evaluated using the tape adhesion test. Further, the results of any knife adhesion tests performed on these coatings must be carefully considered. Acrylic elastomeric coatings have high cohesive strength and, once cut, can often be removed by pulling on the leading edge with one's fingers. Nevertheless, the adhesion is oftentimes considered to be acceptable under these conditions.
Adhesion tests that consistently reveal an adhesive break between coats or a cohesive break within a coat do not provide any information relative to the adhesion of the coating (or coating system) to the substrate. Knife adhesion tests may be used to assess the bond to the substrate when the tape adhesion test results reveal a break somewhere higher up in the coating system.
Simply stated, the ASTM standard test procedures have limitations that need to be considered when making judgements or decisions based on the test results. Make sure that any “gray areas” are considered and addressed by the stakeholders before testing is performed.
What is a Base Metal Reading and How Does It Affect Coating Thickness Measurements?
For many in the health care/fitness industry, BMR is an acronym for basal metabolic rate. Sorry to disappoint if you thought this would be a health science article about expending energy. Rather, this article is about a different BMR: Base Metal Reading. We’ll describe what it is, its significance, how to obtain it, and how it impacts coating thickness.
Introduction to Coating Thickness Standards
There are two common industry standards that govern measurement of coating dry film thickness on metal substrates: ASTM D7091, Standard Practice for Nondestructive Measurement of Dry Film Thickness of Nonmagnetic Coatings Applied to Ferrous Metals and Nonmagnetic, Nonconductive Coatings Applied to Non-Ferrous Metals and SSPC-PA 2, Procedure for Determining Conformance to Dry Coating Thickness Requirements. Both address the use of Type 1 (magnetic pull-off) and Type 2 (electronic) gages as well as BMR acquisition. SSPC-PA 2 also addresses measurement frequency and the acceptability of the measurements.
What is BMR?
BMR is the effect of substrate roughness on a coating thickness gage. The roughness is created by preparation of the substrate (e.g., abrasive blast cleaning or power tool cleaning), which generates a surface texture or “profile,” or by a manufacturing process that imparts roughness into the substrate. Instruments that measure the dry film thickness of the applied coating reach part way down into the roughened metal surface to operate properly (illustrated by the red line). However, specifications list the required coating thickness as measured from the tops of the peaks of the surface profile (illustrated by the blue bar). This inherent delta is known as the base metal effect. It is deducted from the coating thickness measurements to eliminate any effect of surface roughness. If the BMR is ignored, the thickness of the coating from the tops of the peaks of the surface profile may be overstated.
Acquisition of a BMR is not predicated on the gage type (Type 1 magnetic pull-off versus Type 2 electronic), but rather on the way the gage is set up by the operator to compensate for surface roughness. For both Type 1 (see photo, left) and Type 2 gages, a BMR may be acquired and deducted from the coating thickness.
As an alternative, for Type 2 gages one or more measured shims (one shim is considered a one-point adjustment while the use of two shims spanning the range of intended use is considered a two-point adjustment) may be placed onto the prepared (roughened) metal surface and the gage adjusted to correspond to the shim thickness, effectively removing any need to measure and deduct a BMR. According to SSPC-PA 2, these measured shims are not permitted to be used with Type 1 gages unless explicitly allowed by the gage manufacturer, so in most cases a BMR will be required when using a Type 1 gage.
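The idea behind a two-point adjustment can be sketched as a simple linear mapping. This is an assumption for illustration only; follow the gage manufacturer's documented adjustment procedure.

```python
def two_point_adjust(reading, low_point, high_point):
    """Linearly map gage readings so the readings taken on two shims of
    known thickness land on the shims' stated values.

    low_point / high_point are (gage_reading, shim_thickness) pairs
    spanning the intended range of use."""
    (r_lo, t_lo), (r_hi, t_hi) = low_point, high_point
    slope = (t_hi - t_lo) / (r_hi - r_lo)
    return t_lo + slope * (reading - r_lo)

# Hypothetical: gage reads 2.2 on a 2.0 mil shim and 10.4 on a 10.0 mil
# shim; a raw reading of 6.3 then maps to about 6.0 mils
print(round(two_point_adjust(6.3, (2.2, 2.0), (10.4, 10.0)), 1))
```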
Obtaining Base Metal Readings
Section 6.2 in SSPC-PA 2 states, “To compensate for any effect of the substrate itself and surface roughness, obtain measurements from the bare, prepared substrate at a minimum of ten locations (arbitrarily spaced) and calculate the average value. This average value is the base metal reading.” Here are the steps:
- Verify the accuracy of the coating thickness gage before use. Traceable coated standards are required for both Type 1 and Type 2 coating thickness gages.
- Obtain a minimum of ten readings on the prepared, uncoated substrate in random locations. To avoid forgetting to acquire a BMR, it is best to take the measurements at the same time surface profile measurements are obtained.
- Measure the coating thickness.
- Deduct the average BMR.
The BMR is deducted not only from the primer thickness, but also from the cumulative layer thickness measurements as they are obtained. This is illustrated below:
Measured primer thickness: 4.9 mils
BMR: (0.6 mil)
Actual primer thickness from the top of the peaks of the surface profile: 4.3 mils

Cumulative primer and topcoat thickness: 9.2 mils
BMR: (0.6 mil)
Actual cumulative thickness from the top of the peaks of the surface profile: 8.6 mils
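Putting the steps together, here is a minimal sketch of the BMR deduction, assuming thickness values in mils and a hypothetical set of bare-substrate readings:

```python
def thickness_above_profile(measured_mils, bmr_readings_mils):
    """Deduct the average base metal reading from a measured thickness.

    Per the SSPC-PA 2 language quoted above, the BMR is the average of
    at least ten readings taken on the bare, prepared substrate."""
    if len(bmr_readings_mils) < 10:
        raise ValueError("SSPC-PA 2 calls for a minimum of ten BMR readings")
    bmr = sum(bmr_readings_mils) / len(bmr_readings_mils)
    return measured_mils - bmr

# Worked example from the text, with hypothetical BMR readings that
# average 0.6 mil
bmr_set = [0.5, 0.7, 0.6, 0.6, 0.5, 0.7, 0.6, 0.6, 0.5, 0.7]
print(round(thickness_above_profile(4.9, bmr_set), 1))  # 4.3 mils
print(round(thickness_above_profile(9.2, bmr_set), 1))  # 8.6 mils
```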
It is important to recognize that BMR and surface profile are related, but they are not the same. Surface profile is a measurement of the maximum peak-to-valley depth created by abrasive blast cleaning or some type of impact power tool. It is measured using one of three methods described in ASTM D4417, Standard Test Methods for Field Measurement of Surface Profile of Blast Cleaned Steel and SSPC-PA 17, Procedure for Determining Conformance to Steel Profile/Surface Roughness/Peak Count Requirements. BMR is the effect of this surface profile on a coating thickness gage: a 3-mil surface profile may have an associated BMR of only 0.7 mil. Deducting the surface profile from the coating thickness instead of the BMR will therefore significantly understate the actual coating thickness.