CENTRAL REGION APPLIED RESEARCH PAPER 17-13


Comparison of ASOS and Manual Observations at Rochester, Minnesota

Anthony T. Teal
National Weather Service Office
Rochester, Minnesota


INTRODUCTION

The Automated Surface Observing System (ASOS) was installed at the Rochester International Airport in Rochester, Minnesota (Figure 1) on July 25, 1994, and was accepted on August 1, 1994. A comparison was made between the observations from ASOS and the observations taken by National Weather Service employees using the Microcomputer-Aided Paperless Surface Observation (MAPSO) program. The comparison was intended to identify any major discrepancies between the two methods of observation and to support the commissioning of the ASOS. The purpose of this paper is to discuss the results of this comparison.

Figure 1. Rochester, Minnesota, International Airport (RST).

DISCUSSION

The program ASOSMAN (Helgeson 1994) compiled and displayed all of the observations used, from both ASOS and MAPSO. After preliminary tests, the biggest concern was visibility, so an in-depth study was done of that element. This effort covered seven months, December 1994 through June 1995.

Another study examined temperatures, dew points, ceilings, wind speeds, altimeter settings, and precipitation. This work involved five months: December 1994, January 1995, March 1995, May 1995, and June 1995, chosen to sample three different seasons.

Rochester also took part in a National Field Demonstration testing the freezing rain sensor and software. The test began in December 1994 and concluded in April 1995.

VISIBILITY STUDY

Preliminary tests revealed differences between visibilities measured by ASOS and those estimated by the human observers. The elevations at which the observations are taken are relatively close: ASOS measures visibility at 1304 feet above mean sea level (MSL), the National Weather Service observer estimates it at 1320 feet MSL, and control tower personnel estimate it at 1355 feet MSL. Note that the control tower personnel estimate the visibility only when the prevailing visibility, as determined by either the National Weather Service personnel or control tower personnel, is less than four miles.

ASOS visibilities from every hourly observation were compared against the human observations, which served as the standard. Twenty-two visibility categories were created, ranging from 10 miles or greater down to 1/4 mile or less. The 10-mile-or-greater and 1/4-mile-or-less groups were used because ASOS cannot "see" farther than 10 miles or closer than 1/4 mile (NOAA 1992). Therefore, in this study, a human-observed visibility of 15 miles was counted as 10 miles, and a visibility of 1/16 mile was counted as 1/4 mile.

The ASOS visibilities were averaged by month, and a total average was computed as well. The largest discrepancy between the total average and the standard was 2.33 miles, found at the three-mile reporting value (Figure 2). The other large discrepancies were found in the "middle" reporting values, between four miles and one and three-quarter miles. The differences at the high and low ends of the spectrum were surprisingly small. The smallest discrepancies occurred at eight miles and at three-quarters of a mile, with an error of approximately one-half mile at each reporting value.

Figure 2. ASOS monthly average visibilities in miles for the period December 1994 - June 1995. The "Error in Miles" column represents ASOS visibility minus visibility as estimated by human observers.
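
As an illustration of the comparison method, the following sketch (hypothetical Python, not the ASOSMAN program actually used in the study; field names and sample values are invented) clamps the human-observed visibility to the ASOS reportable range and computes the mean ASOS-minus-human error at each reporting value.

    # Hypothetical sketch of the visibility comparison described above.
    # Each record pairs one hourly ASOS visibility with the human
    # observation used as the standard.
    from collections import defaultdict

    ASOS_MAX_MI = 10.0   # ASOS cannot "see" farther than 10 miles...
    ASOS_MIN_MI = 0.25   # ...or closer than 1/4 mile (NOAA 1992)

    def clamp_to_asos_range(visibility_mi):
        """Map a human-observed visibility onto the ASOS reportable range,
        e.g. 15 miles counts as 10 miles and 1/16 mile as 1/4 mile."""
        return max(ASOS_MIN_MI, min(ASOS_MAX_MI, visibility_mi))

    def mean_error_by_category(pairs):
        """pairs: iterable of (asos_mi, human_mi) hourly visibilities.
        Returns the mean ASOS-minus-human error (miles) for each
        reporting value, keyed by the clamped human visibility."""
        errors = defaultdict(list)
        for asos_mi, human_mi in pairs:
            category = clamp_to_asos_range(human_mi)
            errors[category].append(asos_mi - category)
        return {cat: sum(e) / len(e) for cat, e in errors.items()}

    # Invented example: ASOS reading high at the 3-mile reporting value.
    print(mean_error_by_category([(5.0, 3.0), (6.0, 3.0)]))  # {3.0: 2.5}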

Another interesting finding was that visibilities reported by ASOS averaged higher than the human observations at all but three of the reporting values. The three groups that averaged lower were the 8-mile, 9-mile, and 10-mile-or-greater reporting values (Figure 3).

Figure 3. Difference between ASOS and human observed visibilities.

Some monthly totals showed large discrepancies at certain reporting values because few observations fell into those categories during the month. Combining the months, however, gave a better representation of each reporting category.

TEMPERATURE

Temperature measurements from ASOS were also compared to those from the HO83 hygrothermometer (Figure 4). Both sensors are located at centerfield, near the intersection of the two runways, and the areas surrounding the sensors are approximately the same in elevation and terrain. Over the five-month study, the temperatures agreed very closely. The largest discrepancy in the overall monthly average was 0.16° Fahrenheit (°F) in June 1995; the smallest was 0.03°F in March 1995. All of the ASOS monthly averages were lower than the values reported by the HO83. This overall bias is similar to that found by Nouhan (1995).

ASOS handled the temperature readings fairly well in most of the monthly averages as well. The main discrepancy was in May and June for temperatures greater than 70°F, when the ASOS sensor reported values averaging almost 0.9°F below what the HO83 reported. Even so, a discrepancy of 1°F is tolerable, since the calibration tolerance for the HO83 is ±2°F (DOC 1985).

The climatological high temperature in Rochester averages 68°F in May and nearly 78°F in June. Fewer temperature readings exceed 70°F in May than in June, hence the larger error in May.

Similarly, another problem area was in December and January, for temperatures between 35°F and 70°F, when the ASOS readings averaged almost 0.5°F lower than the HO83. This was mainly because few temperature readings rose above 35°F during those two months; the climatological average high temperature for Rochester is only 25°F in December and 20°F in January.

Figure 4. Difference between ASOS and HO83 temperature readings. T is temperature in degrees Fahrenheit as determined by the HO83.
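
The stratified comparison described above can be sketched as follows (hypothetical code; the band edges of 35°F and 70°F come from the text, but the sample readings are invented).

    # Hypothetical sketch: mean ASOS-minus-HO83 temperature bias,
    # stratified by the HO83 reading into the bands discussed above.
    def stratified_bias(pairs):
        """pairs: iterable of (asos_f, ho83_f) paired readings in deg F.
        Returns the mean bias (ASOS minus HO83) for each band."""
        bands = {"T < 35": [], "35 <= T <= 70": [], "T > 70": []}
        for asos_f, ho83_f in pairs:
            if ho83_f < 35:
                bands["T < 35"].append(asos_f - ho83_f)
            elif ho83_f <= 70:
                bands["35 <= T <= 70"].append(asos_f - ho83_f)
            else:
                bands["T > 70"].append(asos_f - ho83_f)
        return {band: sum(d) / len(d) for band, d in bands.items() if d}

    # Invented sample: ASOS reading slightly low at warm temperatures.
    print(stratified_bias([(74.1, 75.0), (76.2, 77.0), (50.0, 50.1)]))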

DEW POINTS

Dew points were compared using the same method as the temperatures. However, ASOS did not do as well with the dew points as with the temperatures: based on the overall average dew points evaluated, the accuracy of the sensor decreased with time. The sensor started out reading almost 1°F too high in December, and the error climbed to an astonishing 3°F by June (Figure 5). This was recorded in a log kept to note deficiencies, and the ASOS technician was informed. The sensor was replaced and the error corrected.

Most of the monthly dew point averages ranged from 1°F to 3°F too high. The smallest error came in December, for dew points less than 0°F, when the average error was 0.42°F. The largest difference was in June, which had an average error of 6.5°F for dew points higher than 70°F. This may have been due to the small sample size, as only eight of a possible 720 observations that month had dew points higher than 70°F. Even so, an average error of nearly 7°F is intolerable.

Figure 5. Difference between ASOS and HO83 dew points in degrees Fahrenheit. D is dew point in degrees Fahrenheit as determined by the HO83.
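
A running check of the monthly mean error is one simple way to catch this kind of sensor drift. The sketch below is hypothetical: the error values are invented to resemble those quoted above, and the 2°F flag threshold is an assumed value for illustration, not an operational standard.

    # Hypothetical sketch: track the monthly mean dew point error
    # (ASOS minus HO83) to expose a drifting sensor.
    monthly_errors = {        # month -> hourly (asos - ho83) errors, deg F
        "Dec 1994": [0.9, 1.1, 1.0],
        "Jan 1995": [1.3, 1.4],
        "Jun 1995": [2.8, 3.1, 3.0],
    }

    FLAG_THRESHOLD_F = 2.0    # assumed threshold, for illustration only

    for month, errs in monthly_errors.items():
        mean_err = sum(errs) / len(errs)
        flag = "  <-- investigate sensor" if abs(mean_err) > FLAG_THRESHOLD_F else ""
        print(f"{month}: {mean_err:+.2f} F{flag}")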

WIND SPEEDS

ASOS did very well measuring wind speed, as compared to the F420C wind system readings recorded by the human observers. Overall, the error was less than one knot, and the ASOS wind speeds were generally less than those measured by the F420C (Figure 6). The monthly differences ranged from 0.10 knots in May to -0.73 knots in January, both well within tolerance. The wind speeds were stratified into two categories: less than 15 knots, and 15 to 30 knots. Again, ASOS was remarkably close (within one knot), except in June in the 15- to 30-knot category, where the error was over two knots, quite large compared to the rest of the differences.

Figure 6. Difference between ASOS and F420C wind system readings in knots. W is wind speed in knots as determined by the F420C.
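
The wind speed comparison can be sketched the same way (hypothetical code; the 15- and 30-knot strata come from the text, and the sample speeds are invented).

    # Hypothetical sketch of the wind speed comparison: overall mean
    # error plus the two strata used above, based on the F420C reading.
    def wind_speed_errors(pairs):
        """pairs: iterable of (asos_kt, f420c_kt) wind speeds in knots."""
        overall, low, high = [], [], []
        for asos_kt, f420c_kt in pairs:
            err = asos_kt - f420c_kt
            overall.append(err)
            if f420c_kt < 15:
                low.append(err)
            elif f420c_kt <= 30:
                high.append(err)
        mean = lambda xs: sum(xs) / len(xs) if xs else None
        return {"overall": mean(overall),
                "W < 15": mean(low),
                "15 <= W <= 30": mean(high)}

    # Invented sample readings.
    print(wind_speed_errors([(10, 11), (8, 8), (18, 20)]))
    # {'overall': -1.0, 'W < 15': -0.5, '15 <= W <= 30': -2.0}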

BAROMETRIC PRESSURE

To the aviation community, the altimeter setting is a very important part of the observation. The pressure sensor on ASOS did a very good job: the average error was only -0.003 inch of mercury (Figure 7), well within the standards for allowable reporting of the altimeter setting. The smallest error was 0.000 inch of mercury in June, and the largest was -0.006 inch of mercury in January.

Figure 7. Differences in altimeter setting, in inches of mercury, between ASOS and the altimeter at the National Weather Service, Rochester, MN.
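
A simple tolerance check on monthly mean errors might look like the following hypothetical sketch. The monthly values are invented (though chosen to be consistent with the overall -0.003 inch average quoted above), and the 0.02 inch tolerance is an assumed value for illustration, not the actual reporting standard.

    # Hypothetical sketch: mean altimeter-setting error (ASOS minus the
    # NWS altimeter) checked against an assumed reporting tolerance.
    monthly_mean_err_inhg = {   # invented monthly mean errors, in Hg
        "Dec 1994": -0.004, "Jan 1995": -0.006, "Mar 1995": -0.002,
        "May 1995": -0.001, "Jun 1995": 0.000,
    }

    TOLERANCE_INHG = 0.02       # assumed tolerance, for illustration only

    overall = sum(monthly_mean_err_inhg.values()) / len(monthly_mean_err_inhg)
    print(f"overall mean error: {overall:+.3f} in Hg")
    for month, err in monthly_mean_err_inhg.items():
        status = "OK" if abs(err) <= TOLERANCE_INHG else "OUT OF TOLERANCE"
        print(f"{month}: {err:+.3f} in Hg  {status}")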

CEILINGS

Another vital element of the observation for the aviation community, if not the most important, is the height of the ceiling. A ceiling, as defined by the National Weather Service (DOC 1994), is "the height above ground level ascribed to the lowest opaque broken or overcast layer aloft, or the vertical visibility in a surface-based layer of obscuring phenomena that hides 10/10 of the sky." In layman's terms, a ceiling is the height above the ground of the base of the lowest layer of clouds when over half of the sky is covered. Pilots need to know the height of the clouds to judge whether they will be able to see the runway when landing. Air traffic controllers need to know ceiling heights as well, because they must direct and reroute air traffic when conditions fall below landing and takeoff "minimums."

The performance of ASOS compared with the ceilometer values was relatively good. For ceilings less than 1000 feet, the largest average error was 463 feet in June, and the smallest was 5 feet in March (Figure 8). For ceilings between 1000 and 3000 feet, the largest average error was 185 feet in March, and December had the smallest at 15 feet. For ceilings greater than 3000 feet the differences grew, but the reportable values are also spaced farther apart at those heights. June had an average error of 2147 feet, which is not great but is tolerable nonetheless, especially for very high ceilings. The smallest error was in December, with an average of 100 feet.

Figure 8. Difference in feet between reported ceilings measured by ASOS and the National Weather Service ceilometer. C is ceiling in feet above ground level.
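
Because reportable ceiling values are spaced farther apart at greater heights, it helps to look at the error relative to the ceiling height as well as in feet. The sketch below is hypothetical, with the height bands taken from the text and the sample ceilings invented.

    # Hypothetical sketch: ceiling comparison stratified into the height
    # bands used above, with the error also shown as a fraction of the
    # ceiling height.
    def ceiling_errors(pairs):
        """pairs: iterable of (asos_ft, ceilometer_ft) ceilings AGL."""
        bands = {"C < 1000": [], "1000 <= C <= 3000": [], "C > 3000": []}
        for asos_ft, ceil_ft in pairs:
            err = abs(asos_ft - ceil_ft)
            if ceil_ft < 1000:
                band = "C < 1000"
            elif ceil_ft <= 3000:
                band = "1000 <= C <= 3000"
            else:
                band = "C > 3000"
            bands[band].append((err, err / ceil_ft))
        for band, vals in bands.items():
            if vals:
                mean_ft = sum(e for e, _ in vals) / len(vals)
                mean_rel = sum(r for _, r in vals) / len(vals)
                print(f"{band}: mean error {mean_ft:.0f} ft "
                      f"({mean_rel:.0%} of ceiling height)")

    # Invented sample: a 2000-ft error at a 10000-ft ceiling is a larger
    # absolute error than 100 ft at 800 ft, but comparable relatively.
    ceiling_errors([(900, 800), (2200, 2000), (12000, 10000)])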

PRECIPITATION

The ASOS at Rochester has not yet received a tipping bucket rain gauge, so precipitation amounts could not be compared. Otherwise, ASOS did well at determining whether or not precipitation was falling and what type it was, and it rarely detected precipitation falsely.

When precipitation was falling, ASOS detected it 91 percent of the time. During the test, ASOS was slow to initially detect precipitation and generally ended the precipitation sooner than the human observers did. This, however, is not surprising: humans can actually see precipitation falling, while ASOS must have precipitation fall on, or through, its sensors to detect it. The effect is exacerbated during very light precipitation events. ASOS falsely detected precipitation only 1.23 percent of the time, and some of these cases may simply have been the sensors detecting the beginning or ending of precipitation before the observer noticed it.

Precipitation type was also detected with extreme accuracy: the sensors identified the correct type of precipitation 99.9 percent of the time. This is crucial, especially during mixed precipitation events, and it played a big part in a test of the freezing rain sensor and software conducted by Donald Cameron of the National Weather Service's Special Operations Office (DOC 1996). During the test, from December 1994 to April 1995, there were ten freezing rain events. ASOS detected the freezing rain in five of those events and reported other types of precipitation in four others. Quite possibly, the 0.1 percent of cases in which ASOS did not correctly identify the precipitation type occurred when the precipitation type was changing and the temperature was near freezing.
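
Detection statistics of the kind quoted above can be computed from paired hourly yes/no records, treating the human observation as truth. The sketch below is hypothetical; in particular, using the non-precipitating hours as the denominator for the false detection rate is an assumption, since the paper does not state how the 1.23 percent figure was derived.

    # Hypothetical sketch: precipitation detection statistics from paired
    # hourly (ASOS, human) yes/no records, human taken as truth.
    def precip_stats(records):
        """records: iterable of (asos_detected, human_detected) booleans."""
        hits = misses = false_alarms = quiet = 0
        for asos, human in records:
            if human and asos:
                hits += 1
            elif human:
                misses += 1
            elif asos:
                false_alarms += 1
            else:
                quiet += 1
        detection_rate = hits / (hits + misses)
        false_detect_rate = false_alarms / (false_alarms + quiet)
        return detection_rate, false_detect_rate

    # Invented sample sized to mimic the rates quoted above.
    sample = ([(True, True)] * 91 + [(False, True)] * 9
              + [(True, False)] * 1 + [(False, False)] * 99)
    pod, fdr = precip_stats(sample)
    print(f"detected {pod:.0%} of precipitation; false detections {fdr:.2%}")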

CONCLUSION

Although the ASOS at Rochester had some isolated problems, it performed with acceptable accuracy overall. A malfunctioning dew point sensor was found during this study and was corrected. During preliminary tests and the period covered by this study, the most persistent problem involved the visibility measurements. Knowing that visibilities around three miles had the greatest error, users of visibility data can take that into consideration for aviation purposes. Documentation of ASOS's weaknesses should help technicians and engineers focus on what needs to be improved.

Meteorologists greatly value data obtained from surface observations. The development of ASOS has increased the number of observations manyfold. These observations can improve weather forecasting and provide vital information to the aviation user community, provided the strengths, weaknesses, and limitations of ASOS are understood.

ACKNOWLEDGEMENTS

Thanks to the staff at WSO Rochester for providing information and keeping daily logs to document the nuances of ASOS. A special thank you goes to Richard Naistat, SOO at WSFO Twin Cities/Chanhassen, for all of his time, effort, and patience. His comments and expertise saved time and were much appreciated.

REFERENCES

Department of Commerce, 1996: Automated Surface Observing System - Freezing Rain Sensor/Software Operational Test and Evaluation Summary. NOAA, NWS, OSO/Systems Integration Division/Field Systems Branch, Silver Spring, MD, 12 pp.

____________, 1994: National Weather Service Observing Handbook No. 7, Surface Observations. Government Printing Office (GSA), 7-1.

____________, 1985: National Weather Service Engineering Handbook No. 8, Surface Equipment. Government Printing Office (GSA), 1-4.

Helgeson, E., 1994: ASOSMAN ASOS-Manual Observation Comparison. DOC/NOAA/NWS Central Region, Computer Programs and Problems No. 13MC, 17 pp.

NOAA, FAA, and US Navy, 1992: Automated Surface Observing System - User's Guide. Government Printing Office (GSA), June, 57 pp.

Nouhan, V.J., 1995: Warm Season Comparisons Between the ASOS and Old HO83 Temperature Sensors at Goodland, Kansas. Central Region Applied Research Paper 15-09, DOC/NOAA/NWS, Central Region Headquarters, Scientific Services Division, Kansas City, MO.

 

