Automated Surface Observing System (ASOS) Performance During a Severe Weather Event: A Case Study

Makoto E. Moore
National Weather Service Office
Lexington, Kentucky


Routine surface weather observations are taken manually at 950 sites across the United States, operated by the National Weather Service (NWS), the Federal Aviation Administration (FAA), and the Department of Defense (DOD). Taking observations can become very time-consuming, especially during severe weather, when rapidly changing conditions often require frequent observational updates. Yet it is precisely during severe weather that NWS forecasters' attention must be directed toward analysis of the convective environment and the issuance of watches, warnings, and statements.

The Automated Surface Observing System (ASOS) is designed to be dedicated to the task of taking surface observations in any weather environment. It provides data on barometric pressure, temperature, dew point, wind direction and speed, visibility, cloud cover up to 12,000 feet AGL, and precipitation type, intensity, and amount. ASOS is scheduled for implementation at more than 500 observing sites, with contract options for more than 800 additional units (U.S. Dept. of Commerce 1992).

The purpose of this paper is to compare ASOS-generated and manual observations during a severe weather event, to test whether ASOS can provide timely, accurate observational information. If ASOS can provide high-quality observations, the time saved would allow weather observers (Hydrometeorological Technicians (HMTs) and Meteorologist Interns) to concentrate on the adverse weather at hand. The manual data were recorded by the trained staff at WSO Lexington, Kentucky. During this event, ASOS was left unattended and was not augmented in any way, in order to gauge its independent performance.

It should be noted that manual observations are subject to human error and cannot be assumed correct 100 percent of the time. The observations taken by the staff at Lexington are offered as a benchmark only, since the two observational systems use differing processes when reporting the weather. For the "objective" elements such as pressure, air temperature, dew point, and wind, both ASOS and the observer use a fixed-location, time-averaging technique. In fact, the instruments used for both the ASOS and manual observations of these elements reside at the same location (Runway #4) at Bluegrass Airport. For the "subjective" elements, however, such as sky condition, visibility, and present weather, observers use a fixed-time, spatial-averaging technique, while ASOS again uses a fixed-location, time-averaging technique (ASOS Users Guide 1992).

It is also very important to note that although the ASOS system in this study was deemed operationally sufficient, it was still a noncommissioned "test" system. Upon commissioning, an ASOS system would be monitored 24 hours a day by trained staff, who would augment its reports for sensor malfunctions and for weather parameters the system currently cannot detect, such as thunder, hail, freezing rain, tornadoes, virga, and volcanic ash (ASOS Guide for Pilots 1993).

This paper contains a brief description of the severe weather scenario and a comparison of the two sets of observations. Conclusions about ASOS performance are drawn from the similarities and differences found in this comparison.


On April 15, 1994, a strong cold front moved across Kentucky and spawned a squall line of intense thunderstorms with embedded small tornado touchdowns. Two tornado watches were issued that morning, covering an extensive portion of Kentucky, southern Indiana, southern Ohio, western Pennsylvania, and West Virginia. This area included the 22 counties that comprised WSO Lexington's County Warning Area (CWA) at that time. The weather office in Lexington issued warnings for 16 counties: Severe Thunderstorm Warnings for 15 counties and a Tornado Warning for 1. Of the 16 counties warned, 12 were verified with numerous reports of large hail (3/4" or larger) and damaging winds (50 knots or greater). The numerous storms of this event also produced torrential rainfall, frequent lightning, and funnel cloud sightings.


WSO Lexington is located at Bluegrass Airport, on the western end of Lexington. Although storms raged throughout Central Kentucky, the observations used in this comparison must reflect the time when the severe weather directly affected the airport. Thunder was first heard by the staff at 11:08 a.m. EST as a line of intense storms drove through the area. The first wave of storms ended at 12:41 p.m. EST. A second, smaller line of storms then moved through the immediate vicinity, with thunder reported from 1:31 p.m. EST to 2:44 p.m. EST. As a result, the period of observations examined in this study runs from 10:50 a.m. EST to 2:50 p.m. EST (Figure 1).

One difference in the observations can occur when comparing cloud coverage and cloud heights. While manual observations rely on a Laser Beam Ceilometer (LBC) to report cloud heights, the ASOS system uses a twofold method: ASOS also uses an LBC, but then feeds these data into an algorithm that averages the cloud coverage over time (Clark 1994). Cloud heights reported in the two sets of observations corresponded well throughout this event. However, sky coverage varied slightly for the first 30-40 minutes of the comparison, i.e., the first five ASOS observations. Despite the differences, both sets of observations reported a ceiling. One notable difference occurred on the 10:56 a.m. EST observation, when the sky was overcast but ASOS reported only scattered clouds (Figure 1).

A few discrepancies showed up between the two observational systems when comparing visibility and present weather data. Immediately noticeable is that ASOS did not report thunderstorms. Much work has been done to develop an independent Lightning Detection Sensor (LDS) system that would work hand-in-hand with radar data, allowing ASOS to provide an integrated radar-LDS thunderstorm location and intensity product. Until then, ASOS must be augmented for thunder. Also missing was any mention of fog, which was reported in five of the manual observations. Reported rain intensities from the two systems were fairly similar throughout the comparison and did not pose any problems. In fact, the ASOS rain gauge worked superbly and recorded exactly the same daily total as the manual gauge. A crucial difference appeared, however, when visibilities were compared. The ASOS visibility sensors are built to be accurate to, and have an allowable difference of, +/- 1 mile for visibilities up to 4 miles and +/- 2 miles for visibilities of 5 miles or greater. The comparison showed substantial variation in eight observations, more than 50 percent of the time. These variations exceeded the allowable differences, and on the 12:56 p.m. EST observation the ASOS visibility was off by 5 miles. Some of these differences may have been due to rapidly changing weather conditions and to the manual and ASOS observations not being recorded at exactly the same time.
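The allowable differences above can be expressed as a simple check. The sketch below is only an illustration of the stated tolerances, not any operational ASOS code; the function name is chosen here for clarity.

```python
def within_asos_tolerance(manual_vis_mi, asos_vis_mi):
    """Return True when the ASOS visibility falls within the allowable
    difference: +/- 1 mile for manual visibilities up to 4 miles,
    +/- 2 miles for visibilities of 5 miles or greater."""
    allowed = 1.0 if manual_vis_mi <= 4 else 2.0
    return abs(asos_vis_mi - manual_vis_mi) <= allowed

# The 1256 EST pair: manual 5 miles vs. ASOS 10+ miles -> out of tolerance
print(within_asos_tolerance(5, 10))   # False
print(within_asos_tolerance(7, 5))    # True (within the 2-mile allowance)
```

Applied to the event, eight of the observation pairs would fail this check, including the 12:56 p.m. EST pair above.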

When comparing the remaining variables of temperature, dew point temperature, wind speed and direction, and altimeter reading, the two systems corresponded fairly well, aside from a few small problems. Altimeter readings were essentially identical. Both wind speeds and directions were fairly consistent. Ambient air temperatures were also similar during the study. However, a malfunctioning dew point sensor on ASOS resulted in large differences between the ASOS and manual dew point readings. Manual observations recorded a dew point of 65°F before the thunderstorms and 56°F - 57°F once the rainfall began, while ASOS reported a dew point in the 28°F - 33°F range throughout the event (Figure 1). This caused a falsely low relative humidity to be displayed, and it is the reason fog was not reported in the present weather section of the ASOS observations. If the ASOS system used in this study had been augmented, the defective sensor readings would have been corrected. Upon commissioning, augmentation and quick maintenance would rectify this type of problem.
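The link between the failed dew point sensor and the missing fog reports can be illustrated numerically. The sketch below uses the standard Magnus approximation for saturation vapor pressure, a textbook relation not drawn from the ASOS documentation:

```python
import math

def saturation_vapor_pressure(t_c):
    """Magnus approximation for saturation vapor pressure (hPa),
    with temperature in degrees Celsius."""
    return 6.112 * math.exp(17.67 * t_c / (t_c + 243.5))

def relative_humidity(temp_f, dewpoint_f):
    """Relative humidity (%) from temperature and dew point in Fahrenheit."""
    t_c = (temp_f - 32) * 5 / 9
    td_c = (dewpoint_f - 32) * 5 / 9
    return 100 * saturation_vapor_pressure(td_c) / saturation_vapor_pressure(t_c)

# Manual reading during the rain, 57 F / 56 F: near saturation, consistent with fog
print(relative_humidity(57, 56) > 90)   # True
# ASOS with the failed sensor, 57 F / 30 F: far too dry for fog to be diagnosed
print(relative_humidity(57, 30) < 50)   # True
```

With a reported dew point near 30°F against an air temperature of 57°F, the derived relative humidity sits well below saturation, so no automated fog report could be expected.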

Finally, the Remarks sections from the two observation sets showed significant differences and variety, due in large part to the new set of rules conceived by the NWS, FAA, and DOD for the ASOS system. In this event, both sets faithfully reported peak winds (PK WND) and the beginning of precipitation (RBxx). In addition, ASOS displayed hourly cumulative precipitation (PCPN xxxx) at the end of every observation, as well as a pressure jump (PRJMP) that was missing from the manual remarks. However, the manual reports included runway visual range (RVR) data that ASOS did not. Storm and lightning location and movement information also failed to show up in the ASOS reports, since ASOS is unable to independently detect thunder and lightning.


The major difference between the two data sets is the process by which the observations are taken. ASOS has all of its observational sensors on Runway 4 at Bluegrass Airport. Runway 4 is the designated Touchdown Zone (TDZ) and the best spot to serve the aviation community. The trained observer also uses the sensors located on Runway 4 for the manual observations. In addition, visibility and sky coverage are determined manually from the roof of the main terminal building, located one mile northeast of Runway 4. However, where an observer can scan the whole sky and report on the conditions in each quadrant, ASOS can only sample what lies directly above its sensors. To address this limitation, ASOS takes numerous readings and feeds them into a time-averaging algorithm. ASOS receives sky condition data from its Cloud-Height Indicator every 30 seconds, and every minute it processes the previous 30 minutes of data to detect any changes in sky coverage. During rapidly changing weather conditions, this time-averaging technique can produce a reporting lag of 2 to 10 minutes. If only a quick glance is afforded to an ASOS product, such as the Video Display Unit (VDU) or the one-minute observation telephone line, the lagged report could be unrepresentative of the actual weather conditions at a given location.
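The time-averaging behavior described above can be sketched as a rolling window. The class name, category thresholds, and equal weighting below are simplified illustrations (the operational ASOS algorithm is more elaborate), but the sketch shows why a sudden sky change takes minutes to appear in the report.

```python
from collections import deque

class SkyConditionAverager:
    """Simplified sketch: the ceilometer contributes one cloud-hit sample
    every 30 seconds, and the fraction of hits over the previous 30 minutes
    is mapped to a reportable sky-cover category."""

    WINDOW = 60  # 30 minutes of 30-second samples

    def __init__(self):
        self.samples = deque(maxlen=self.WINDOW)

    def add_sample(self, cloud_detected):
        self.samples.append(1 if cloud_detected else 0)

    def sky_cover(self):
        if not self.samples:
            return "CLR"
        fraction = sum(self.samples) / len(self.samples)
        if fraction <= 0.05:
            return "CLR"
        if fraction <= 0.50:
            return "SCT"
        if fraction <= 0.87:
            return "BKN"
        return "OVC"

# Thirty minutes of clear sky, then a solid overcast moves in:
avg = SkyConditionAverager()
for _ in range(60):   # 30 min of clear samples
    avg.add_sample(False)
for _ in range(20):   # overcast for the next 10 min
    avg.add_sample(True)
print(avg.sky_cover())  # "SCT" -- the averaged report still lags the real sky
```

Even after ten minutes of solid overcast, the 30-minute window is still dominated by the earlier clear samples, which is exactly the 2-to-10-minute lag described above.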

This study supports the well-known need for occasional augmentation of the ASOS system by a trained staff, and this is one disadvantage of the system. The current state of sensor technology keeps ASOS from achieving its ultimate goal of being a standalone system. Until the necessary sensor advances are made, ASOS cannot detect thunder or hail and even has trouble with freezing rain, so an observer is required to edit, or augment, ASOS when these events occur. To be truly helpful during severe weather, ASOS needs to be able to record these phenomena independently. However, upon commissioning, a separate staff of observers would be present to augment ASOS, making the augmentation problem transparent to forecasters busy with severe weather.

An advantage of the ASOS system is that its sole purpose is to take objective observations. It has no other job; therefore, ASOS is much more attentive to weather conditions than a human staff, especially during severe weather. In fact, since manual observations are subjective in nature with respect to visibility and sky coverage, there may be some error in this human element. In the four-hour span of this one event, ASOS generated 15 observations compared to 13 manual observations, an increase of about 15 percent. In addition, ASOS provided extra information that was very helpful, including the pressure jump that the Lexington staff had overlooked, and the hourly precipitation data.


The purpose of this study was to determine whether the ASOS system could perform well during a severe weather event. Bearing in mind that the ASOS used in this project was an uncommissioned system, the comparison of manual observations to ASOS-generated observations has shown that neither system is perfect. It has also pointed out a few system-specific differences. However, these differences should be seen as alternative methods of achieving comparable results.

Given the current state of technology, the process by which ASOS generates surface observations is a necessary one. In time, advanced software and hardware will become available, improving ASOS and helping it become a standalone system and the future of weather observations.


I would like to thank Ted Funk, SOO at WSFO Louisville, and Sarah McLeod, DAPM at WSFO Louisville, for their many comments and suggestions during the numerous phases of this paper. Their help and guidance were greatly appreciated.


Clark, P., 1994: Automated Surface Observations. U.S. Department of Commerce, NOAA, NWS, Government Printing Office, December, 5pp.

U.S. Dept. of Commerce, NOAA, 1993: ASOS Guide For Pilots. Government Printing Office, April, 16pp.

____________, 1992: ASOS Users Guide. Government Printing Office, June, 74pp.

____________, 1992: ASOS Fact Sheet. Office of Public Affairs, Government Printing Office, July.


Manual Observations


SA 1050 M33 BKN 65 OVC 7 067/77/65/2214/975
SP 1113 M30 BKN 65 OVC 1TRW- 3220G36/984/R04VR26V60+ CB MOVG E TB08
SP 1116 7 SCT M30 OVC 1/2TRW+ 3216G36/983/R04VR26V60+ CB MOVG NE FRQT LTGICCC PK WND 3236/09
SP 1125 6 SCT 17 SCT M30 OVC 1TRW+ 3110G20/986/R04VR35V60+ CB MOVG NE T ALQDS FRQT LTGICCC
RS 1150 M34 OVC 3TRW-F 102/57/56/3308G16/984/RB05 TB08 T ALQDS MOVG E OCNL LTGCGICCC PRESRR
SP 1243 M36 BKN 65 OVC 5RW-F 2004/985/TE41 MOVD E-NE
SA 1250 M36 BKN 65 BKN 90 OVC 5RW-F 105/58/57/1908/985/TE41 MOVD E-NE
SP 1332 M75 OVC 6TRW- 2008/988/TB31 T ALQDS
RS 1350 M33 BKN 75 OVC 3TRW 114/58/57/1909/988/TB31 T ALQDS
SP 1356 M33 BKN 75 OVC 2TRW 1911/989
SP 1444 30 SCT M80 OVC 3RW- 2012/987/TE44
SA 1450 30 SCT M80 OVC 3RW-F 109/58/57/1913/986/TE44


ASOS Observations


TA 1056 35 SCT 65 SCT 10+ 072/75/30/1915G25/976
TP 1114 10 SCT 28 SCT M36 BKN 3R- 102/59/30/3123Q36/984/RB08 PCPN 0008 PK WND 3236/1107 PRESRR
TP 1116 8 SCT 26 SCT M34 BKN 13/4R+ 099/58/30/3115Q36/983/RB08 PCPN 0017 PK WND 3236/1107 PRESRR
TP 1120 6 SCT 18 SCT M33 BKN 1R+ 104/56/29/3016G36/985/RB08 PCPN 0032 PK WND 3236/1107 PRESRR
TP 1123 6 SCT M14 BKN 34 BKN 1R+ 106/56/29/2914G21/985/ BKN V SCT RB08 PCPN 0035 PK WND 3236/1107 PRJMP 7/1101/1112
TP 1127 6 SCT M14 BKN 100 OVC 11/2R+ 109/56/29/3009G21/986/RB08 PCPN 0042 PK WND 3236/1107 PRJMP 7/1101/1112
TP 1136 8 SCT M22 BKN 33 OVC 3R 108/56/28/3107/986/RB08 PCPN 0048 PK WND 3236/1107 PRJMP 7/1101/1112
TP 1148 M34 OVC 7R 096/57/30/3409/982/RB08 PCPN 0050 PK WND 3236/1107 PRJMP 7/1101/1112
TA 1156 M34 OVC 4R+ 107/57/31/2907/986/RB08 PCPN 0053 PK WND 3236/1107 PRJMP 7/1101/1112
TP 1227 M34 OVC 21/2R 104/57/29/0000/985/PCPN 0009 PRJMP 4/1150/1159
TP 1237 M38 OVC 4R 098/57/32/0000/983/PCPN 0011 PRJMP 4/1150/1159
TA 1256 M40 BKN 90 OVC 10+R- 104/57/30/1508/985/BKN V SCT PCPN 0012 PRJMP 4/1150/1159
TS 1356 M26 BKN 75 OVC 3R+ 116/57/31/1512/989/PCPN 0017 PRESRR
TP 1438 M30 BKN 80 OVC 5R 111/57/33/1512/987/PCPN 0016
TA 1456 M32 BKN 80 OVC 5R 112/57/33/1512/987/PCPN 0020

Figure 1. A listing of manual and ASOS-generated observations taken during a severe weather event on 15 April 1994. The leading T in the ASOS observation types indicates that the system was in test mode: TA denotes a test record observation, TP a test special observation, and TS a test record special observation.

