Makoto E. Moore
National Weather Service Office
Routine surface weather observations are taken manually at 950 sites across the United States operated by the National Weather Service (NWS), the Federal Aviation Administration (FAA), and the Department of Defense (DOD). Taking observations can become very time-consuming, especially during severe weather, when rapidly changing conditions often require frequent observational updates. Yet it is precisely during severe weather that NWS forecasters' attention must be directed toward analyzing the convective environment and issuing watches, warnings, and statements.
The Automated Surface Observing System (ASOS) is dedicated to the task of taking surface observations in any weather environment. It provides data on barometric pressure, temperature, dew point, wind direction and speed, visibility, cloud cover up to 12,000 feet AGL, and precipitation type, intensity, and amount. ASOS is scheduled for implementation at more than 500 observing sites, with contract options for over 800 more units (U.S. Dept. of Commerce 1992).
The purpose of this paper is to compare ASOS-generated and manual observations during a severe weather event to test whether ASOS can provide timely, accurate observational information. If ASOS can provide high-quality observations, the time saved would allow weather observers (Hydrometeorological Technicians (HMTs) and Meteorologist Interns) to concentrate on the adverse weather at hand. The manual data were recorded by the trained staff at WSO Lexington, Kentucky. During this event, ASOS was left unattended and was not augmented in any way, in order to gauge its independent performance.
It should be noted that manual observations are subject to human error and cannot be construed as correct 100 percent of the time. The observations taken by the staff at Lexington are offered only as a benchmark, since the two observational systems use differing processes when reporting the weather. For the "objective" elements, such as pressure, air temperature, dew point, and wind, both ASOS and the observer use a fixed-location, time-averaging technique. In fact, the instruments for both the ASOS and manual observations of these elements reside at the same location (Runway 4) at Bluegrass Airport. For the "subjective" elements, however, such as sky condition, visibility, and present weather, observers use a fixed-time, spatial-averaging technique, whereas ASOS uses a fixed-location, time-averaging technique (ASOS Users Guide 1992).
It is also important to note that although the ASOS system used in this study was deemed operationally sufficient, it was still a noncommissioned "test" system. Upon commissioning, an ASOS system would be monitored 24 hours a day by a trained staff, who would augment the observations for sensor malfunctions and for weather parameters the system currently cannot detect, such as thunder, hail, freezing rain, tornadoes, virga, and volcanic ash (ASOS Guide for Pilots 1993).
This paper contains a brief description of the severe weather scenario and a comparison of the two sets of observations. Conclusions of ASOS performance will be drawn from the similarities and differences of this comparison.
On April 15, 1994, a strong cold front moved across Kentucky and spawned a squall line of intense thunderstorms with embedded small tornado touchdowns. Two tornado watches were issued that morning, covering an extensive portion of Kentucky, southern Indiana, southern Ohio, western Pennsylvania, and West Virginia. This area included the 22 counties that comprised WSO Lexington's County Warning Area (CWA) at that time. The weather office in Lexington issued warnings for 16 counties: Severe Thunderstorm Warnings for 15 and a Tornado Warning for 1. Of the 16 counties warned, 12 were verified with numerous reports of large hail (3/4 inch or larger) and damaging winds (50 knots or greater). The storms also produced torrential rainfall, frequent lightning, and funnel cloud sightings.
MANUAL VERSUS ASOS OBSERVATIONS
WSO Lexington is located at Bluegrass Airport, on the western end of Lexington. Although storms raged throughout Central Kentucky, the observations used in this comparison must reflect the time when the severe weather directly affected the airport. Thunder was heard by the staff beginning at 11:08 a.m. EST as a line of intense storms drove through the area. The first wave of storms ended at 12:41 p.m. EST. A second, smaller line of storms then moved through the immediate vicinity, with thunder reported from 1:31 p.m. EST to 2:44 p.m. EST. As a result, the period of observations examined in this study was from 10:50 a.m. EST to 2:50 p.m. EST (Figure 1).
One difference between the observations can occur when comparing cloud coverage and cloud heights. While manual observations rely on a Laser Beam Ceilometer (LBC) to report cloud heights, the ASOS system uses a twofold method: ASOS also uses an LBC, but then feeds these data into an algorithm that averages the cloud coverage over time (Clark 1994). Cloud heights reported in the two sets of observations corresponded well throughout this event. However, sky coverage varied slightly during the first 30-40 minutes of the comparison, i.e., the first five ASOS observations. Despite the differences, both sets of observations reported a ceiling. One notable difference occurred in the 1056 EST observation: while the observer reported an overcast sky, ASOS reported scattered clouds (Figure 1).
A few discrepancies showed up between the two observational systems when comparing visibility and present weather data. Immediately noticeable is that ASOS did not report thunderstorms. Much work has been done to develop an independent Lightning Detection Sensor (LDS) system that would work hand-in-hand with radar data, allowing ASOS to provide an integrated radar-LDS thunderstorm location and intensity reporting product. Until then, ASOS must be augmented for thunder. Also missing was any mention of fog, which was reported in five of the manual observations. Reported rain intensities from the two systems were fairly similar throughout the comparison and did not pose any problems. In fact, the ASOS rain gauge worked superbly, recording exactly the same daily total as the manual gauge. A crucial difference appeared, however, when visibilities were compared. The ASOS visibility sensors are designed with an allowable difference of +/- 1 mile for visibilities up to 4 miles, and +/- 2 miles for visibilities of 5 miles or greater. The comparison showed substantial variations in eight observations, more than 50 percent of the time. These variations surpassed the allowable differences; in the 1256 EST observation, the ASOS visibility was off by 5 miles. Some of these differences may have been due to rapidly changing weather conditions and to the manual and ASOS observations not being recorded at exactly the same time.
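The allowable-difference rule described above can be expressed as a simple check. The sketch below is illustrative only; the function name and the choice of the manual report as the reference value are assumptions, not part of any official NWS algorithm:

```python
def within_asos_tolerance(asos_vis_mi, manual_vis_mi):
    """Return True if an ASOS visibility reading falls within the
    allowable difference: +/- 1 mile when the reference visibility
    is 4 miles or less, +/- 2 miles when it is 5 miles or greater.
    Illustrative sketch; the manual report serves as the reference."""
    allowed = 1.0 if manual_vis_mi <= 4 else 2.0
    return abs(asos_vis_mi - manual_vis_mi) <= allowed

# The 1256 EST case: an ASOS visibility off by 5 miles fails the check.
print(within_asos_tolerance(2, 7))   # False
print(within_asos_tolerance(6, 7))   # True
```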
When comparing the remaining variables (temperature, dew point temperature, wind speed and direction, and altimeter setting), the two systems corresponded fairly well, aside from a few small problems. Altimeter readings were essentially identical, both wind speeds and directions were fairly consistent, and ambient air temperatures were also similar during the study. However, a malfunctioning dew point sensor on ASOS resulted in large differences between the ASOS and manual dew point readings. Manual observations recorded a dew point of 65°F before the thunderstorms and 56°F-57°F once the rainfall began, while ASOS reported a dew point in the 28°F-33°F range throughout the event (Figure 1). This caused a falsely low relative humidity to be displayed, and it is the reason fog was not reported in the present weather section of the ASOS observations. If the ASOS system used in this study had been augmented, the defective sensor would have been compensated for; upon commissioning, augmentation and prompt maintenance would rectify this type of problem.
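The effect of the faulty dew point sensor on the displayed relative humidity can be illustrated with the Magnus approximation for saturation vapor pressure. The formula is a standard meteorological approximation, not taken from the paper, and the specific coefficients below are one common variant:

```python
import math

def saturation_vapor_pressure(temp_c):
    """Saturation vapor pressure (hPa) via the Magnus approximation."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def relative_humidity(temp_c, dewpoint_c):
    """Relative humidity (%) from air temperature and dew point."""
    return 100.0 * (saturation_vapor_pressure(dewpoint_c)
                    / saturation_vapor_pressure(temp_c))

# Manual reading: 65°F (18.3°C) air temperature, 57°F (13.9°C) dew point.
# Faulty ASOS reading: same temperature, ~30°F (-1.1°C) dew point.
rh_manual = relative_humidity(18.3, 13.9)   # roughly 75%
rh_asos   = relative_humidity(18.3, -1.1)   # roughly 27% -- far too dry for fog
```

The erroneously low dew point drives the computed humidity down by nearly 50 percentage points, which is consistent with the system never reporting fog.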
Finally, the Remarks sections of the two observation sets showed significant differences and variety, due in large part to a new set of rules conceived by the NWS, FAA, and DOD for the ASOS system. In this event, both sets faithfully reported peak winds (PK WND) and the beginning of precipitation (RBxx). In addition, ASOS appended hourly cumulative precipitation (PCPN xxxx) to the end of every observation and reported a pressure jump (PRJMP) that was missing from the manual remarks. However, the manual reports included runway visual range (RVR) data that ASOS did not, and storm and lightning location and movement information also failed to appear in the ASOS reports, since ASOS is unable to independently detect thunder and lightning.
The major difference between the two data sets lies in the process by which the observations are taken. ASOS has all of its observational sensors at Runway 4 of Bluegrass Airport. Runway 4 is the designated Touchdown Zone (TDZ) and thus the best location to serve the aviation community. The trained observer also uses the sensors located at Runway 4 for the manual observations; in addition, visibility and sky coverage are determined manually from the roof of the main terminal building, located one mile northeast of Runway 4. However, where an observer can scan the entire sky and report on the conditions in each quadrant, ASOS can only sample what is directly above its sensors. To address this limitation, ASOS takes numerous observations and feeds the readings into a time-averaging algorithm. ASOS receives sky condition data from its Cloud-Height Indicator every 30 seconds, and every minute it processes the previous 30 minutes of data to detect any changes in sky coverage. During rapidly changing weather conditions, this time-averaging technique can result in a reporting lag of 2 to 10 minutes. If only a quick glance is afforded to an ASOS product, such as the Video Display Unit (VDU) or the one-minute observation telephone line, the lagged report could be unrepresentative of the actual weather conditions at a given location.
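The fixed-location time-averaging behavior described above can be sketched as a rolling window of ceilometer samples. The coverage thresholds, category names, and class structure below are illustrative assumptions; the actual ASOS sky condition algorithm is considerably more elaborate (it also bins cloud heights, for example):

```python
from collections import deque

SAMPLE_INTERVAL_S = 30                           # ceilometer sample every 30 seconds
WINDOW_SAMPLES = 30 * 60 // SAMPLE_INTERVAL_S    # 30-minute window = 60 samples

def coverage_category(cloud_fraction):
    """Map a fraction of cloudy samples to a coverage category.
    These thresholds are illustrative, not the official ASOS values."""
    if cloud_fraction < 0.05:
        return "CLR"
    elif cloud_fraction < 0.5:
        return "SCT"
    elif cloud_fraction < 0.9:
        return "BKN"
    return "OVC"

class SkyConditionAverager:
    """Rolling 30-minute window of ceilometer cloud hits, summarized
    once per minute -- a sketch of fixed-location time averaging."""
    def __init__(self):
        self.samples = deque(maxlen=WINDOW_SAMPLES)

    def add_sample(self, cloud_detected):
        """Record one 30-second ceilometer sample (cloud hit or not)."""
        self.samples.append(bool(cloud_detected))

    def report(self):
        """Summarize the current window as a coverage category."""
        if not self.samples:
            return "CLR"
        return coverage_category(sum(self.samples) / len(self.samples))
```

Because the window retains up to 30 minutes of history, a sudden clearing still leaves many cloudy samples in the queue, which is the source of the 2 to 10 minute reporting lag noted above: after 30 minutes of overcast followed by 10 minutes of clear sky, this sketch would still report broken clouds.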
This study supports the well-documented need for occasional augmentation of the ASOS system by a trained staff, which is one disadvantage of the system. Current sensor technology prevents ASOS from achieving its ultimate goal of being a standalone system: ASOS cannot detect thunder or hail, and it even has trouble with freezing rain. An observer is required to edit, or augment, ASOS when these events occur. To be truly helpful during severe weather, ASOS needs to be able to record these phenomena independently. However, upon commissioning, a separate staff of observers would be present to augment ASOS, making the augmentation problem transparent to forecasters busy with severe weather.
An advantage of the ASOS system is that its sole purpose is to take objective observations. It has no other job; therefore, ASOS can be much more attentive to weather conditions than a human staff, especially during severe weather. In fact, since manual observations of visibility and sky coverage are subjective in nature, there may be some error in this human element as well. In the four-hour span of this one event, ASOS generated 15 observations compared to 13 manual observations, roughly 15 percent more. In addition, ASOS provided extra information that was very helpful, including the pressure jump that the Lexington staff had overlooked and the hourly precipitation data.
The purpose of this study was to determine whether the ASOS system could perform well during a severe weather event. Remembering that the ASOS used in this project was an uncommissioned system, the comparison of manual and ASOS-generated observations has shown that neither system is perfect. It has also pointed out a few system-specific differences. However, these differences should be seen as alternative methods of achieving comparable results.
Due to the current state of technology, the process by which ASOS generates surface observations is necessary. With time, advanced software and hardware will be made available. These advancements will improve ASOS, helping it to become a standalone system and the future of weather observations.
ACKNOWLEDGMENTS

I would like to thank Ted Funk, SOO at WSFO Louisville, and Sarah McLeod, DAPM at WSFO Louisville, for their many comments and suggestions during the numerous phases of this paper. Their help and guidance were greatly appreciated.
REFERENCES

Clark, P., 1994: Automated Surface Observations. U.S. Department of Commerce, NOAA, NWS, Government Printing Office, December, 5 pp.
U.S. Dept. of Commerce, NOAA, 1993: ASOS Guide for Pilots. Government Printing Office, April, 16 pp.
____________, 1992: ASOS Users Guide. Government Printing Office, June, 74 pp.
____________, 1992: ASOS Fact Sheet. Government Printing Office, Office of Public Affairs, July.
Figure 1. A listing of manual and ASOS-generated observations taken during a severe weather event on 15 April 1994. A T before the ASOS observation type indicates that the system was in test mode: TA denotes a test record observation, TP a test special observation, and TS a test record special observation.