NOAA Weather Radio Feedback for the
NWS-La Crosse Listening Area

(conducted December 2002)

Todd C. Rieck
Meteorologist
National Weather Service
La Crosse, WI

I. Introduction

In December 2002, the National Weather Service (NWS) in La Crosse, WI requested feedback from listeners of the 8 NOAA Weather Radio (NWR) transmitters in its service area (the Decorah site was not in operation at the time of this request). Since a similar feedback effort in 2000, La Crosse has added 4 new transmitters, while transferring broadcast responsibility for WWF-40 in central Wisconsin to the Green Bay Weather Office. Listeners could submit their opinions on the programming through an online form located on the NWS-La Crosse website, or via regular mail.

There were several objectives to this project: 1) to ascertain the quality of the broadcasts and the products themselves, 2) to compare these results to a similar feedback effort conducted in 2000, 3) to determine how understandable the Console Replacement System's (CRS) computerized voices were, and 4) to acquire background information on who the main listeners of the broadcasts are. The listeners were asked to fill out an extensive form broken up into 4 main sections: background information, routine programming, severe weather programming, and other (miscellaneous) questions.

The response was good. A total of 138 listeners responded (Table 1), with the vast majority (120 listeners) using the feedback interface provided on the webpage. The Winona transmitter slaves off of La Crosse (uses the same feed) and the Richland Center transmitter slaves off of Prairie du Chien; for the purposes of this paper, their results are combined with those of the transmitter they slave off of.



WXJ-86/KGG-95 WXK-41 WWG-564 KXI-68 WWG-86/WWG-89 KZZ-77 Total
60 58 7 6 6 1 138

Table 1. Number of responses per transmitter.


The author acknowledges that there are limitations to a feedback endeavor such as this. However, the effort was focused on gathering as much listener response as possible. It is not meant to be a scientific study, but rather a fact-gathering effort to ascertain the users' opinions and impressions of those broadcasts. Getting the most up-to-date and relevant weather information out to the public is one of the highest priorities of the NWS, and information obtained through these efforts will enable the NWS La Crosse to move forward toward this goal.


II. Results

A. Background Information

Background information on our listeners was collected to get a better understanding of who our main listeners were.

A wide variety of listeners responded, with varying occupations and ages. The median age fell in the 40-54 year old range (Chart 1). Occupational responses varied widely, ranging from pastors to river towboat captains to retirees.

Chart 1. Age of listeners.



As for their listening habits, the vast majority answered that they listen 1-3 times per day (Chart 2). There was no clear-cut time when they listened most often, but in general the 4-8 AM and evening (4 PM - midnight) time slots were favored, along with periods of severe weather (Table 2).

Chart 2. Times they listen per day.

4-8 AM 8 AM-noon noon-4 PM 4-8 PM 8 PM - midnight midnight - 4 AM severe
62% 33% 20% 40% 44% 9% 56%

Table 2. Most common listening times (percentage).

(Listeners could indicate more than one listening time, so percentages will not equal 100%)

B. Routine Programming

Overall, the NWS La Crosse's listeners were very pleased with the content and quality of the weather radio broadcasts during "normal" (non-severe) conditions. The program length is kept in the 4-5 minute range as often as possible (the time it takes to go through one entire cycle of programming), and that was thought to be "just right" in 83% of the responses.

The listeners indicated that much of the programming was relevant to them, with the climate summaries and daily river products the least used. The Hazardous Weather Outlook (HWO) was popular, even more so than the hourly conditions. Considering this product was originally intended for area spotters and emergency managers, this response indicated it was an important product to nearly all the listeners, not just a predefined segment of the audience (Table 3). Also of note is the popularity of the short term forecasts (NOW). All the respondents indicated that this product was at least used sometimes, with the majority saying they always used it; not one response indicated that it was rarely, if ever, used.


Products * 5=used always 4=used often 3=used sometimes 2=used rarely 1=never used Average Rating

AWS 43% 27% 21% 7% 2% 4.0
LFP 78% 17% 4% 1% 1% 4.7
HRR 44% 26% 21% 7% 4% 4.0
CLI** 17% 19% 29% 24% 11% 3.1
NOW 62% 21% 16% 0% 0% 4.5
RVA*** 5% 11% 16% 29% 39% 2.1
RNS 30% 38% 17% 10% 4% 3.8
HWO 58% 26% 13% 3% 1% 4.4

Table 3. Usage of the routine products, graded on a 1-to-5 scale.

* a listing of product abbreviations can be found in appendix A.

**KXI-68, KZZ-77, KGG-95, WWG-86/WWG-89 transmitters only broadcast abbreviated climate summaries for the surrounding area. WXJ-86/WNG-56 and WXK-41 broadcast more detailed summaries.

*** Only WXJ-86/WNG-56 and WWG-86/WWG-89 broadcast daily river information.
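The "Average Rating" columns in the tables can be reproduced as a percentage-weighted mean of the 1-to-5 response scale. This is an assumption (the report does not state its formula), and because the published percentages are rounded, a few rows may differ by a tenth of a point; a minimal sketch:

```python
# Sketch of how the "Average Rating" columns appear to be derived:
# a percentage-weighted mean of the 1-to-5 response scale. The exact
# method is an assumption; the report does not state a formula, and
# rounded percentages can shift an occasional row by a tenth of a point.

def average_rating(percentages):
    """percentages: response shares for the scores 5 down to 1."""
    scores = [5, 4, 3, 2, 1]
    return round(sum(s * p for s, p in zip(scores, percentages)) / 100, 1)

# Rows from Table 3:
print(average_rating([43, 27, 21, 7, 2]))   # AWS -> 4.0, matching the table
print(average_rating([78, 17, 4, 1, 1]))    # LFP -> 4.7, matching the table
```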

When examining the quality of the products, the listeners ranked most as very good to excellent. The one product that received the least favorable marks was the extended portion of the forecast. While its rating averaged out into the "good" category, it received many fair-to-poor quality responses (Table 4) and had the fewest "excellent" responses.

Products * 5=excellent 4=very good 3=good 2=fair 1=poor Average Rating
AWS 50% 35% 13% 2% 0% 4.3
CLI** 41% 34% 17% 6% 1% 4.1
RVA*** 38% 20% 26% 11% 5% 3.7
RNS 33% 43% 17% 4% 2% 4.0
LFP 40% 43% 15% 2% 0% 4.2
Extended 22% 32% 31% 10% 5% 3.6
NOW 52% 38% 10% 0% 0% 4.4
HWO 43% 40% 14% 1% 1% 4.2

Table 4. Quality of the routine products, graded on a 1-to-5 scale.

* a listing of product abbreviations can be found in appendix A.

**KXI-68, KZZ-77, KGG-95, WWG-86/WWG-89 transmitters only broadcast abbreviated climate summaries for the surrounding area. WXJ-86/WNG-56 and WXK-41 broadcast more detailed summaries.

*** Only WXJ-86/WNG-56 and WWG-86/WWG-89 broadcast daily river information.

A couple of questions were asked specifically about the short term forecasts (NOWcasts). The listeners indicated that this was a useful product for them, and the more information it could provide, the better.

C. Severe Weather Programming

The responses to the NWS La Crosse's severe weather programming were very similar to those of the routine programming. Most respondents found the program length to be "just right" (86%), and 78% indicated that they were very satisfied with the severe weather programming.

As would be expected from feedback for weather radios, the majority of the listeners indicated that NWR was their main source for severe weather information (75%). The next two were the internet at 10% and local TV at 8%.

The quality of the severe weather broadcasts was also thought to be very good to excellent by the listeners (Table 5), similar to their impressions of the normal programming.

Products * 5=excellent 4=very good 3=good 2=fair 1=poor Average Rating
Warnings 54% 42% 3% 1% 0% 4.5
SVS 41% 54% 4% 1% 0% 4.3
NOW 39% 52% 8% 1% 0% 4.3
RNS 34% 51% 14% 1% 0% 4.2

Table 5. Quality of the severe weather products, graded on a 1-to-5 scale.

* a listing of product abbreviations can be found in appendix A.

D. Computerized (Synthetic) Voices

In a continuing effort to improve the quality of the automated voices used in all NWR broadcasts, the La Crosse office implemented the Voice Improvement Processor (VIP) early in 2002. This replaced the previous automated voice used with CRS with more "realistic", computer-generated female and male voices.

Nearly every respondent (97%) indicated they could understand the automated voices, although some did comment on voice quality. Next, the users indicated which voice they preferred, if any (Chart 3). The "new" (VIP) male voice was preferred, but only by a small margin, with no clear majority among the voices. However, the new voices overall (male and female) were preferred by over half of the respondents. In addition, many comments noted that listeners liked to hear a human voice on occasion, especially during severe weather, and wished the office would continue this human interaction during those events.




Chart 3. Preferred NWR computer generated voice.

(The "New" refers to the voices using the VIP, while the "Old" references the original CRS voice)

E. Other/Miscellaneous Questions

Most of the questions in this section focused on fact gathering: acquiring more information on how NWR is used.


III. Comparison to Previous Feedback

In the fall of 2000, a nearly identical feedback form was completed by area listeners. The hope was that information from this 2002 effort, when compared with feedback from 2000, would show areas of improvement, or indicate where improvement was needed. It could also show changes in listenership, or in listener needs.

In the tables that follow are the results listed previously in the document, along with corresponding results from the 2000 information. The 2002 results are listed first in each column, followed by the 2000 results.



0-1 1-3 3-6 6 or more
2002 2000 2002 2000 2002 2000 2002 2000
Percentage 12% 8% 66% 62% 15% 17% 7% 13%

Table 6. Times they listen per day.


4-8 AM 8 AM-noon noon-4 PM 4-8 PM 8 PM - midnight midnight - 4 AM severe
2002 2000 2002 2000 2002 2000 2002 2000 2002 2000 2002 2000 2002 2000
Percentage 62% 73% 33% 26% 20% 22% 40% 47% 44% 43% 9% 9% 56% 58%

Table 7. Most common listening times.

(Listeners could indicate more than one listening time, so percentages will not equal 100%)



Listening times and how often respondents "tuned in" were virtually the same as in 2000 (Tables 6 and 7). There was a slight drop-off in early morning listening, but a slight gain in the evening.

Routine product usage was nearly the same between 2000 and 2002 (Table 8), with only minor variances. The RNS showed some decline in the "used always" category but gained in "used often", while the RVA improved in "used rarely" but declined in "never used"; in each case, one category's gains appear to come from the other's losses. The overall ratings were about the same between the years.


Products * 5=used always 4=used often 3=used sometimes 2=used rarely 1=never used Average Rating
2002 2000 2002 2000 2002 2000 2002 2000 2002 2000 2002 2000
AWS 43% 45% 27% 24% 21% 23% 7% 7% 2% 1% 4.0 4.1
LFP 78% 79% 17% 17% 4% 3% 1% 1% 1% 0% 4.7 4.7
HRR 44% 37% 26% 26% 21% 23% 7% 12% 4% 2% 4.0 3.8
CLI 17% 16% 19% 19% 29% 34% 24% 23% 11% 8% 3.1 3.1
NOW 62% 61% 21% 22% 16% 12% 0% 5% 0% 0% 4.5 4.4
RVA 5% 8% 11% 8% 16% 12% 29% 45% 39% 27% 2.1 2.3
RNS 30% 39% 38% 21% 17% 22% 10% 8% 4% 10% 3.8 3.7
HWO 58% 58% 26% 21% 13% 18% 3% 2% 1% 1% 4.4 4.3

Table 8. Usage of the routine products, graded on a 1-to-5 scale.

* a listing of product abbreviations can be found in appendix A.

The quality of the routine products showed an increase in the "excellent" rating for nearly all the products (Table 9). Overall, all the products were judged a bit better than they were in 2000, with most average "grades" also slightly higher than 2000. The quality of the extended forecast was indicated as being improved from 2000, but still the least favorable of all the products.

Products * 5=excellent 4=very good 3=good 2=fair 1=poor Average Rating
2002 2000 2002 2000 2002 2000 2002 2000 2002 2000 2002 2000
AWS 50% 42% 35% 46% 13% 8% 2% 3% 0% 1% 4.3 4.2
CLI** 41% 35% 34% 45% 17% 16% 6% 3% 1% 1% 4.1 4.0
RVA*** 38% 29% 20% 37% 26% 24% 11% 6% 5% 4% 3.7 3.8
RNS 33% 40% 43% 39% 17% 12% 4% 4% 2% 5% 4.0 4.1
LFP 40% 31% 43% 44% 15% 23% 2% 2% 0% 0% 4.2 4.0
Extended 22% 13% 32% 32% 31% 31% 10% 18% 5% 6% 3.6 3.3
NOW 52% 40% 38% 45% 10% 11% 0% 3% 0% 1% 4.4 4.2
HWO 43% 49% 40% 31% 14% 15% 1% 5% 1% 0% 4.2 4.2

Table 9. Quality of the routine products, graded on a 1-to-5 scale.

* a listing of product abbreviations can be found in appendix A.

The quality of the severe weather products also remained nearly unchanged from 2000 (Table 10), although there was a shift from the "excellent" to the "very good" category for the SVSs. However, all products improved in the lower 3 categories, with the average "grades" the same as or higher than in 2000.


Products * 5=excellent 4=very good 3=good 2=fair 1=poor Average Rating
2002 2000 2002 2000 2002 2000 2002 2000 2002 2000 2002 2000
Warnings 54% 56% 42% 32% 3% 10% 1% 2% 0% 0% 4.5 4.4
SVS 41% 51% 54% 35% 4% 11% 1% 3% 0% 0% 4.3 4.3
NOW 39% 38% 52% 46% 8% 12% 1% 3% 0% 1% 4.3 4.2
RNS 34% 35% 51% 40% 14% 15% 1% 6% 0% 4% 4.2 4.0

Table 10. Quality of the severe weather products, graded on a 1-to-5 scale.

* a listing of product abbreviations can be found in appendix A.

Overall, the NWS La Crosse's transmitters continued to provide high quality programming, according to those who responded.

IV. Conclusions

The response to the request for feedback was excellent and provided the NWS La Crosse with a large amount of information that can be used to improve and enhance its current broadcasts. Without undertaking such a task, there is no way to properly gauge the quality and content of our broadcasts. Also, using an online, web-based form was extremely helpful in data gathering; the vast majority of the respondents used it. It is recommended that all offices that attempt to gather feedback use an online form.

The diligent work put into improving the sound quality of the automated voice continues to reap rewards, as evidenced by the listeners. The new VIP voices were favored, and the work on the dictionary has improved greatly upon these initial voices. The NWS La Crosse will move forward in 2003 and use both the male and female voices for broadcasts, with the male handling the routine products and the female used for severe weather. We believe this will meet the majority of listeners' preferences, plus give a distinct voice to "inclement and dangerous" weather. With the female voice used exclusively for these weather scenarios, listener response to the broadcasts should be heightened.

According to the listeners, the extended forecasts, even though improved, still need work. Too many listeners felt they were poor, or even useless at times. This was the lowest rated product with respect to quality, and efforts to provide better extended forecasts will continue to be addressed by the NWS-La Crosse forecast staff.

While the Hazardous Weather Outlook (HWO) remained a very popular product, broadcasting it when there were no hazardous weather concerns was judged unnecessary. Many users simply commented, "don't play it if there is no hazardous weather expected". In response, the HWO will now only be broadcast when a threat is listed in the HWO product. When it is in the broadcast, it will continue to be played at specific times, making it an easy-to-find product for the listeners.

While remaining a high quality product, the Severe Weather Statements (SVSs) dropped a bit in listeners' perceived quality. Listeners want more up-to-date information; basically, the more the better. The NWS La Crosse will attempt to meet this need by emphasizing to the staff the benefits of more frequent follow-up statements to severe weather warnings.

In a similar vein, respondents indicated they would like to hear more "live" broadcasts, whether radar updates or information about ongoing severe weather. While they approved of the automated voices, they also wanted some human interaction when possible. The NWS La Crosse will encourage all NWR operators to attempt live updates during severe weather, and to do so across all types of inclement weather.

Listener comments indicated a need for more frequent broadcasts of the near-term (1-3 day) forecast. Since the advent of the 7-day forecast, the broadcast cycle has become longer. Several respondents wished that the first part of the forecast could be played more often, or repeated after the extended forecast was broadcast. The NWS La Crosse will attempt to address these concerns by investigating ways to broadcast the initial part of the forecast more frequently. Options include re-playing the first period after the extended forecast, or separating the 1-3 day forecast from the 4-7 day forecast and broadcasting the 1-3 day portion more often.

V. Acknowledgments

The author would like to thank Randy Breeser, DAPM and Todd Shea, WCM at NWS-La Crosse for their assistance and aid with this project. Also deserving thanks are Lynn Zintz, ASA of NWS-La Crosse, and the rest of the staff who helped in taking feedback form requests and mailing forms out.

 

Appendix A



A listing of product names and their acronyms.



AWS: Area Weather Summary

CLI: Climate Summary

HRR: Hourly Weather Roundup (a roundup of hourly temperatures and weather conditions)

HWO: Hazardous Weather Outlook (a discussion of the severe weather potential for the forecast area)

LFP: Local 2-day forecast, including a 3-7 day extended forecast

NOW: Short Term Forecasts

RNS: Radar summaries (a NWR only product, recorded by the CRS operator when precipitation is in or close to the forecast area)

RVA: Daily river and forecast stages

SVS: Severe weather statements (updates conditions for warnings)

