A COMPARISON OF THE WSR-88D BUILD 9 TVS ALGORITHM AND THE BUILD 10 TDA OVER THE WFO NORTHERN INDIANA COUNTY WARNING AREA DURING THE 1998 CONVECTIVE SEASON

 

Todd E. Holsten, Kevin M. Barjenbruch, Julie L. Adolphson
National Weather Service Office
North Webster, Indiana

 

I. INTRODUCTION

Detecting tornadoes is an important component of the NOAA/National Weather Service's (NWS) mission of warning for severe local storms. In an average year, 800 tornadoes are reported nationwide, resulting in 80 deaths and over 1,500 injuries (NOAA 1992).

During severe weather operations, NWS meteorologists typically review radar products on a scan-by-scan basis as one method for determining whether a tornado warning is necessary. Two of the products available from the Weather Surveillance Radar-1988 Doppler (WSR-88D) are the Tornadic Vortex Signature (TVS) and the Elevated Tornadic Vortex Signature (ETVS) (NOAA 1998). TVSs and ETVSs are displayed in graphical form on the Principal User Processor (PUP) component of the WSR-88D: TVSs as red, filled, inverted triangles, and ETVSs as red, open, inverted triangles. TVS and ETVS information is also displayed in tabular form as part of the Combined Attribute Table (CAT), and additional TVS and ETVS information is available in alphanumeric products.

In January 1999, a new Tornado Detection Algorithm (hereafter referred to as the B10 TDA) (NOAA 1998) was implemented at Weather Forecast Office Northern Indiana (WFO IWX) with WSR-88D Software Build 10.0. The TDA replaced the Tornadic Vortex Signature (TVS) Algorithm (hereafter referred to as the B9 TVS Algorithm), implemented with WSR-88D Software Build 9.0 (NOAA 1996). The TDA was developed by the NOAA/National Severe Storms Laboratory (NSSL) and is designed to detect significant shear regions in the atmosphere. It uses multiple velocity thresholds to locate shear regions and classifies these regions according to altitude and strength, ultimately identifying circulations.

The purpose of this study was to assess the performance of the B10 TDA for various shear strength and depth criteria, as well as that of the B9 TVS Algorithm. This assessment was completed through the use of the WSR-88D Algorithm Testing and Display System, Version 9.0 (WATADS 9.0), developed by the NSSL (WATADS 1999). WATADS processes WSR-88D Level II data through several meteorological algorithms developed by NSSL. The data were analyzed for five tornadic event days, during which 16 tornadoes occurred within or adjacent to the WFO IWX County Warning Area (CWA) (Figure 1) (NOAA 1998).

Figure 1. KIWX County Warning Area in white.



II. B9 TVS ALGORITHM AND B10 TDA DESCRIPTION

The B9 TVS Algorithm is initiated once the mesocyclone algorithm detects a mesocyclone. For each 2-D feature within the mesocyclone, the area is expanded by the amount specified by a search percentage (PCT) and then searched for minimum and maximum velocities, from which a shear value is calculated. If a threshold shear value is reached on at least two elevations within the mesocyclone, a TVS is declared.
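The B9 TVS trigger described above can be sketched in a few lines. This is an illustrative sketch only, not the operational code: the function name and the representation of per-elevation shear values as a simple list are hypothetical.

```python
def b9_tvs_check(elevation_shears, threshold_shear):
    """Sketch of the B9 TVS declaration test.

    elevation_shears: shear value (s^-1) computed for the 2-D feature on
    each elevation scan within the mesocyclone.
    Returns True when the threshold shear is met on at least two
    elevations, the condition the text gives for declaring a TVS.
    """
    hits = sum(1 for s in elevation_shears if s >= threshold_shear)
    return hits >= 2
```

The computation of each shear value from the PCT-expanded search area is omitted; only the two-elevation declaration rule is shown.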

The B10 TDA uses the following three-step process to identify circulations:

A. 1-D pattern vectors are identified on each elevation slice. In the TDA, a pattern vector is a region of gate-to-gate shear, which means the velocity difference is calculated between range bins located on adjacent azimuths at the same range. A minimum shear value is required for a pattern vector to be identified.
 
B. 2-D features are created by combining the 1-D pattern vectors. At least three pattern vectors are needed to declare a 2-D feature. The B10 TDA uses six velocity difference thresholds to identify pattern vectors. This technique allows the algorithm to isolate core circulations which may be embedded within broader regions of azimuthal shear.
 
C. 3-D features are then created by vertically correlating the 2-D circulations identified at each elevation. If a feature contains at least three vertically correlated 2-D circulations, it is declared a 3-D circulation and is identified as a TVS if the 3-D circulation base is at 0.5° or at or below 600 m above radar level, or as an ETVS if the base is above 0.5° or above 600 m. The depth of the 3-D circulation must be at least 1.5 km for a TVS or ETVS to be declared.
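Two pieces of the three-step process above lend themselves to a short sketch: the gate-to-gate pattern vector search of Step A and the TVS/ETVS classification of Step C. The code below is an illustration under stated assumptions, not the NSSL implementation; the function names, the `velocity[azimuth][range_bin]` field layout, and the 11 m/s example threshold are hypothetical.

```python
def find_pattern_vectors(velocity, min_delta_v=11.0):
    """Step A sketch: return (azimuth, range_bin, delta_v) triples where the
    gate-to-gate velocity difference (adjacent azimuths, same range bin)
    meets the minimum threshold. velocity is indexed [azimuth][range_bin]
    in m/s; the example threshold is hypothetical."""
    vectors = []
    n_az = len(velocity)
    n_rng = len(velocity[0])
    for az in range(n_az):
        nxt = (az + 1) % n_az  # azimuths wrap around 360 degrees
        for rng in range(n_rng):
            delta_v = abs(velocity[az][rng] - velocity[nxt][rng])
            if delta_v >= min_delta_v:
                vectors.append((az, rng, delta_v))
    return vectors


def classify_3d_feature(base_elev_deg, base_height_m, depth_km):
    """Step C sketch, per the text: a 3-D circulation at least 1.5 km deep
    is a TVS if its base is at 0.5 deg or at/below 600 m above radar level,
    otherwise an ETVS. Returns None when the depth criterion fails."""
    if depth_km < 1.5:
        return None
    if base_elev_deg <= 0.5 or base_height_m <= 600.0:
        return "TVS"
    return "ETVS"
```

In the operational algorithm the pattern vector search is repeated for each of the six velocity difference thresholds, and the 2-D and 3-D association steps sit between these two functions; those steps are omitted here for brevity.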

Several significant differences exist between the B10 TDA and the B9 TVS Algorithm. First, in the B10 TDA, an algorithm-identified mesocyclone need not exist for a TVS or ETVS to be identified, whereas in the B9 TVS Algorithm a mesocyclone must be detected before the TVS Algorithm is initiated. Second, in the B10 TDA the TVS must be gate-to-gate, as opposed to the B9 TVS Algorithm, where a PCT is used. Finally, 30 adaptable parameters exist within the B10 TDA, compared to only two in the B9 TVS Algorithm. The Operational Support Facility (OSF) has delegated authority for changing some of these values to the Unit Radar Committee (URC) in the form of parameter sets (Table 1).

 

TABLE 1
TVS/TDA Adaptable Parameter Sets

Parameter Set*       Minimum 3-D Feature        Minimum TVS          Minimum 3-D
                     Low-Level Delta            Delta Velocity       Feature Depth
                     Velocity (m/s)             (m/s)                (km)
Default Setting      25                         36                   1.5
Minimized**          56                         74                   5.0
Squall Lines         27                         27                   1.6
Tropical Cyclones    14                         44                   2.0

*Parameter sets under URC level of change authority
**Minimized set emulates the POD and FAR of the Build 9 TVS Algorithm used prior to Build 10

III. METHODOLOGY

The TDA was evaluated through the use of WATADS, which processes, analyzes, and displays WSR-88D Level II data (base reflectivity, base velocity, and spectrum width) (Crum et al. 1993). TVS and ETVS information is displayed in WATADS in tabular form.

The NWSO IWX (KIWX) WSR-88D Archive Level II data were processed and analyzed to evaluate the TDA. For all of the cases, Volume Coverage Pattern (VCP) 11, which completes 14 elevation scans in five minutes, was employed (Crum et al. 1993; Klazura and Imy 1993).

Verification data were obtained solely from Storm Data (NOAA 1998). Of the 16 tornadoes examined, one was classified as an F2, eight as F1, and seven as F0. Ground truth reports were checked for spatial and temporal errors using WSR-88D Level IV data displayed on the PUP. When temporal adjustments were made, the time of the storm report was adjusted based on the radar location of the severe storm and the time of the elevation scan. All ground truth reports were valid with respect to location.

IV. EVALUATION PROCEDURES

The procedure for relating TDA/TVS cell output to tornado reports was as follows:

A. KIWX WSR-88D archive level II and IV data were processed from just prior to the convective activity entering into the IWX County Warning Area (CWA) until just after the convective activity departed the IWX CWA for each of the five tornadic events.
B. STORM DATA tornado reports were adjusted temporally, if needed, to match the storm cells observed in the radar data. No adjustments to location were necessary.
C. Manual scoring was then completed using STORM DATA and TDA/TVS algorithm outputs in conjunction with SCIT algorithm output. The scoring was completed as follows:
  1. Four different adaptable parameter sets were used in this study including the following:

    a. The Build 9 TVS Algorithm, with a modified Tornado Threshold Shear (TTS) of 35 s-1 and a modified Threshold Pattern Vector (TPV) of six, to match the Build 9 TVS parameter set in place during the 1998 convective season.

    b. The Build 10 TDA, with the Default, Squall Line, and Tropical adaptable parameter settings (Table 1).
  2. STORM DATA tornado reports were then correlated in space and time and adjusted if needed to correspond with base reflectivity and velocity data. Once the true times and locations of tornadoes had been established, azimuths and ranges from the radar were annotated for use with volume scan times in Step 3 below.
  3. Tornado time windows were then calculated according to OSF TDA scoring guidelines (Lee 1998) using the following procedure:

    a. Determine which radar volume scan time defines the beginning of the tornado event time window

    b. Establish the time of the tornado from STORM DATA damage reports

    c. Subtract 20 minutes from the start of the tornado

    d. Find the volume scan that contains the time established in Step c., above (Tornado window beginning time)

    e. Determine which volume scan time defines the end of the tornado event time window

    f. Select the next volume scan time (Tornado window ending time)

  4. Each volume scan identified in the Tornado time windows was then checked against the velocity data. If range folding or velocity dealiasing errors affected the analysis of the storm cell at those ranges, that volume scan time was eliminated from the Tornado time window so as not to penalize the algorithm for not finding a circulation.
  5. Radar volume scans that fell within the Tornado time windows were then tabulated. Tornado times and algorithm detection azimuths/ranges corresponding to the volume scans were then used to determine hits, misses, and false alarms.
  6. Correct non-occurrences were then determined by the total number of storm cells identified by the Storm Cell Identification and Tracking (SCIT) algorithm during the entire data run minus the storm cells identified by the algorithm as containing tornadic circulations.
  7. Duplicate tornado detections associated with the same tornadic or non-tornadic cell were thrown out so as not to penalize the algorithm for over-warning the same cell.
  8. Finally, the algorithm detections were scored using the following performance statistics where:

A=sum of algorithm detections in all tornado time windows (hits)
B=sum of volume scan times in all tornado time windows with null entries (misses)
C=sum of algorithm detections not associated with tornado time windows (false alarms)
D=number of storm cells with no accompanying algorithm detection

Probability Of Detection (POD) = A/(A+B)
False Alarm Rate (FAR) = C/(A+C)
Miss Rate (MR) = B/(A+B)
Critical Success Index (CSI) = A/(A+B+C)
Skill Score (S) = [2(AD+BC)]/[(A+C)(C+D)+(A+B)(B+D)]
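The five statistics can be computed directly from the contingency counts A, B, C, and D defined above. The sketch below implements the formulas exactly as stated in the text (including the skill score as printed); the function name is hypothetical, and the counts used in the usage example are invented for illustration, not taken from this study.

```python
def performance_scores(A, B, C, D):
    """Compute the Section IV performance statistics from contingency
    counts: A hits, B misses, C false alarms, D correct non-occurrences.
    Assumes the denominators are nonzero."""
    pod = A / (A + B)                 # Probability Of Detection
    far = C / (A + C)                 # False Alarm Rate
    mr = B / (A + B)                  # Miss Rate
    csi = A / (A + B + C)             # Critical Success Index
    # Skill score as printed in the text
    s = 2 * (A * D + B * C) / ((A + C) * (C + D) + (A + B) * (B + D))
    return {"POD": pod, "FAR": far, "MR": mr, "CSI": csi, "S": s}

# Hypothetical example counts, for illustration only:
scores = performance_scores(A=5, B=15, C=5, D=75)
```

Note that POD and MR always sum to one, since MR = B/(A+B) = 1 - POD; the tables in Section V reflect this.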

V. PERFORMANCE RESULTS

Using the evaluation procedures given in Section IV, performance results were generated for the B9 TVS Algorithm and the B10 TDA. Given that the results below are based on just five tornadic event days during a single convective season, the small sample size should be taken into account when considering the results of this study.

Table 2 contains the individual tornadic event days and composite POD, FAR, MR, CSI, and S scores for the B9 TVS Algorithm and the B10 TDA with the Default, Squall Line and Tropical adaptable parameter sets.

Figure 2 illustrates the PODs for the four parameter sets. The B10 TDA Tropical yielded the highest POD, 0.231, with the B10 TDA Squall Line and B10 TDA Default producing the second and third highest PODs, respectively. The B9 TVS Algorithm produced the lowest POD, just 0.104.

An examination of Figure 3 indicates that the B10 TDA Default produced the lowest FAR, 0.433, followed closely by the B10 TDA Squall Line. The highest FAR, 0.600, was produced by the B9 TVS Algorithm.

 

TABLE 2
OVERALL EVENT SCORES

Parameter Set          POD       FAR       MR        CSI       S
                       A/(A+B)   C/(A+C)   B/(A+B)   A/(A+B+C) [2(AD+BC)]/[(A+C)(C+D)+(A+B)(B+D)]

Build 9 TVS
  03 May 1998          0.000     0.000     1.000     0.000     0.000
  11 June 1998         0.020     0.500     0.980     0.020     0.039
  04 July 1998         0.000     1.000     1.000     0.000     0.002
  19 July 1998         0.000     1.000     1.000     0.000     0.006
  25 August 1998       0.500     0.500     0.500     0.333     0.500
  Overall              0.104     0.600     0.896     0.071     0.109

Build 10 Default
  03 May 1998          0.000     0.000     1.000     0.000     0.000
  11 June 1998         0.000     0.000     1.000     0.000     0.000
  04 July 1998         0.167     0.500     0.833     0.143     0.250
  19 July 1998         0.000     1.000     1.000     0.000     0.003
  25 August 1998       0.462     0.667     0.538     0.240     0.388
  Overall              0.126     0.433     0.874     0.077     0.128

Build 10 Squall Line
  03 May 1998          0.000     0.000     1.000     0.000     0.000
  11 June 1998         0.082     0.000     0.918     0.082     0.149
  04 July 1998         0.167     0.667     0.833     0.125     0.224
  19 July 1998         0.000     1.000     1.000     0.000     0.002
  25 August 1998       0.615     0.600     0.385     0.320     0.484
  Overall              0.173     0.453     0.827     0.105     0.172

Build 10 Tropical
  03 May 1998          0.000     0.000     1.000     0.000     0.000
  11 June 1998         0.140     0.000     0.860     0.140     0.242
  04 July 1998         0.167     0.889     0.833     0.071     0.138
  19 July 1998         0.000     1.000     1.000     0.000     0.007
  25 August 1998       0.846     0.738     0.154     0.250     0.396
  Overall              0.231     0.525     0.769     0.092     0.157

Subtotals              3.396     11.0944   18.6042   1.8712    3.0676
Overall                0.170     0.555     0.930     0.094     0.153

 

Figure 2. Parameter set probability of detection.



Figure 3. Parameter set false alarm ratio.



The MR results are depicted in Figure 4. The B10 TDA Tropical produced the lowest MR, 0.769, while the highest MR, 0.896, was produced by the B9 TVS Algorithm.

Figure 4. Parameter set miss rate.



Figures 5 and 6 depict the CSI and S scores, respectively, indicating the overall performance of each parameter set. Both the CSI and S scores indicate that the B10 TDA Squall Line performed best, yielding a CSI of 0.105 and an S score of 0.172. The B10 TDA Tropical was second best, with CSI and S scores of 0.092 and 0.157, respectively. The CSI and S scores were lowest for the B9 TVS Algorithm.

Figure 5. Parameter set critical success index.



Figure 6. Parameter set skill scores.



 

VI. CONCLUSION

Tornado detection is an integral part of the NWS warning program. Radar meteorologists benefit from TVS detections and the information they provide in real time, used along with ground truth reports from spotters, law enforcement, and others. Together, this information allows for more informed decisions about impacts to downstream counties and, subsequently, the need for additional tornado warnings.

Under the B9 TVS Algorithm, TVS detections were quite rare, and even non-existent in some geographical areas of the country. With the implementation of the B10 TDA, many more detections are likely, giving radar meteorologists important real-time storm information that should aid the warning process for tornado-producing thunderstorms.

This study analyzed five tornadic event days. Although each event was characterized by differing synoptic and mesoscale conditions, the group made for a good test case, with 16 documented tornado occurrences within or adjacent to the NWSO IWX CWA. As seen in Table 2, the B9 TVS Algorithm had the worst overall scores. All B10 adaptable parameter sets, including the Tropical set, showed considerable improvement over the B9 TVS Algorithm. Overall, the B10 Squall Line adaptable parameter set performed better than the other parameter sets used in this study. FAR scores were still quite high, but were much lower than those of the B9 TVS Algorithm.

With the inclusion of additional tornadic event days, and adjustment of some of the other adaptable parameters such as the minimum reflectivity threshold, further improvement on the B10 adaptable parameter set scores noted in Table 2 may be possible. Additional research and cases are needed, but initial results with the new B10 TDA are encouraging.

VII. REFERENCES

Crum, T.D., R.L. Alberty, and D. W. Burgess, 1993: Recording, archiving, and using WSR-88D data. Bull. Amer. Meteor. Soc., 74, 645-653.

DOC, NOAA, National Weather Service, 1992: Tornadoes, Nature's Most Violent Storms. DOC, NOAA, NWS, Washington, D.C., 12 pp.

DOC, NOAA, National Severe Storms Laboratory, 1999: WATADS (WSR-88D Algorithm Testing and Display System) - Reference Guide for Version 10.1. Storm Scale Applications Division, National Severe Storms Laboratory, Norman, Oklahoma.

Klazura, G.E., and D.A. Imy, 1993: A description of the initial set of analysis products available from the NEXRAD WSR-88D system. Bull. Amer. Meteor. Soc., 74, 1293-1311.

Lee, B., 1999: Optimizing the performance of mesocyclone and tornado detection algorithms using WATADS for NWS warning operations (unpublished). NEXRAD Operational Support Facility, Norman, Oklahoma, 13 pp.

National Climatic Data Center, 1998: Storm data and unusual weather phenomena with late reports and corrections. NOAA, NESDIS, National Climatic Data Center, Asheville, NC, 40(5-8).

Operations Training Branch, 1996: Build 9.0 Precursor Training, 9-12. NOAA, Operations Training Branch, Norman, Oklahoma.

Operations Training Branch, 1998: WSR-88D Build 10 Training, 4-30. OSF, Operations Training Branch, Norman, Oklahoma.

 

