
CSU RadarTools Tutorial


Overview


This tutorial will walk through the steps to read and convert a raw radar file, quality control the data (removing unwanted artifacts and echoes, unfolding radial velocity, and calculating Kdp), grid the data, and finally process the gridded data to produce hydrometeor identification, precipitation rates, and DSD information. It is generally focused on surface polarimetric X-, C-, or S-band radar. We will apply a series of corrections (a high-level sketch follows the list):

  • Unfold radial velocity
  • Calculate Kdp
  • Apply thresholds to remove non-meteorological data
  • Calculate the attenuation and differential attenuation
  • Despeckle and remove second-trip echoes
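
As a preview, here is a minimal sketch of how these corrections string together with Py-ART and CSU_RadarTools. The field names ('VEL', 'DBZ', 'PHIDP', 'SQI'), the thresholds, the 250 m gate spacing, and the file name are illustrative assumptions; each step is worked through in detail later in the tutorial.

import numpy as np
import pyart
from CSU_RadarTools.csu_radartools import csu_kdp, csu_misc

# Hypothetical CfRadial file name, for illustration only
radar = pyart.io.read('cfrad_example.nc')

# Unfold radial velocity with Py-ART's region-based dealiasing
# (assumes the Nyquist velocity is stored in the file)
vel_unfolded = pyart.correct.dealias_region_based(radar, vel_field='VEL')
radar.add_field('VEL_UNFOLDED', vel_unfolded, replace_existing=True)

# Calculate Kdp from differential phase; masked gates are filled with
# csu_kdp's assumed bad-data flag (-32768), and gs is the gate spacing in m
dz = np.ma.filled(radar.fields['DBZ']['data'], -32768.0)
dp = np.ma.filled(radar.fields['PHIDP']['data'], -32768.0)
rng2d, _ = np.meshgrid(radar.range['data'], radar.azimuth['data'])
kdp, fdp, sdp = csu_kdp.calc_kdp_bringi(
    dp=dp, dz=dz, rng=rng2d / 1000.0, thsd=12, gs=250.0, window=5)

# Threshold out non-meteorological gates, e.g. on signal quality index
gatefilter = pyart.filters.GateFilter(radar)
gatefilter.exclude_below('SQI', 0.3)

# Despeckle: flag small isolated echo regions (attenuation correction and
# second-trip removal are covered later in the tutorial)
speckle_mask = csu_misc.despeckle(dz, ngates=4)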

Modules available from GitHub


Scientific Background



Step-by-step Instructions


The example file is from the C-band polarimetric SEA-POL radar. The raw data are in Sigmet native format.

Python Imports

 import numpy as np
 from copy import deepcopy
 import os
 import pyart
 # CSU_RadarTools modules: csu_kdp (Kdp calculation) and csu_misc (QC filters)
 from CSU_RadarTools.csu_radartools import csu_kdp
 from CSU_RadarTools.csu_radartools import csu_misc


Convert to CFRADIAL


Convert the data to CfRadial format so that it can be more easily transferred to other programs.
We can do this using RadxConvert from the command line, but we can also do it from Python with the os module.

mydir = '/Users/bdolan/scratch/LROSE/SEA-POL/ppi/'
os.system(f'RadxConvert -f {mydir}SEA20190917_071005 -outdir {mydir}cfrad')


This will put the output in a subdirectory for the date under the 'cfrad' directory.
Now we have a file called:
cfrad.20190917_071006.545_to_20190917_071353.149_SEAPOL_SUR.nc

There are a lot of variables in this file, but the ones of interest are:
DBZ: Reflectivity
ZDR: Differential Reflectivity
UNKNOWN_ID_82: Correlation Coefficient
VEL: Radial velocity
SQI: Signal Quality Index
PHIDP: Differential phase
SNR: Signal to noise ratio
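
Before the QC steps below, read the converted file into a Py-ART Radar object so the fields can be manipulated in Python. This is a minimal sketch, assuming the date subdirectory created by RadxConvert above:

# Path assumes RadxConvert wrote the file to a date subdirectory under 'cfrad'
cffile = (mydir + 'cfrad/20190917/'
          'cfrad.20190917_071006.545_to_20190917_071353.149_SEAPOL_SUR.nc')
radar = pyart.io.read(cffile)

# Confirm which fields were read in
print(radar.fields.keys())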

QC with CSU_RadarTools and Py-ART

Next, do some quality control using Py-ART and CSU_RadarTools.

Special handling for SEA-POL

These steps are peculiarities of the SEA-POL data; your mileage may vary with other radars.

Blanked sector over the wheelhouse

We will look for large azimuthal jumps and mask the rays around them. Take the difference between consecutive azimuths and look for jumps larger than 30°; where a jump occurs, mask the data on either side of it so the blanked sector plots correctly.

az_diff = np.diff(radar.azimuth['data'])
jumps = np.where(np.abs(az_diff) >= 30.0)[0]
if len(jumps) > 0:
    for f in radar.fields.keys():
        for j in jumps:
            # Mask the rays on either side of each azimuthal jump
            radar.fields[f]['data'][j, :] = np.ma.masked
            radar.fields[f]['data'][j + 1, :] = np.ma.masked


The correlation coefficient in this file is incorrect, but we can calculate it from the UNKNOWN_ID_82 field.

Transform UNKNOWN_ID_82 into the correlation coefficient and add it as a new field called CC (again, this is only needed for these data).

ccorr = deepcopy(radar.fields['UNKNOWN_ID_82']['data'])
# Negative values have wrapped around the signed 16-bit range; shift them back
ccorr[ccorr < 0] += 65535
# Rescale the integer counts to approximately 0-1
ccorr = (ccorr - 1) / 65533.0
# Add the result as a new field 'CC', copying metadata from RHOHV
radar.add_field_like('RHOHV', 'CC', ccorr)
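
As a quick sanity check, the reconstructed correlation coefficient can be plotted as a PPI with Py-ART; the sweep number and color limits below are arbitrary choices.

import matplotlib.pyplot as plt

# Quick-look PPI of the new 'CC' field
display = pyart.graph.RadarDisplay(radar)
fig, ax = plt.subplots(figsize=(7, 6))
display.plot_ppi('CC', sweep=0, vmin=0.0, vmax=1.0, ax=ax)
plt.show()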