
The CASA Integrated Project 1 Networked Radar System

FRANCESC JUNYENT AND V. CHANDRASEKAR

Colorado State University, Fort Collins, Colorado

D. MCLAUGHLIN AND E. INSANIC

University of Massachusetts—Amherst, Amherst, Massachusetts

N. BHARADWAJ

Colorado State University, Fort Collins, Colorado

(Manuscript received 13 February 2009, in final form 29 July 2009)

DOI: 10.1175/2009JTECHA1296.1

© 2010 American Meteorological Society

Corresponding author address: Francesc Junyent, Colorado State University, 1373 Campus Delivery, Fort Collins, CO 80523-1373. E-mail: [email protected]

ABSTRACT

This paper describes the Collaborative Adaptive Sensing of the Atmosphere (CASA) Integrated Project 1 (IP1) weather radar network, the first distributed collaborative adaptive sensing test bed of the Engineering Research Center for Collaborative Adaptive Sensing of the Atmosphere. The radar network and radar node hardware and software architectures are described, as well as the different interfaces between the integrated subsystems. The system's operation and radar node control and weather data flow are explained. The key features of the radar nodes are presented, as well as examples of different data products.

1. Introduction

Operational weather radars have traditionally been physically large instruments, transmitting high peak power, single-polarization S- and C-band microwave signals. These long-range, stand-alone units operate individually and almost independently of the weather echo conditions, continuously scanning over large volumes of the troposphere.

When operating over large coverage domains, the broadening of the radar beam at farther ranges results in spatial resolution degradation, whereas fixed-mode operation and scanning patterns limit the temporal resolution at which atmospheric phenomena are sampled. In addition, the earth's curvature and terrain-induced blockage (NRC 2002) can reduce coverage at low altitudes, preventing the observation of low-altitude phenomena such as tornadoes and limiting the accuracy of precipitation estimates near the ground. In the large coverage area of such radar systems, a variety of weather phenomena can coexist at any given time, each with a varying degree of interest to the different user communities (such as weather forecasters, emergency managers, and researchers). To maximize the radar utility (understood as the system's capability to best fulfill each particular user's expectations), the radar should be able to adapt and operate according to the current needs of the users.

Using agile short-range weather radars with overlapping coverage domains, controlled by an automated entity that continuously optimizes the radar network utility, is one way to solve the range-related issues and maximize the value of the radar observations (McLaughlin 2001; Chandrasekar and Jayasumana 2001). With this vision, the National Science Foundation (NSF) established the Engineering Research Center for Collaborative Adaptive Sensing of the Atmosphere (CASA). CASA is a consortium of four universities [Colorado State University, University of Massachusetts (lead university), University of Oklahoma, and University of Puerto Rico at Mayagüez] and a partnership with industry and government laboratories. The common objective of CASA is to change the weather sensing paradigm through distributed collaborative adaptive sensing (DCAS; McLaughlin et al. 2005), improving the coverage of the lowest portion of the atmosphere through coordinated scanning of low-power, short-range networked radars. In Junyent and Chandrasekar (2009), a framework to study the coverage characteristics of such radar networks was developed. The first DCAS demonstration test bed was deployed in Oklahoma in early 2006: a network of four low-power, short-range, dual-polarization Doppler radar units, which we will refer to as CASA Integrated Project 1 (IP1). The IP1 radar network was developed with the system goal of mapping severe weather events in the lowest 3 km of the troposphere with high spatial and temporal resolution. Each radar node is designed to accomplish this system goal through coordinated interaction with the other radars in the network via a real-time, closed-loop software control system.

This paper describes the IP1 radar network and radar node features and operational capabilities. Special emphasis is placed on the system aspects that enable coordinated radar operation and on other features that provide substantial improvements over existing approaches. In particular, the IP1 radar network is able to sample the atmosphere with high spatiotemporal resolution, especially at low altitudes. The dual-polarization capabilities of the network and simultaneous multiple-radar observations of weather phenomena enable the retrieval of enhanced data products, including attenuation-corrected reflectivity, dual-polarization parameters, and vector wind fields. In addition, the modular radar control, data processing, and communications software architecture allows variations in the network topology, centralized and/or distributed network control, and weather information extraction, making the network easy to extend through the addition of potentially heterogeneous radar nodes.

The paper is organized as follows: section 2 describes the IP1 radar network architecture and its layout; section 3 describes the architecture of a single IP1 radar. Section 4 describes the radar operation and key features. Section 5 shows sample data collected from the IP1 radar network.

2. IP1 radar network infrastructure

The IP1 radar network is deployed in southwestern Oklahoma, as shown in Fig. 1. The radar nodes are installed along Interstate 44, southwest of Oklahoma City, Oklahoma, and are located within the coverage area of the KFDR and KTLX Weather Surveillance Radar-1988 Doppler (WSR-88D) radars. The IP1 radar network implementation consists of four polarimetric weather radar nodes, designated with the Federal Communications Commission (FCC) identifiers KSAO, KRSP, KCYR, and KLWE, and a cluster of computers known as the System Operation and Control Center (SOCC) running a suite of network control algorithms known as Meteorological Command and Control (MCC). The SOCC is located in the National Weather Center building in Norman, Oklahoma. In Fig. 1, the IP1 radar network layout is illustrated, with the four radar nodes located in the towns of Chickasha (KSAO), Rush Springs (KRSP), Cyril (KCYR), and Lawton (KLWE), Oklahoma, and each radar node approximately 30 km away from the next unit. Radio links provide Internet connectivity to the radar node sites with a bandwidth of 4 Mbps. Table 1 contains the FCC identifier and coordinates of each radar site, as reported by the global positioning system (GPS) receiver unit with which each radar node is equipped.

As a first approximation, the IP1 radar network can be seen as composed of two triangular network cells, with the radar nodes at a distance of 30 km from each other and each radar node with a maximum range Rmax = 40 km. In the networked radar model introduced by Junyent and Chandrasekar (2009), this network cell type can be described by the number of radars (N = 3) and the overlap factor (M = 4/√3), and it can be imagined as a unit of a larger extended network deployment. Although limited in size, a network with two cells (KLWE–KCYR–KRSP and KCYR–KRSP–KSAO) that share two radars (KCYR and KRSP) and have multiple radar coverage has enough complexity to address the generic problem of targeted coordinated scanning built into the DCAS paradigm.

FIG. 1. IP1 weather radar network layout, from the bottom: KLWE (Lawton), KRSP (Rush Springs), KCYR (Cyril), and KSAO (Chickasha) radar coverage. Coverage circles are 40 km in radius.

The radar node sites, as shown in Fig. 1, are composed of a fenced area containing a short- to medium-height tower, an air conditioning unit, and a connection to the power grid. Other communications, data storage, and computing equipment are housed inside a small cabin inside or next to the fenced area. The tower height is determined by different factors depending on the site, including preexisting infrastructure. At the KLWE and KRSP radar node sites, a 20-ft tower section is sufficient to clear the surrounding trees, whereas at the KCYR radar node site a 50-ft tower is needed for the dedicated point-to-point communication radio links. The KSAO radar node is deployed on a preexisting 50-ft tower clearing the surrounding buildings. Connectivity to the Internet at all radar node sites is provided by OneNet, a CASA partner.

The radar nodes are connected via the Internet to the centralized control site known as the SOCC, a cluster of computers and storage devices, which houses the scan rules and algorithms (Zink et al. 2005) that are responsible for the network's automated operation. The MCC continuously ingests and stores data files received from the radar nodes, detects the relevant weather features in the individual and overlapping radar data, and creates a list of tasks associated with the detected features. The detected features and their associated tasks are then used to generate optimized scan strategies that are fed back into the radar nodes with a 60-s period. This real-time, closed-loop, automated network operation system is illustrated in Fig. 2.
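The heartbeat cycle just described can be summarized in a few lines. The Python below is a conceptual sketch only; the function names (ingest, detect_features, optimize_scans, send_tasks) are hypothetical placeholders for the MCC algorithms documented in Zink et al. (2005).

```python
import time

HEARTBEAT_S = 60  # MCC scan-task period stated in the text

def mcc_heartbeat(ingest, detect_features, optimize_scans, send_tasks):
    """Run one cycle of the closed-loop DCAS operation (conceptual sketch).

    ingest           -- pulls the latest moment files from the radar nodes
    detect_features  -- finds weather features in single/overlapping coverage
    optimize_scans   -- turns features plus user policy into per-radar tasks
    send_tasks       -- pushes the optimized scan commands back to the nodes
    """
    cycle_start = time.monotonic()
    data = ingest()                    # NetCDF moment files from all nodes
    features = detect_features(data)   # e.g., reflectivity cores, rotation
    tasks = optimize_scans(features)   # one scan-task list per radar node
    send_tasks(tasks)
    # Sleep out the remainder of the 60-s system heartbeat.
    time.sleep(max(0.0, HEARTBEAT_S - (time.monotonic() - cycle_start)))
```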

The general radar node architecture is presented in Fig. 3. It is divided into three different sections, according to their physical location. The tower-top subsystems are housed inside an air-conditioned radome, which provides a weather-proofed, temperature-controlled environment for the radar electronics. The radome is composed of a half-sphere section on top of a cylindrical section and is 8 ft in diameter and 8 ft in height.

The tower-top rotating subsystems are those located above the azimuthal axis of motion and include the radar transmitter, receiver, data acquisition system, and elevation motion actuator. The tower-top nonrotating subsystems are housed in an open-frame rack on the radome floor. They include a computer acting as the radar motion controller; a gigabit Ethernet switch; an Ethernet-controlled thermostat; two webcams; and an Ethernet-controlled outlet strip, which is used to remotely and independently power cycle the different subsystems.

At the tower base, there is a second gigabit Ethernet switch connected via optic fiber to the one on the tower top, a computer performing signal processing and communication tasks referred to as the sensing node signal processing computer (SNSPC), and a redundant array of independent disks (RAID), which stores the obtained radar data. A network router and Ethernet radio link provide Internet connectivity.

FIG. 3. Radar node architecture: the tower-top rotating assembly contains the radar antenna, transceiver, data acquisition system, and elevation actuator, all mounted on a frame on top of the azimuth positioner. On the radome floor, the nonrotating subsystems include a gigabit Ethernet switch, Ethernet-controlled outlet strip, GPS amplifier, and position controller computer. A gigabit Ethernet optic fiber links the tower top with the data processing server on the ground. The site is connected to the Internet through a radio link.

The radar node is completely controllable from any remote location with access to the Internet. This is driven by the fact that the deployment towers have limited accessibility, as well as the fact that ultimately the network operation could be controlled by algorithms distributed over the network. Figure 4 shows a flow diagram of the radar node signals and controls.

FIG. 4. Radar node control and weather data flow. User-issued high-level commands are broken down into scanning, transmitter, receiver, and data acquisition settings. The radar node state and configuration are reported back to the user as part of the radar's weather data stream.

Based on a client–server architecture, the radar user, which can be the MCC algorithms running at the SOCC or any other authorized entity connecting to it, can send high-level commands specifying radar parameters and actions. Positioning controls specifying start and stop angles, scan speeds, increment steps, axis of motion, and time of execution can be issued. Arbitrary waveforms can be generated to control the transmitter and receiver settings, and the number of range gates, decimation factor, and passband filter can be set on the data acquisition system. In addition, the down-conversion frequencies in both the analog receiver and the data acquisition system's digital receiver are also controllable. The high-level commands sent by the user are broken down and interpreted by the different subsystems to which they are addressed, and the radar node state is reported back as part of the radar's weather data stream.

FIG. 2. IP1 weather radar network architecture: the radar nodes are connected via the Internet to a central control site, which automatically generates radar control commands based on detected features in the incoming radar data stream.

TABLE 1. IP1 network radar node locations.

Location      FCC identifier   Lat (°)   Lon (°)    Alt (m)
Chickasha     KSAO             35.0312   −97.9567   353.99
Rush Springs  KRSP             34.8129   −97.9313   414.84
Lawton        KLWE             34.6238   −98.2720   377.45
Cyril         KCYR             34.8739   −98.2514   445.30
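Returning to the client-server command flow described above, the sketch below sends one hypothetical high-level sector-scan command to a radar node over a TCP socket. The JSON-over-TCP framing and the field names are assumptions made for this example; the actual IP1 command format is not specified in the text.

```python
import json
import socket

def send_scan_command(host, port, start_az, stop_az, elevation_deg, rate_dps):
    """Send one high-level sector-scan command to a radar node (sketch).

    Node-side servers would break such a message down into positioner,
    transmitter, receiver, and data acquisition settings.
    """
    command = {
        "action": "sector_scan",
        "start_az_deg": start_az,       # scan start angle
        "stop_az_deg": stop_az,         # scan stop angle
        "elevation_deg": elevation_deg,
        "rate_dps": rate_dps,           # scan speed, degrees per second
    }
    with socket.create_connection((host, port)) as sock:
        sock.sendall(json.dumps(command).encode() + b"\n")

# Example: task a (hypothetical) node to sweep a 90-degree sector at 21 deg/s.
# send_scan_command("radar-node.example.edu", 5000, 45.0, 135.0, 2.0, 21.0)
```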

From a maintenance point of view, the IP1 radar network has a three-tier approach. First, the whole system is continuously monitored remotely, verifying that the operating parameters are within expected ranges. Second, a local field technician from OneNet is available on an as-needed basis to work in remote coordination with the IP1 network radar engineer (not at the site) on general site maintenance (e.g., Internet access, buildings, electric utilities, and air conditioning), minor corrective actions, and debugging efforts. Third, should a radar failure happen, it can typically be diagnosed through the above-mentioned remote monitoring tools, allowing the IP1 radar engineer to arrive at the site with the necessary spare parts to resolve the issue. This approach minimizes trips to the radar sites and reduces the amount of time spent in the field diagnosing problems. Scheduled radar hardware preventive maintenance consists mainly of inspection and testing of the mechanical, electrical, and radio frequency (RF) subsystems (roughly once every 3 months) and magnetron replacement (about once a year) with subsequent transmitter calibration, whereas unscheduled maintenance has to do with failed computer parts, microwave components, cabling, and interconnections, as well as issues arising from Internet network access and/or main power outages. Although the IP1 radar network is an evolving research platform undergoing changes and modifications in its configuration and operation throughout its deployment, this maintenance procedure together with stocked spare parts has ensured radar availability upward of 98%.

3. IP1 radar architecture

a. IP1 radar hardware

The IP1 radar, shown in Fig. 5, integrates transmitter, receiver, and data acquisition subsystems in a single assembly mounted directly behind the antenna. The transmitter employs a magnetron that has some limited agility in duty cycle and supported waveforms. The transmitter delivers a peak power of 25 kW at a maximum duty cycle of 0.1%. The maximum pulse length is 1 μs, which yields a maximum average power of 25 W at the maximum pulse repetition frequency (PRF) of 1 kHz. That would set the system's unambiguous Doppler velocity to ±7.5 m s⁻¹. This is clearly insufficient for severe weather applications, where velocities in excess of ±25 m s⁻¹ are often encountered. To increase the system's unambiguous Doppler velocity and maintain the ability to efficiently both suppress ground clutter echoes and perform spectral processing on the received weather echoes, blocks of pulses at two higher PRFs (typically 1.6 and 2.4 kHz) are transmitted. To do so, the transmitter peak power is reduced from its maximum value, which allows an increase in duty cycle and PRF. Although the adopted dual-PRF waveform solution is more demanding on the system hardware, it provides better performance than a comparable staggered-PRT waveform solution in terms of combined unambiguous Doppler velocity, second-trip echo suppression, and ground clutter echo filtering, which are capabilities of core importance in an X-band, short-range, low-scanning radar. In addition to the dual-PRF operation at 1.6 and 2.4 kHz, the system admits operation at other dual-PRF values and at a single PRF. A detailed description of the waveforms' design in the IP1 system can be found in Bharadwaj and Chandrasekar (2005, 2006).
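The benefit of the 1.6/2.4-kHz pulse pair can be checked with the textbook Nyquist velocity relations; a minimal sketch (these formulas are standard dual-PRF results, not reproduced from the paper):

```python
C = 3.0e8            # speed of light, m/s
FREQ_HZ = 9.41e9     # IP1 nominal transmit frequency (Table 2)
WAVELENGTH = C / FREQ_HZ

def nyquist_velocity(prf_hz):
    """Single-PRF unambiguous Doppler velocity: va = lambda * PRF / 4."""
    return WAVELENGTH * prf_hz / 4.0

def dual_prf_nyquist(prf1_hz, prf2_hz):
    """Extended unambiguous velocity of a dual-PRF pair:
    va = lambda / (4 * |1/prf1 - 1/prf2|)."""
    return WAVELENGTH / (4.0 * abs(1.0 / prf1_hz - 1.0 / prf2_hz))

print(nyquist_velocity(1.0e3))         # ~8 m/s at a 1-kHz PRF: insufficient
print(nyquist_velocity(1.6e3))         # ~12.7 m/s
print(nyquist_velocity(2.4e3))         # ~19.1 m/s
print(dual_prf_nyquist(1.6e3, 2.4e3))  # ~38 m/s combined, well above 25 m/s
```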

The antenna is a center-fed reflector supporting dual polarization, mounted on an agile azimuth pedestal capable of very high scan rates, up to 120° s⁻¹. The antenna motion in elevation is limited to 35°, because the system is intended to scan at low elevations. A linear actuator is employed, which can achieve a maximum speed of 30° s⁻¹. During radar operation, the measurement scans are taken at a speed of 21° s⁻¹, with an antenna repositioning speed of 60° s⁻¹ between measurement scans. These scan rates enable this mechanically steered system to emulate aspects of the beam steering that would be more typical of electronically scanned antennas.

The dual-channel analog receiver is composed of a connectorized front end followed by a mixed-signal surface-mount circuit board that integrates two down-converting stages and supports advanced features such as digital frequency and phase control and phase-matched automatic gain control (AGC). It is a relatively wideband receiver (approximately 60-MHz bandwidth) when compared with the radar signal's nominal bandwidth of 1.5 MHz. This is justified by the need to accommodate both the output frequency variability of the particular magnetron used and the tolerance in the nominal output frequency specification when a used magnetron is replaced by a new one. A remotely programmable direct digital synthesizer (DDS), together with an automatic frequency control (AFC) software loop, allows the system to track the transmitted frequency and to digitally control the analog receiver's final-stage down-conversion frequency. The DDS also allows the phase difference between the two (H and V) receiving channels to be set to any arbitrary value.

The inclusion of a noise source and dedicated signal paths enables receiver gain and transmitter power calibration, as well as accurate sampling of the transmitted pulse, which is used to determine its amplitude, phase, and frequency. The calibration signal path meets the antenna path at a high-isolation, low-loss RF switch placed in front of the low noise amplifier (LNA). When the magnetron fires, the switch selects the sampled signal from the transmit pulse and then switches over to the antenna path, creating a single signal containing both the transmit pulse sample and the weather echo.

The receiver output is fed into a high-speed, reconfigurable data acquisition and processing system developed and manufactured by Dynamic Sensing Technologies for this particular application. The data acquisition system is a stand-alone unit based on field programmable gate array (FPGA) technology. The data acquisition system performs sampling, digital down-conversion, and filtering and generates the low-level radar hardware control signals. This allows the synchronization of the sampling, transmitter triggering, and receiver control signals, which permits subrange gate processing. Figure 6 shows a block diagram of the transmitter, receiver, and data acquisition system assembly.

FIG. 5. IP1 radar node: the radar transmitter and receiver (aluminum box) and data acquisition (black coated box) are mounted behind the antenna, the whole assembly being moved in elevation by a linear actuator. The open-frame rack on the radome floor contains the position controller computer, a gigabit Ethernet switch, and an outlet strip. The signal lines from the rack are made available to the transceiver through the azimuth positioner's slip ring assembly.

FIG. 6. IP1 radar node transmitter and receiver system block and functional diagram: dashed arrows pointing to blocks indicate programmable operation parameters.

The data acquisition system is a single-board unit integrating a high-speed (100 MSps), dual-channel, 14-bit analog-to-digital (A/D) front end, a data processing core built on a high-performance FPGA, a real-time data management and transport core, and a microcontroller-based embedded Linux system. The data acquisition system control and data output interfaces are Ethernet based. In its current configuration, the data processing core implements real-time digital in-phase (I) and quadrature (Q) demodulation, followed by multistage band limiting and decimation. It employs a programmable numerically controlled oscillator (NCO), dynamically tuned by the AFC software loop, to down-convert the incoming intermediate frequency (IF) signals.

A miniature GPS receiver is integrated in the data acquisition system housing. The main purpose of the GPS receiver is to accurately time stamp the received weather echoes, as well as to provide a common but independent reference clock for all the IP1 radar nodes. This common clock allows the synchronization of processes at different radar nodes without relying on a network connection, and it can be used to lock the transmitter and data acquisition triggering between different radars. Independently of the GPS-derived time used to tag the radar data, the radar node computers use the network time protocol (NTP) to maintain their time. The nominal characteristics of the IP1 radar nodes are summarized in Table 2.

b. IP1 radar software

The IP1 radar hardware–software interface and ar-

chitecture is presented in Fig. 7. A first layer, labeled as

hardware, contains the different radar subsystems on the

tower top. The software layer contains the software

processes interfacing to the hardware layer, also running

on the tower top, and all other client processes in-

terfacing to the previous set either pulling radar data or

pushing commands to control the radar operation. The

local–remote software interface is realized through trans-

mission control protocol (TCP) and user datagram pro-

tocol (UDP) sockets, which allows the remote software

to be anywhere with Ethernet connectivity. In the cur-

rent configuration, most of the remote clients are run-

ning a computer at the tower base.

The data acquisition control server, residing in the data acquisition board, accepts commands to configure the data acquisition operation, such as loading and unloading configurations for the FPGAs, enabling and disabling the data acquisition and demodulation process, setting the NCO frequency, and setting the number of range gates. The data acquisition control server also accepts commands to configure the signals that control the radar transceiver state, such as (i) transmit pulse length and triggering waveform, (ii) high-voltage modulator enable and disable, (iii) receiver gain settings, (iv) transmit pulse sampling, and (v) internal noise source enabling. The GPS data server, also in the data acquisition board, gathers time and location information from an onboard miniature GPS receiver.

The demodulated radar data packets and GPS data packets are sent over UDP sockets through the slip rings in the azimuth positioner to the time series data server. The time series data server gathers (i) the demodulated radar data packets, (ii) the GPS data packets, (iii) the current antenna position data packets from the position control server, and (iv) the current down-conversion frequency values and estimated transmitted frequency data packets from the automatic frequency control server. Together, these yield a digital signal data structure containing a radar state header followed by the demodulated complex echo voltages at horizontal and vertical polarizations. The digital signal data server allows straightforward dissemination of the digitized time series data using TCP sockets.
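As a rough illustration of what such a digital signal data structure might carry, the sketch below groups the header quantities named above with the H and V complex voltages. The field names are illustrative assumptions; the actual IP1 packet layout is not given in the text.

```python
from dataclasses import dataclass

import numpy as np

@dataclass
class DigitalSignalRay:
    """One radar-state header plus demodulated echo voltages (sketch)."""
    gps_time_s: float       # network-wide GPS time stamp
    clock_time_ns: int      # node-local reference-clock time stamp
    azimuth_deg: float      # from the position control server
    elevation_deg: float
    dds_freq_hz: float      # analog receiver down-conversion frequency
    nco_freq_hz: float      # digital receiver down-conversion frequency
    tx_freq_hz: float       # AFC-estimated transmitted frequency
    volts_h: np.ndarray     # complex I/Q gate samples, horizontal channel
    volts_v: np.ndarray     # complex I/Q gate samples, vertical channel
```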

A number of different clients can connect to the current radar node and network configuration. The digital signal view client creates a real-time, A-scope-type display used mainly for monitoring and debugging purposes. The digital signal save client allows sending the digital signal data to a high-capacity RAID, from which the data can be retrieved for offline postprocessing and algorithm testing. The weather Doppler spectral moments are currently calculated locally at the radar node: the spectral moment data server connects to the digital signal data stream from the digital signal data server; after running algorithms for spectral moment calculation, ambiguity mitigation, clutter filtering, and signal attenuation correction (Liu and Bringi 2006), the output is streamed to the network common data form (NetCDF) file server, where it is packaged into a NetCDF file. These NetCDF radar measurement files are sent via the Internet to the SOCC, where the suite of meteorological feature extraction algorithms, together with user policy–driven scan optimization, is employed to automatically configure the radar network's operation. In addition to this mode of operation, a separate position control client allows for remote control of the radar scanning.
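For illustration, a minimal sketch of packaging one sweep of spectral moments into a NetCDF file with the netCDF4 Python library is shown below. The dimension and variable names are assumptions; the real IP1 file layout is not reproduced here.

```python
import numpy as np
from netCDF4 import Dataset

def write_moment_file(path, azimuths, ranges, reflectivity, velocity):
    """Package one sweep of spectral moments into a NetCDF file (sketch)."""
    with Dataset(path, "w") as nc:
        nc.createDimension("ray", len(azimuths))
        nc.createDimension("gate", len(ranges))
        az = nc.createVariable("azimuth", "f4", ("ray",))
        rg = nc.createVariable("range", "f4", ("gate",))
        dbz = nc.createVariable("reflectivity", "f4", ("ray", "gate"))
        vel = nc.createVariable("velocity", "f4", ("ray", "gate"))
        az.units, rg.units = "degrees", "meters"
        dbz.units, vel.units = "dBZ", "m/s"
        az[:], rg[:] = azimuths, ranges
        dbz[:], vel[:] = reflectivity, velocity

# Example with one synthetic 360-ray, 400-gate sweep:
# write_moment_file("sweep.nc", np.arange(360.0), np.arange(400.0) * 100.0,
#                   np.zeros((360, 400)), np.zeros((360, 400)))
```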

The modular radar control, data processing, and communications architecture admits variations in the network topology, effectively allowing any combination of multilayer centralized and/or distributed network control and weather information extraction. This flexibility accommodates the extension of the network through the simple addition of further radar nodes. It also makes the hardware transparent to the network, making it possible to compose a system consisting of heterogeneous sensors.

4. IP1 radar operation and important features

The transmitter, receiver, and data acquisition system functionality is illustrated in Fig. 6, which also shows the software-controlled operation parameters of each subsystem. Once the transmitter waveform is set and the data acquisition process and high-voltage modulator are enabled, the magnetron's trigger signal is released. A switch at the receiver's front end allows sampling of the transmitted pulse, as illustrated in Fig. 8; a sample of the high-power transmitted pulse is coupled through a dedicated high-isolation path, preventing the distorted leakage through the receiver protection limiter from contaminating the transmitted pulse sample signal. This preserves the transmitted pulse sample integrity, because it is used to estimate the transmitted power, frequency, and starting phase.

FIG. 8. Transmitted pulse sampling: a sample of the transmitted pulse is coupled through a dedicated signal loop into the receiver (gray arrow) prior to switching over to the antenna port (black arrow).

TABLE 2. IP1 radar node design characteristics.

Transmitter
  Type                                   Magnetron
  Center frequency                       9410 ± 30 MHz
  Peak power output                      8.0 kW (per channel)
  Avg power output                       12 W (per channel)
  Pulse width                            660–1000 ns
  Polarization                           Dual linear, H and V
  Max duty cycle                         0.16%

Antenna and pedestal
  Type (diameter)                        Dual-polarized parabolic reflector (1.2 m)
  3-dB beamwidth                         1.80°
  Gain                                   38.0 dB
  Azimuth scan rate                      Up to 240° s⁻¹
  Elevation scan rate                    Up to 30° s⁻¹
  Acceleration                           Up to 120° s⁻²

Receiver
  Type                                   Parallel, dual-channel, linear output I/Q
  Dynamic range (bandwidth = 1.5 MHz)    103 dB
  Noise figure                           5.5 dB

Data acquisition system
  Sampling rate                          100 MSps
  Dynamic range (bandwidth = 1.5 MHz)    108 dB
  Data transfer rate                     88.3 Mbps
  Decimation factor                      Adjustable
  Video bandwidth                        Adjustable

The resulting signal is down-converted from X band to a DDS-programmable IF, and the IF signal is then sampled by the data acquisition system's A/D front end. Prior to digitally down-converting the digitized IF signal, its actual carrier frequency must be obtained; a magnetron is a cavity-based, noncoherent signal source, and therefore its output frequency will change depending on the particular operation settings, whereas each transmitted pulse will have a random start phase. These effects are estimated and used to correct the received echo signals to ensure the proper operation of the radar. The estimated frequency value is used to set the NCO down-conversion frequency.

a. Automatic frequency control

A magnetron's frequency stability is dependent on factors such as operating temperature, pulsing duty cycle, and modulator voltage stability. Figure 9 shows measurement results of the IP1 magnetron output frequency dependence on temperature and duty cycle, illustrating some clear trends. Figure 9a shows both the duty cycle influence on the magnetron's output frequency at a given temperature and the effect of temperature change on the output frequency at a given duty cycle. The magnetron operating duty cycle offsets the output frequency from its nominal value by a fixed amount, which is found to be about −550 kHz per 0.025% duty cycle increment. The output frequency decreases linearly as the magnetron operating temperature increases, showing a mean slope of −166 kHz °C⁻¹. Figure 9b shows the effect of changing between alternating PRFs (relevant to dual-PRF operation of the radar). A frequency excursion of around 200 kHz is observed when alternating bursts of pulses at 1.6- and 2.4-kHz PRF, in which the output frequency decreases when the PRF is increased. This agrees with the trend observed in Fig. 9a, because a PRF increase creates a duty cycle increase, which finally translates into a lower output frequency. To keep the receiver tuned to the changing transmitted frequency, a digital AFC algorithm is implemented. The IP1 radar node AFC algorithm is a real-time software loop that uses a digitized sample of the transmitted pulse to estimate the magnetron's output frequency and sets the receiver DDS and data acquisition digital receiver NCO to the appropriate down-conversion frequencies. This fully digital solution provides nearly arbitrary tuning resolution and a very fast settling time. The estimated AFC frequency and the employed NCO and DDS frequencies are reported in the radar data stream.

FIG. 9. IP1 radar node magnetron output frequency drift measurements: (a) a laboratory measurement of the magnetron output frequency as a function of anode temperature; the top line (squares) is 0.050% duty cycle, the middle (diamonds) is 0.075%, and the bottom (triangles) is 0.100%. (b) The transmitter frequency drift vs PRF: the higher output frequencies correspond to a burst of pulses at 1.6-kHz PRF, whereas the lower-frequency values correspond to a burst of pulses at 2.4-kHz PRF.

The final digital receiver bandwidth, which is programmable, is set to a value slightly wider than that of the transmitted pulse bandwidth, allowing for some tolerance in the frequency estimation. The frequency estimation algorithm is based on the following: assuming that the transmitted waveform is described by

\[ s(t) = A\cos(2\pi f_o t + \theta), \qquad t \in [0, \Delta t], \tag{1} \]

the NCO is set to the frequency f_NCO,

\[ f_{\mathrm{NCO}} = \frac{1}{N}\sum_{i=1}^{N} f_i, \tag{2} \]

where f_i is the frequency that maximizes the expression

\[ f_i = \arg\max_{f}\left[\,\left|\int_0^{\Delta t} w(t)\,s(t)\,e^{j2\pi f t}\,dt\right|^2\right], \tag{3} \]

where w(t) is the windowing function

\[ w(t) = \frac{1}{2}\left[1 - \cos\!\left(2\pi\frac{t}{\Delta t}\right)\right], \tag{4} \]

Δt is the transmitted pulse duration, and N is the number of transmitted pulses averaged. The frequency search is initialized by doing a fast Fourier transform and selecting the highest-power coefficient. The frequency is further refined by computing Eq. (3) at lower and higher frequency values, at a frequency interval slightly larger than half of the previous frequency resolution interval. The frequency value that maximizes Eq. (3) is then selected, and the frequency search is narrowed by iterating the process.
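A minimal Python sketch of this estimator, assuming a real-valued sampled pulse and following Eqs. (1)-(4), is shown below; the iteration count and step shrink factor are illustrative choices, not the deployed IP1 values.

```python
import numpy as np

def estimate_tx_frequency(pulse, fs, iterations=6):
    """Estimate the carrier frequency of one sampled transmitted pulse.

    Hann-windows the pulse, seeds the search with the strongest FFT
    coefficient, then iteratively refines around the maximum of
    |sum(w * s * exp(j*2*pi*f*t))|^2, as described in the text.
    """
    n = len(pulse)
    t = np.arange(n) / fs
    w = 0.5 * (1.0 - np.cos(2.0 * np.pi * np.arange(n) / n))  # Eq. (4)
    s = w * np.asarray(pulse)

    def power(f):  # objective of Eq. (3)
        return np.abs(np.sum(s * np.exp(2j * np.pi * f * t))) ** 2

    spec = np.abs(np.fft.rfft(s)) ** 2    # initialize with the FFT
    step = fs / n                         # initial frequency resolution
    f_hat = float(np.argmax(spec)) * step
    for _ in range(iterations):
        # Evaluate slightly beyond half the previous resolution interval
        # on each side of the current estimate, keep the maximum, narrow.
        step *= 0.55
        f_hat = max((f_hat - step, f_hat, f_hat + step), key=power)
    return f_hat
```

Averaging the per-pulse estimates over N pulses then gives the NCO setting of Eq. (2).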

The AFC estimation is done continuously, but the actual NCO down-conversion frequency is changed only if the estimated frequency value departs by more than a given tolerance interval from its current value, as illustrated in Fig. 10. The down-conversion frequency tolerance factor is set so that the radar signal always falls inside the final digital receiver bandwidth. The down-conversion residual frequency error introduces an extra phase offset, which is naturally canceled in all the relative phase measurements, such as Doppler velocity. This ensures operational stability and prevents excessive retuning resulting from estimation errors.

FIG. 10. AFC-estimated transmitted frequency (solid line) and down-converting frequency (DDS and NCO; dashed line) for a dual-PRF operation example.

b. Doppler spectral moment estimation

Parallel to the AFC process, the digitized data are down-converted, decimated, and filtered. Then, the transmitted pulse phase is measured and subtracted from the received echo signal. This standard technique, known as coherent on receive, enables phase-based measurements such as Doppler velocity from noncoherent-source echoes. The transmitted pulse start timing is set so that, after decimation, the pulse peak sample is preserved. This ensures the highest SNR at this sample, from which the transmitted pulse phase is obtained.
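A minimal sketch of the coherent-on-receive correction for one pulse follows; it simply removes the phase measured at the transmitted pulse peak sample from the received gates.

```python
import numpy as np

def coherent_on_receive(tx_peak_sample, echo_iq):
    """Remove the magnetron's random start phase from one pulse's echoes.

    tx_peak_sample -- complex sample at the transmitted pulse peak, whose
                      phase is taken as the pulse starting phase
    echo_iq        -- complex received gate samples from the same pulse
    """
    tx_phase = np.angle(tx_peak_sample)
    return np.asarray(echo_iq) * np.exp(-1j * tx_phase)
```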

After coherent-on-receive correction of the digital signal, the moments of the Doppler spectrum are computed. A detailed description of the employed algorithms can be found in Bharadwaj and Chandrasekar (2005, 2006). From an operational standpoint, the implementation of the spectral moment calculation algorithms admits real-time variations of the transmitted waveforms (from single to dual PRF and different PRF frequencies and integration times), allowing switching between operation modes without interruption in the data flow or manual intervention. Similarly, some signal processing blocks, such as the ground clutter filter, can be turned on and off via software (using the spectral moment control client) without interfering with the normal operation of the radar, which allows, for example, interleaving of the ground clutter scans needed for refractivity measurements with the normal operation of the radar.

The metadata necessary to track the state of the radar node during operation are included in the data stream. For the Doppler spectrum moment calculation algorithms to work properly, the changes in the transmitted waveform (going from single to dual PRF, separating first and second PRF blocks in dual PRF, changing the number of pulses and PRF frequency values, etc.) have to be detected. This is accomplished by reading the receiver's reference clock–derived time stamp with which every data ray is tagged. This time stamp allows the Doppler spectrum moment calculation algorithms to estimate interpulse periodicities with a resolution of 10 ns, permitting the separation and processing of pulse blocks according to their estimated PRF. In addition, to provide a "wall" clock common to all radar nodes, a miniature GPS receiver is employed at each radar node. The data acquisition system's waveform generation software can accept interrupts coming from the GPS clock, to which the radar node transmitter, receiver, and data acquisition control waveforms can be locked. In addition to the reference clock–derived time stamp (unique to each radar node), a GPS-based time stamp (common to all radar nodes in the network) is added to the radar data stream.
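A simplified sketch of the waveform-change detection implied above: read the reference clock time stamps, form interpulse periods, and split the rays into constant-PRF blocks. The 10-ns tolerance mirrors the stated clock resolution; the grouping logic itself is an assumption of the example.

```python
import numpy as np

def split_prf_blocks(timestamps_ns, tol_ns=10):
    """Split consecutive rays into constant-PRF blocks from time stamps.

    Returns (start_ray, end_ray, prf_hz) tuples.
    """
    # Interpulse periods between consecutive time stamps, in ns.
    ipp = np.diff(np.asarray(timestamps_ns, dtype=np.int64))
    blocks, start = [], 0
    for i in range(1, len(ipp)):
        if abs(int(ipp[i]) - int(ipp[start])) > tol_ns:  # PRF changed
            blocks.append((start, i, 1.0e9 / float(ipp[start])))
            start = i
    blocks.append((start, len(ipp), 1.0e9 / float(ipp[start])))
    return blocks
```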

Once the data moments are computed, the appropriate radar constant has to be added to the estimated signal power to obtain a reflectivity value. To accommodate variations in the magnetron's output frequency, the analog receiver's bandwidth is maintained several times wider than that of the transmitted pulse. Therefore, there is a certain ripple and slope associated with the "broadband" analog receiver's gain, which will depend on the particular frequency at which it is evaluated. To take this effect into account, a tool in the IP1 radar nodes allows a broadband calibration of the receiver to be obtained remotely. A known signal from an onboard noise source is injected into the receiver, and the output is recorded while the DDS and NCO are swept in frequency. Because the particular DDS and NCO values employed in down-converting the radar data are available in the data stream, the appropriate receiver calibration values corresponding to the current frequency settings are dynamically selected and employed in the radar constant determination.
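A small sketch of this dynamic selection, together with the standard radar-equation step that adds a constant to the measured power; the interpolation and the exact grouping of terms in the constant are assumptions of the example, not the IP1 definitions.

```python
import numpy as np

def gain_at(freq_hz, cal_freqs_hz, cal_gains_db):
    """Interpolate the stored frequency-swept receiver calibration at the
    DDS/NCO frequency currently reported in the data stream."""
    return float(np.interp(freq_hz, cal_freqs_hz, cal_gains_db))

def reflectivity_dbz(power_dbm, range_m, radar_const_db, gain_db):
    """Textbook radar-equation step: Z = P + C - G + 20*log10(r)."""
    return power_dbm + radar_const_db - gain_db + 20.0 * np.log10(range_m)
```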

c. Receiver and transmitter calibration

To maintain the accuracy of the radar constant and received signal power values, the IP1 radar nodes combine several techniques (a noise source, the feeding of a sample of the transmitted pulse through the receiver, and the capability of setting the receiver down-conversion frequencies) to obtain a joint broadband receiver and transmitter calibration. In particular, the following procedure is established: first, the analog plus digital receiver gain G and noise figure NF are established for the entire receiver bandwidth using the noise source embedded in the receiver. Once the receiver gain is characterized, the transmitted power is obtained by tracking the power measured through the receiver. Figure 11 illustrates the signal paths in the combined transmitter–receiver calibration.

FIG. 11. IP1 radar node combined transmitter–receiver calibration paths.

The noise source injects a signal of known excess noise ratio ENR (dB) into the receiver through the calibration path with loss L_4 (dB), creating a receiver input signal P_in^hot (dBm). The corresponding receiver output signal P_out^hot (dBm) is recorded for all DDS and NCO down-conversion frequencies in the receiver bandwidth. This procedure is repeated turning the noise source off and recording the ambient thermal noise at the receiver output, P_out^amb (dBm). The noise factor NF (dB), referenced at the receiver calibration port (the receiver switch input in Fig. 11), is obtained from the two noise measurements using the Y-factor method (Pozar 1999) as

\[ \mathrm{NF} = \mathrm{ENR} - L_4 - 10\log_{10}\!\left(10^{(P_{\mathrm{out}}^{\mathrm{hot}} - P_{\mathrm{out}}^{\mathrm{amb}})/10} - 1\right), \tag{5} \]

and the combined analog and digital gain of the receiver, referenced at the antenna port, is then obtained as

\[ G = P_{\mathrm{out}}^{\mathrm{amb}} - P_{\mathrm{in}}^{\mathrm{amb}} - \mathrm{NF} - L_2, \tag{6} \]

where L_2 (dB) is the loss between the receiver and antenna ports and P_in^amb (dBm) is the ambient thermal noise power at the receiver input, obtained as

\[ P_{\mathrm{in}}^{\mathrm{amb}} = 10\log_{10}(kTB) + 30. \tag{7} \]

Substituting all equations, the analog plus digital receiver gain referenced at the antenna port is

\[ G = P_{\mathrm{out}}^{\mathrm{amb}} - 10\log_{10}(kTB) - 30 - \mathrm{ENR} + L_4 + 10\log_{10}\!\left(10^{(P_{\mathrm{out}}^{\mathrm{hot}} - P_{\mathrm{out}}^{\mathrm{amb}})/10} - 1\right) - L_2. \tag{8} \]
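A direct transcription of Eqs. (5)-(7) into Python, with T = 290 K and the 1.5-MHz nominal bandwidth as assumed defaults:

```python
import math

K_BOLTZMANN = 1.380649e-23  # J/K

def y_factor_calibration(p_hot_out, p_amb_out, enr_db, l4_db, l2_db,
                         temp_k=290.0, bandwidth_hz=1.5e6):
    """Noise figure and receiver gain from the two noise measurements;
    all powers in dBm, gains and losses in dB."""
    y = 10.0 ** ((p_hot_out - p_amb_out) / 10.0)
    nf = enr_db - l4_db - 10.0 * math.log10(y - 1.0)       # Eq. (5)
    p_amb_in = (10.0 * math.log10(K_BOLTZMANN * temp_k * bandwidth_hz)
                + 30.0)                                    # Eq. (7)
    gain = p_amb_out - p_amb_in - nf - l2_db               # Eq. (6)
    return nf, gain
```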

It must be noted that the calibration procedure does not take into account the return loss at the receiver's input switch or any gain difference between the switch receive and calibration paths, as the switch is assumed to be well matched and balanced. Once the receiver is characterized, the transmitter calibration is performed.

An initial measurement of the transmitted pulse power at the antenna port, P_ant^cal, is obtained with a power meter, together with the corresponding power measurement by the AFC algorithm at the data acquisition output, P_daq^cal. Looking at Fig. 11, the two measurements are related as follows:

\[ P_{\mathrm{ant}}^{\mathrm{cal}} + L_1 - L_3 = P_{\mathrm{daq}}^{\mathrm{cal}} + G^{\mathrm{cal}} + C, \tag{9} \]

where L_1 (dB) is the loss between the transmitter and antenna ports, L_3 (dB) is the loss between the transmitter and transmitted pulse sample duplexer port, G^cal is the receiver gain [as obtained in (8)] for the particular frequency at which the calibration is made, and C (dB) is a constant that accounts for the signal processing gain difference between the transmitted pulse AFC processing and regular echo processing and the signal path loss between the duplexer and the receiver calibration port.

When the radar is in operation, the same relation applies to the transmitted power at the antenna ports:

\[ P_{\mathrm{ant}}^{\mathrm{tx}} + L_1 - L_3 = P_{\mathrm{daq}}^{\mathrm{tx}} + G^{\mathrm{tx}} + C, \tag{10} \]

where the superscript tx refers to the real-time transmitter frequency during operation. Subtracting (9) from (10), it can be shown that the current transmitted power at the antenna port, P_ant^tx, can be tracked from the AFC transmitted power measurement P_daq^tx, taking the initial calibration measurement as reference:

\[ P_{\mathrm{ant}}^{\mathrm{tx}} = P_{\mathrm{ant}}^{\mathrm{cal}} + (P_{\mathrm{daq}}^{\mathrm{tx}} - P_{\mathrm{daq}}^{\mathrm{cal}}) + (G^{\mathrm{tx}} - G^{\mathrm{cal}}). \tag{11} \]

Based on the previous expressions, the IP1 radar nodes obtain an updated transmitted power and radar constant value for every spectral moment data ray.
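Eq. (11) reduces to a one-line tracking rule; a sketch with illustrative numbers only:

```python
def tracked_tx_power(p_cal_ant, p_cal_daq, g_cal, p_tx_daq, g_tx):
    """Current transmitted power at the antenna port via Eq. (11): the AFC
    reading is referenced to the one-time power-meter calibration, with the
    frequency-dependent receiver gain difference taken into account."""
    return p_cal_ant + (p_tx_daq - p_cal_daq) + (g_tx - g_cal)

# Illustrative numbers only: an AFC reading 0.4 dB above its calibration
# value with a 0.1-dB higher receiver gain gives 43.0 + 0.4 + 0.1 = 43.5 dBm.
# tracked_tx_power(43.0, -20.0, 30.0, -19.6, 30.1)  # -> 43.5
```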

d. Motion control and scanning

An important feature of the IP1 radar nodes is their ability to perform arbitrary scans, which enables real-time network-adaptive coordinated scanning. In the IP1 network, the MCC algorithms running at the SOCC periodically (every 60 s, referred to as the system "heartbeat") create optimal sets of scan tasks that are fed back to the radar nodes, which continuously listen for and queue such commands. The scan tasks are interpreted at each radar node, where the antenna positioning control unit accepts scan commands specifying a start point and an end point (containing both azimuth and elevation coordinates), together with motion direction and velocity. The radar antenna is moved from its current position to the new scan starting point through the shortest path and at its fastest operational speed (set at 60° s⁻¹) and then moves to the scan end point at the required velocity (typically 21° s⁻¹) while collecting data. Using this capability as a building block, any arbitrary scan pattern can be constructed through the noninterrupted execution of a queued sequence of piecewise linear scans, allowing the radar nodes to interleave different operation modes.
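A conceptual sketch of this queued, piecewise linear scanning is given below; the ScanQueue class and the antenna interface (move_to, scan_to) are hypothetical stand-ins for the IP1 positioning control unit.

```python
from collections import deque

REPOSITION_DPS = 60.0  # fastest operational speed between scans
SCAN_DPS = 21.0        # typical measurement scan speed

class ScanQueue:
    """Queue of piecewise linear scan segments (sketch)."""

    def __init__(self):
        self.segments = deque()

    def add(self, start_az, start_el, end_az, end_el, speed_dps=SCAN_DPS):
        """Queue one scan; start/end points carry azimuth and elevation."""
        self.segments.append((start_az, start_el, end_az, end_el, speed_dps))

    def run(self, antenna):
        """Execute queued scans back to back without interruption."""
        while self.segments:
            s_az, s_el, e_az, e_el, speed = self.segments.popleft()
            antenna.move_to(s_az, s_el, REPOSITION_DPS)  # reposition, no data
            antenna.scan_to(e_az, e_el, speed)           # scan and collect
```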

This process is illustrated in Figs. 12, 13, and 14, which show a typical operational scanning sequence. Figure 12 illustrates the MCC scan task output in response to detected data features (colored polygons), where the multiple transparent arcs of different angular widths represent the different scans to be executed by each radar. Figure 13 presents in more detail the different scans performed at each node during that heartbeat. In particular, one can see how KSAO is tasked to perform full PPI scans at two different elevations followed by an RHI, whereas KCYR, KLWE, and KRSP are tasked to perform a sector scan at the lowest elevation, followed by a full PPI and more sector scans at the next elevations, finalizing also with an RHI scan. The different sector sizes and number of elevation scans per radar node are intended to optimally cover the weather scene following the MCC automated feature detection output and optimization process (Zink et al. 2005). Figure 14 shows the KRSP reflectivity and Doppler velocity fields corresponding to the scanning sequence in Fig. 13, where a total of six different scans are obtained in the 60-s heartbeat period.

FIG. 12. IP1 radar network scan task example.

FIG. 13. IP1 radar network-adaptive scan pattern (after the scan task shown in Fig. 12).

FIG. 14. KRSP radar node adaptive scan data collected while executing the scan pattern shown in Fig. 13. All different elevation sector scans and RHI scans are executed during a system heartbeat (60-s interval between network scan tasks). Data were collected during routine network operation.

5. Data examples from IP1 radar network

During the last three years, the IP1 radar network has collected an extensive dataset, from which data samples are selected to illustrate the network operation. Figures 14 and 15, obtained on 24 May 2009, show a linear cluster of rain cells crossing the network domain. Figure 15 shows a snapshot of the event as captured by the network, whereas Fig. 14 shows it as seen by the KRSP radar node. The azimuthal sector scans are centered around the areas of higher reflectivity, with a full PPI scan at 2° providing a broader view. The RHI scan shows the vertical wind shear and high reflectivity core associated with the targeted rain cell.

FIG. 15. IP1 network reflectivity composite data showing the passage of a cluster of rain cells. Data were collected while executing the scan pattern shown in Fig. 13, during routine network operation.

Figure 16, obtained on 14 June 2007, shows network composite reflectivity data obtained before the passage of a cluster of storm cells through the network domain. The leading gust front can be clearly seen extending from south of the KCYR radar node to the northeast of KSAO, while the storms to the north start entering the network domain.

FIG. 16. IP1 network reflectivity composite data showing the gust front ahead of a storm entering the network domain. Data were collected during routine network operation.

Figure 17, obtained on 6 October 2008, shows network composite reflectivity data collected during the passage of a storm cell. The cell developed a hook echo at its northwest leading edge, falling under the coverage of both KRSP and KSAO. A zoomed-in image of the hook echo region, containing dual-Doppler vector winds overlaid on reflectivity, reveals a weak rotation at the hook echo tip embedded in the general northwest flow.

FIG. 17. IP1 network reflectivity data and KRSP–KSAO dual-Doppler wind zoom on a developing hook echo in the storm leading edge. Data were collected during routine network operation.

Figure 18, obtained on 10 February 2009, shows network composite reflectivity data collected during the passage of a train of storm cells, which developed a number of hook echo features and eventually produced tornadoes once outside of the network domain. One such hook echo feature can be seen at the southwest edge of the cell just north of the KCYR radar node. The dual-Doppler vector winds in the zoomed hook echo region show an area of rotation at the hook echo tip.

FIG. 18. IP1 network reflectivity data and KCYR–KSAO dual-Doppler wind zoom on a developing hook echo in the storm-trailing edge. Data were collected during routine network operation.

All the data collected by the IP1 radar network are routinely archived and used in developing a number of both real-time and postprocessing research weather radar and meteorological data products, such as nowcasting and forecasting analysis, rainfall estimation, wind analysis, and weather feature detection.

6. Summary

This paper presented the IP1 radar system, designed and developed specifically to operate in a networked environment. To make a radar suitable for dense networked deployment, a number of challenges must be solved: the radars have to be small, easy to deploy and maintain, and controllable from remote locations. The need to keep the physical size of the radar node small leads to a higher frequency of operation (X band), which creates a new set of challenges: the radar's range-velocity ambiguity is worse compared with the lower frequencies usually employed in operational weather radar systems. The resulting effects are mitigated through hardware configurations that support advanced waveforms. In addition, the system is able to effectively filter ground clutter, as it is intended for operation at low beam altitudes.

The design solutions adopted to mitigate the issues arising from operation in a networked environment have been presented, as well as the system implementation and its operation. The radars operate in real time, under remote automated control, and generate attenuation-corrected data with a high unambiguous velocity interval and reduced second-trip and clutter contamination, making them fit to perform in a dense network. Several data cases collected under such operation are presented.

Acknowledgments. This work was supported primarily by the Engineering Research Centers Program of the National Science Foundation under NSF Award 0313747. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect those of the National Science Foundation.

The authors acknowledge Mr. Luko Krnan of Dynamic Sensing Technologies for his contributions toward the integration of the data acquisition solution in the radar nodes and for field work support.

REFERENCES

Bharadwaj, N., and V. Chandrasekar, 2005: Waveform design for CASA X-band radars. Proc. 32nd Conf. on Radar Meteorology, Albuquerque, NM, Amer. Meteor. Soc., P10R.13. [Available online at http://ams.confex.com/ams/pdfpapers/96347.pdf.]

——, and ——, 2006: Waveform design considerations for CASA radar network. Proc. Fourth European Conf. on Radar Meteorology and Hydrology (ERAD 2006), Barcelona, Spain, Servei Meteorologic de Catalunya et al., 117–121.

Chandrasekar, V., and A. P. Jayasumana, 2001: Radar design and management in a networked environment. Proc. ITCOMM, Denver, CO, International Society for Optical Engineering, 142–147.

Junyent, F., and V. Chandrasekar, 2009: Theory and characterization of weather radar networks. J. Atmos. Oceanic Technol., 26, 474–491.

Liu, Y., and V. N. Bringi, 2006: Improved rain attenuation correction algorithms for radar reflectivity and differential reflectivity with adaptation to drop shape model variation. Proc. 26th Int. Geoscience and Remote Sensing Symp. (IGARSS 2006), Denver, CO, Institute of Electrical and Electronics Engineers, 1910–1913.

McLaughlin, D. J., 2001: Presentation to National Research Council: Weather Radar Technology beyond NEXRAD. National Academies Press, 96 pp.

——, and Coauthors, 2005: Distributed collaborative adaptive sensing (DCAS) for improved detection, understanding, and predicting of atmospheric hazards. Proc. Ninth Symp. on Integrated Observing and Assimilation Systems for the Atmosphere, Oceans, and Land Surface (IOAS-AOLS), San Diego, CA, Amer. Meteor. Soc., 11.3. [Available online at http://ams.confex.com/ams/pdfpapers/87890.pdf.]

NRC, 2002: Weather Radar Technology beyond NEXRAD. National Academy Press, 81 pp.

Pozar, D. M., 1999: Microwave Engineering. Wiley, 720 pp.

Zink, M., and Coauthors, 2005: Meteorological command and control: An end-to-end architecture for a hazardous weather detection sensor network. Proc. Workshop on End-to-End, Sense-and-Respond Systems, Applications, and Services (EESR '05), Seattle, WA, USENIX Association and ACM SIGMOBILE, 37–42.