1
TASK in Redwood Trees
Wei Hong (IRB), Sam Madden (IRB, MIT), Rob Szewczyk (UCB)
Jan. 14, 2003
2
Acknowledgements: TASK Team
Alan Broad (Xbow), Phil Buonadonna (IRB), Anind Dey (IRB), David Culler (UCB, IRB), David Gay (IRB), Joe Hellerstein (IRB, UCB), Alan Mainwaring (IRB), Jaidev Prabhu (Xbow), Joe Polastre (UCB)
3
Outline
- TASK Overview
- Status Report
- Progress in Making TASK Real
- TASK/Deployment Roadmap
- The Redwood Tree Deployment
- Data From the Redwood Trees
- Calibration Results
- Conclusions
4
TASK Overview
TASK = Tiny Application Sensor Kit (formerly GSK, ASK)
Sensor Network in a Box: rapid sensor-network deployment for non-computer-scientists
Tools suite built on top of TinyDB:
- Sensor metadata management
- Query configuration
- Network monitoring
- Data visualization
- Integration with DBMS and data-analysis tools
5
TASK Architecture
[Architecture diagram, components: TinyDB Sensor Network; Sensornet Appliance running the TASK Server; DBMS (PostgreSQL) reached via JDBC/ODBC; TASK Field Tools; TASK Client Tools and External Tools connecting over JDBC/Internet.]
6
Status Report
Released with TinyOS 1.1!
- Install the task-tinydb package: apps/TASKApp, tools/java/net/tinyos/task
- http://berkeley.intel-research.net/task
Successful deployments in the lab and in the UCBG redwood trees
- Largest deployment: ~80 weather-station nodes
- Network longevity: 4-5 months
7
Progress in Making TASK Real
- Power Management
- Time Synchronization
- Improved Query Sharing
- Watchdog
- Improved Routing Layer
8
Power Management
Coarse-grained, application-controlled communication scheduling
[Timeline diagram: motes 1-5 vs. time; each epoch (10s-100s of seconds) contains a shared 2-4s waking period, and nodes sleep ("zzz") for the rest of the epoch.]
9
Power Management (cont.)
Benefits:
- Can still use CSMA within the waking period
- No reservation required: new nodes can join easily!
- Minimal code changes from the previous code base without power management
Drawbacks:
- Longer waking time vs. TDMA? Could stagger slots based on tree depth
- No "guaranteed" slot reservation (but nothing is guaranteed anyway)
Improvement:
- Adaptively set the waking period (currently hardwired to 4s) based on network size and sensor startup + acquisition time
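The scheduling scheme above reduces to a single predicate on the synchronized clock. A minimal sketch in Python (the real implementation is TinyOS/nesC on the motes; the constants and names here are illustrative):

```python
# Sketch of the coarse-grained wake/sleep schedule described above.
# EPOCH_MS and WAKING_PERIOD_MS mirror the slide's "10s-100s of seconds"
# epoch and the currently hardwired 4s waking period; values illustrative.

EPOCH_MS = 30_000         # epoch duration
WAKING_PERIOD_MS = 4_000  # shared waking period at the start of each epoch

def is_awake(system_time_ms: int) -> bool:
    """A node is awake during the first WAKING_PERIOD_MS of every epoch.

    All nodes agree that the waking period begins when
    system_time % epoch_duration == 0, so synchronized clocks
    imply a shared waking period.
    """
    return system_time_ms % EPOCH_MS < WAKING_PERIOD_MS
```

Because every node evaluates the same predicate against (approximately) the same clock, CSMA still works within the shared waking period and new nodes can join without any slot reservation.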
10
Time Synchronization
All messages include a 5-byte timestamp indicating system time in ms
Synchronize (i.e., set system time to the timestamp) with:
- Any message from the parent
- Any new query message (even if not from the parent); punt on multiple queries
Timestamps are written just after the preamble is transmitted
All nodes agree that the waking period begins when (system time % epoch duration == 0) and lasts for WAKING_PERIOD ms
Clock adjustment happens by changing the duration of the sleep cycle, not the wake cycle
If a node hasn't heard from its parent for k epochs, it switches to "always on" mode for 1 epoch
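The "adjust the sleep cycle, not the wake cycle" rule can be sketched as follows (illustrative names and constants, not the actual TinyDB code):

```python
# Sketch of sleep-cycle clock adjustment. A node that adopts a new
# timestamp does not shorten or lengthen its wake cycle; the correction
# is absorbed entirely by how long it sleeps until the next epoch start.

EPOCH_MS = 30_000         # illustrative epoch duration
WAKING_PERIOD_MS = 4_000  # illustrative waking period

def resync(local_time_ms: int, msg_timestamp_ms: int) -> int:
    """Adopt the sender's timestamp as the new system time.

    Timestamps are written just after the preamble is transmitted,
    so a received timestamp is a good estimate of 'now'.
    """
    return msg_timestamp_ms

def sleep_duration(system_time_ms: int) -> int:
    """How long to sleep until the next waking period begins.

    A shifted system time changes this duration, while the wake
    cycle stays fixed at WAKING_PERIOD_MS.
    """
    return EPOCH_MS - (system_time_ms % EPOCH_MS)
```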
11
Improved Query Sharing
Viral propagation of query messages (each query spans many messages):
- Compensates for the loss of query messages
- Lets new nodes join the network
Previous implementation: each node asks for all query messages whenever a query result with an unknown query id is snooped. Jams the network!
Improved implementation:
- A query request message contains a bitmap of the messages already received
- Stay silent if a neighbor has just requested the same query
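The bitmap-based request can be sketched like this (a sketch; in TASK the bitmap lives in a radio message, and all names here are illustrative):

```python
# Sketch of bitmap-based query-message requests. Instead of asking for
# *all* messages of an unknown query (which jams the network), a node
# advertises which message indices it already has.

def request_bitmap(received: set[int]) -> int:
    """Bitmap with bit i set iff query message i has been received."""
    bits = 0
    for i in received:
        bits |= 1 << i
    return bits

def missing_messages(total_msgs: int, bitmap: int) -> list[int]:
    """Message indices a neighbor still needs to resend."""
    return [i for i in range(total_msgs) if not bitmap & (1 << i)]
```

A neighbor answering the request resends only the missing indices, and a node that overhears a request identical to its own stays silent.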
12
Stopping a Query
Must stop a query on all nodes at the same time, or the query rekindles
Solution:
- Explicitly notify neighbors of "dead" queries
- Don't share "dead" queries
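The dead-query bookkeeping amounts to a small cache (an illustrative sketch, not the actual implementation):

```python
# Sketch of dead-query suppression: once a query id is marked dead it is
# never re-shared, so viral propagation cannot rekindle a stopped query.

class QueryCache:
    def __init__(self):
        self.live: set[int] = set()
        self.dead: set[int] = set()

    def on_start(self, qid: int):
        self.live.add(qid)

    def on_stop(self, qid: int):
        # Mark dead (and, on a real mote, notify neighbors).
        self.live.discard(qid)
        self.dead.add(qid)

    def should_share(self, qid: int) -> bool:
        """Only live, never-killed queries participate in viral sharing."""
        return qid in self.live and qid not in self.dead
```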
13
Watchdog
New watchdog component:
- Timer set to multiples of the epoch duration
- The watchdog is reset every time a data message is heard during an epoch
- The watchdog triggers when no data messages are heard for multiple epochs
Key: motes are always remotely resettable!
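The watchdog's reset/trigger logic can be sketched as a small counter (illustrative; the real component is a TinyOS module running on the mote):

```python
# Sketch of the epoch-based watchdog: hearing any data message resets
# the counter; silence for several consecutive epochs fires the watchdog.

class Watchdog:
    def __init__(self, max_silent_epochs: int = 3):
        self.max_silent_epochs = max_silent_epochs  # illustrative default
        self.silent_epochs = 0

    def on_data_message(self):
        """Reset the watchdog whenever a data message is heard."""
        self.silent_epochs = 0

    def on_epoch_end(self) -> bool:
        """Called once per epoch; returns True when the watchdog fires."""
        self.silent_epochs += 1
        return self.silent_epochs >= self.max_silent_epochs
```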
14
TASK/Deployment Roadmap
Distribution of Do-It-Yourself kits, Q1'04:
- Stargate-based gateway appliance
- VB/HTML-based improved GUI tools
- PDA field tool
- New-generation packaging of motes
Support for high-data-rate applications, Q2'04:
- Take over the Intel fab vibration-monitoring application
- GGB app?
Evolve into the core software infrastructure for all Intel Research pilot projects:
- HP data center
- SAP asset tracking
- Etc.
Port to iMote
15
The Redwood Tree Deployment
Collaboration with Prof. Todd Dawson
Collect dense sensor readings to monitor climatic variations across altitudes, angles, time, forest locations, etc.
- Versus sporadic monitoring points with 30 lb loggers!
Current focus: study how dense sensor data affect predictions of conventional tree-growth models
16
Data from the Redwood Trees
[Figure: two panels over 08/04-08/10. Top: relative humidity (%) at different heights (legend: 10, 20, 30, 34, 40). Bottom: humidity difference (%) vs. date.]
17
Redwood Trees Data
[Figure: two panels over 08/04-08/10. Top: temperature (°C) at different heights (legend: 10, 20, 30, 34, 40). Bottom: temperature difference (°C) vs. date.]
18
Towards Gradient Analysis
[Figure: average temperature difference (°C) between top and bottom of the tree vs. time of day.]
19
Temperature
[Figure: average temperature difference (°C) between 30 feet and bottom of the tree vs. time of day.]
20
Relative Humidity
[Figure: average RH difference (%) between top and bottom of the tree vs. time of day.]
21
Relative Humidity
[Figure: average RH difference (%) between 30 feet and bottom of the tree vs. time of day.]
22
Data from the Redwood Trees
[Figure: repeat of slide 16: relative humidity at different heights and humidity difference over 08/04-08/10.]
23
The Calibration Process
- Growth-chamber calibration of temperature, humidity, and light against a trusted sensor
- VLSB rooftop calibration of the PAR sensor against a trusted sensor
24
Redwood Trees Data
[Figure: repeat of slide 17: temperature at different heights and temperature difference over 08/04-08/10.]
25
Towards Gradient Analysis
[Figure: repeat of slide 18: average temperature difference (°C) between top and bottom of the tree vs. time of day.]
26
Temperature
[Figure: repeat of slide 19: average temperature difference (°C) between 30 feet and bottom of the tree vs. time of day.]
27
Relative Humidity
[Figure: repeat of slide 20: average RH difference (%) between top and bottom of the tree vs. time of day.]
28
Relative Humidity
[Figure: repeat of slide 21: average RH difference (%) between 30 feet and bottom of the tree vs. time of day.]
29
Chamber Data: Reference
[Figure: ground-truth growth-chamber data, Dec. 1, 2003: PAR (µmol m-2 s-1), temperature (°C), and RH (%) vs. time.]
30
Chamber Data: Measured
[Figure: Mote 69 growth-chamber data, Dec. 1, 2003: PAR (µmol m-2 s-1), temperature (°C), and RH (%) vs. time.]
31
Temperature Measurements
[Figure: measured vs. reference temperature (°C), chamber, Dec. 2003; Mote 69, Mote 65, and ideal fit.]
32
Temperature Error Distribution
[Figure: CDF of temperature errors, December 2003 chamber run. Bias of 1°C.]
33
Relative Humidity Measurements
[Figure: measured vs. reference relative humidity (%); Mote 69 and Mote 65.]
34
Relative Humidity Measurements
[Figure: CDF of relative humidity errors, December 2003 chamber run. Errors computed assuming a constant sensor offset of 13.25%.]
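The error analyses above assume a constant sensor offset (a 1°C temperature bias, a 13.25% RH offset). A minimal sketch of offset removal and the empirical error CDF (only the 13.25% offset comes from the slides; the data values are illustrative):

```python
# Sketch of constant-offset calibration and an empirical error CDF.
# The 13.25% RH offset is from the slide; sample readings are made up.

RH_OFFSET = 13.25  # constant sensor offset (%), from the chamber run

def calibrated_errors(measured, reference, offset):
    """Residual errors after removing a constant sensor offset."""
    return [(m - offset) - r for m, r in zip(measured, reference)]

def empirical_cdf(errors, x):
    """Fraction of samples with error <= x (one point of the CDF)."""
    return sum(1 for e in errors if e <= x) / len(errors)
```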
35
PAR Measurements: Regression
[Figure: calibrated PAR and best-fit measurements (µmol m-2 s-1) vs. time; fit = f(par, v, 1/v, par/v); calibrated data, fit data, and their difference.]
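The fit above, fit = f(par, v, 1/v, par/v), is a least-squares fit over those four features; v is presumably the mote's battery voltage (an assumption: the slide only names the features). A stdlib-only sketch via the normal equations:

```python
# Sketch of the PAR regression: ordinary least squares on the feature
# vector (par, v, 1/v, par/v). Names and method are illustrative; the
# slide does not say how the fit was computed.

def features(par: float, v: float) -> list[float]:
    return [par, v, 1.0 / v, par / v]

def fit(samples: list[tuple[float, float]], targets: list[float]) -> list[float]:
    """Solve the normal equations (X^T X) w = X^T y by Gaussian elimination."""
    X = [features(p, v) for p, v in samples]
    n = len(X[0])
    A = [[sum(row[i] * row[j] for row in X) for j in range(n)] for i in range(n)]
    b = [sum(row[i] * y for row, y in zip(X, targets)) for i in range(n)]
    for col in range(n):  # forward elimination with partial pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    w = [0.0] * n
    for i in range(n - 1, -1, -1):  # back substitution
        w[i] = (b[i] - sum(A[i][j] * w[j] for j in range(i + 1, n))) / A[i][i]
    return w
```

Feeding the fitted weights back through `features` reproduces the "fit data" curve; subtracting it from the calibrated PAR gives the "difference" trace in the figure.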
36
PAR Measurements
[Figure: measured vs. calibrated PAR (µmol m-2 s-1); fit = f(par, v, 1/v, par/v); calibrated vs. our data, with ideal fit.]
37
PAR Measurements
[Figure: CDF of PAR errors by error magnitude (µmol m-2 s-1); fit = f(par, v, 1/v, par/v).]
38
Still a Ways to Go…
[Figure: converted PAR data (µmol m-2 s-1) vs. date, 08/04-08/09 (legend: 10, 20, 30, 34).]
39
Conclusions
- TASK is readily available in TinyOS 1.1
- Proven to work well in low-data-rate, environmental-monitoring applications
- We'd love to get more "customers"
- Need more developers
- Have lots of data, and we'd love to share it