
Technical report, IDE0836, January 2008

Remote Surveillance and Measurement

Master’s Thesis in Electrical Engineering

Muhammad Rashid and Mutarraf Mumtaz

School of Information Science, Computer and Electrical Engineering Halmstad University


Preface

This work would not have been possible without the help and interest of several people who contributed immensely to the success of the thesis, both morally and financially. This Master's thesis is the final step of the Master of Science degree at the School of Information Science, Computer and Electrical Engineering (IDE-section) at Halmstad University, Sweden. We would like to express our gratitude to our supervisor Prof. Tony Larsson for his huge support and helpful suggestions throughout the project. We would like to thank Andreas Persson for his valuable help on the technical aspects of this thesis. Finally, we would like to thank our friends and families for their moral support during our thesis work.

Muhammad Rashid & Mutarraf Mumtaz
Halmstad University 2008


Abstract

A Wireless Sensor Network (WSN), a collection of "sensor nodes", promises to change the way scientists gather environmental data in various fields. Sensor nodes can be used for non-stop sensing, event detection, location sensing and local control of actuators; this opens up many new application areas such as agriculture, military, home or factory automation and logistics. Remote surveillance and measurement missions can be performed using WSNs. A hot research topic nowadays is to make such networks remotely controllable and adaptive to the environment and the mission. The work carried out in this thesis is the development of a surveillance application using TinyOS/nesC. The purpose of this application is to perform an event-detection mission using any one of the built-in sensors on Mica2 motes; in addition, a setup protocol is designed to make the WSN remotely controllable and adaptive to the mission. Experimental work is also performed using TinyDB to build a surveillance system whose purpose is to detect and count the number of persons present at any time in a given room and to view the results from a remote place. Besides these two applications, a comparative study between TinyDB and nesC is presented, which concludes that more hardware control can be achieved through nesC and that nesC is a more power-efficient platform for long-term applications.

List of Abbreviations

INEEL    Idaho National Engineering & Environmental Laboratory
TAN      Test Area North
SCADA    Supervisory Control and Data Acquisition
D&D      Deactivation and Decommissioning
WSN      Wireless Sensor Network
TinyOS   Tiny Operating System
TinyDB   Tiny Database
nesC     Network Embedded Systems C
TOSSIM   TinyOS mote Simulator
API      Application Programming Interface
MIB      Mote Interface Board
XBOW     Crossbow
PIR      Passive Infrared
RF       Radio Frequency
SQL      Structured Query Language


Table of Contents

1 Introduction .............................................................................................................. 6
1.1 Thesis goal ............................................................................................................. 7
1.2 Method ................................................................................................................... 7
2 Background ............................................................................................................... 8
2.1 Agriculture Applications ........................................................................................ 8
2.2 Environmental Monitoring ..................................................................................... 9
    Weather monitoring ................................................................................................ 9
    Fire detection .......................................................................................................... 9
2.3 Military applications ............................................................................................ 10
2.3.1 Enemy surveillance ........................................................................................... 10
2.3.2 Monitoring army, equipment and ammunition ................................................. 10
2.4 Indoor applications ............................................................................................... 10
2.4.1 Home automation .............................................................................................. 10
2.4.2 Open secure office ............................................................................................ 10
2.4.3 Environmental control in office buildings ........................................................ 11
2.5 Remote Surveillance Applications ....................................................................... 11
2.5.1 Remote Surveillance of Facilities Awaiting D&D ............................................ 11
2.5.2 A Multi-sensor System for Remote Surveillance of a Motorway Overpass ..... 11
3 Hardware Overview ................................................................................................ 13
3.1 Sensor Board ........................................................................................................ 14
3.2 Programming Board ............................................................................................. 14
3.3 Sensor Node ......................................................................................................... 15
3.4 Base Station .......................................................................................................... 16
3.5 RS-232 Interface .................................................................................................. 16
4 Implementation Strategies ....................................................................................... 17
4.1 Sensor Node Operating System: TinyOS ............................................................. 17
4.1.1 TinyOS versions ................................................................................................ 17
    Difference between TinyOS 1.x and 2.x .............................................................. 18
    Types of TinyOS Files .......................................................................................... 18
4.1.2 TOSSIM ............................................................................................................ 18
4.1.3 Cygwin .............................................................................................................. 18
4.2 nesC ...................................................................................................................... 19
4.3 TinyDB ................................................................................................................. 20
5 Description of Implementation ............................................................................... 22
5.1 Designed application using nesC ......................................................................... 22
5.1.1 System Architecture .......................................................................................... 22
5.1.2 Environmental monitoring ................................................................................ 23
5.1.3 Event detection .................................................................................................. 24
5.1.4 Remotely Controllable ...................................................................................... 25
5.1.5 Adaptiveness to Mission ................................................................................... 27
5.1.6 Summarization of Designed Set-up Protocol ................................................... 29
5.1.7 Experimental Results ........................................................................................ 30
5.2 Experiments using TinyDB .................................................................................. 33
5.2.1 Uncertainty Problems ........................................................................................ 37
5.3 Comparison between nesC and TinyDB .............................................................. 38


6 Conclusions and Future Work ................................................................................. 40
6.1 Conclusions .......................................................................................................... 40
6.2 Future Work .......................................................................................................... 41

7 References...................................................................................................................... 42

Appendix A: A Quick Introduction to Hardware (nesC).................................................. 44

Appendix B: A Quick Introduction to TinyDB ................................................................ 59

List of Figures

Figure 1: Hardware Connections and Communications ............................................ 13
Figure 2: Sensor Board [2] ......................................................................................... 14
Figure 3: Programming board .................................................................................... 15
Figure 4: Sensor node [2] ........................................................................................... 16
Figure 5: The Cygwin Interface of nesC application ................................................. 19
Figure 6: System architecture of designed application .............................................. 23
Figure 7: Packet description including light reading ................................................. 28
Figure 8: Packet description including Temp reading ............................................... 29
Figure 9: Experimental Setup for Designed Protocol ................................................ 30
Figure 10: Light level vs. Epoch ................................................................................ 31
Figure 11: Temperature vs. Epoch ............................................................................. 31
Figure 12: System architecture using TinyDB ........................................................... 33
Figure 13: Experimental Scenario using TinyDB
Figure 14: Single Person detection in room ............................................................... 35
Figure 15: Persons Detection in room ........................................................................ 36
Figure 16: Programming Models for Sensor Networks [17] ..................................... 38

List of Tables

Table 1: Packet details [1] .......................................................................................... 28


1 Introduction

In recent years, advances in sensor technology, coupled with low-power, low-cost digital signal processors and radio technologies, have made possible the development of low-cost sensor networks. Micro-Electro-Mechanical Systems (MEMS) and other advances in micro-electronics enable the integration of mechanical elements, sensors, actuators and electronics on a common silicon substrate through micro-fabrication technology, and have enabled tiny sensor nodes to communicate over short distances.

Wireless Sensor Networks (WSNs) consist of many sensor nodes that are generally deployed in a sensor field and have the capability to collect and route data back to a base station. Each node consists of an embedded microcontroller with memory, a battery and a wireless transceiver, and is equipped with various sensing hardware (light, temperature, etc.). Sensor nodes are intended to be distributed to measure and monitor a physical environment, for example by tracking objects moving through an area or by measuring environmental conditions such as light intensity and temperature over a large area. The tracking of moving objects has been recognized [6] as a well-suited application that would benefit from WSN technology. Sensor nodes can use their processing capabilities to carry out simple computations and transmit data when an event is triggered.

The use of WSNs is increasing day by day, and their features allow sensor networks to be used in numerous fields, such as environment, military and security. A WSN can adapt itself in response to the environment, e.g. it can adapt to support the introduction of new nodes and compensate for node failures. To what extent the adaptation can be made autonomously in the network depends on the application, the availability of energy resources and the network strategies. Remote surveillance and measurement missions can be performed by suitably configured WSNs, and such systems have been developed for several fields. A remote surveillance system consists of a data collection and communication system, with sensor nodes carrying data analysis software and application-specific sensors. Any Personal Computer (PC) can be used for storing and manipulating the data. The software plays an important role in the remote surveillance operation, as it is the interface between the physical components and the operator.

For the development of the surveillance and measurement application in this thesis, the focus is to make the network remotely controllable and adaptive to the mission by using the TinyOS/nesC platform. In our application, the required parameters can be passed from a PC to the nodes to control and manage the network according to the required conditions. In the same way, when a developed application is capable of performing more than one mission, there must be a mechanism through which any mission can be selected and performed remotely according to the requirements. As the application developed in this thesis can perform two missions, one using a photo sensor and the other using a temperature sensor, we can in principle perform any mission remotely.


1.1 Thesis goal

The goal of the thesis is to design and then evaluate a setup protocol for remote and partly autonomous environment measurement as well as surveillance missions, in order to achieve the following functionalities:

• Remotely controllable and manageable:

  o To detect events in the monitored environment.

  o To configure any mission functionality wirelessly (remotely), adapting the network to execute that mission; in short, the network has to be mission adaptive.

The main constraint dealt with while designing the setup protocol is the energy consumed by the network, because the sensor nodes are battery driven and it is very unlikely that the batteries can be recharged or replaced after depletion.

• To compare the measurement setup protocol approach with TinyDB in terms of pros and cons.

1.2 Method

The methods adopted to achieve the thesis goals are as follows:

• The basic WSN limitations and capabilities are examined.

• TinyOS/nesC and TinyDB are studied for the design of remotely controlled, event-driven applications, which are appropriate for controlling and supervising WSNs.

• After gaining this understanding, a prototype is built and evaluated using the TinyOS/nesC platform.

• The comparison between nesC and TinyDB is described precisely.

• The results are finally presented graphically using MATLAB.


2 Background

Recent developments in MEMS and wireless communication technology make WSNs more interesting and have opened up a new horizon in the domain of networking. Sensor nodes contain components for the sensing, processing and communication of data. A sensor network is formed by a large number of sensor nodes, and it is their collaborative effort that gives the network its strength. A sensor network is designed to detect [9] events or phenomena by collecting and processing data and then transmitting the required information to interested users. Basic features of sensor networks are:

• Self-organizing abilities to communicate with their [11] neighbor nodes
• Short-range communication and multi-hop routing
• Mechanisms for handling topology changes due to node failures and fading
• Limited energy, transmit power, memory and computing power
• Dense deployment and cooperative effort of sensor nodes

Sensor nodes can be used for non-stop sensing, event detection, location [10] sensing and local control of actuators. This concept enables many new application areas. The applications of WSNs can be classified, for example, into the military, environmental, health and commercial areas.

2.1 Agriculture Applications

The agriculture sector is the major pillar of Pakistan's economy, and the majority of Pakistan's population depends on it, either directly or indirectly. Agriculture contributes nearly 22% to the Gross Domestic Product (GDP) and employs about 42% of the population. Wheat, potato, rice, cotton and tobacco are the chief crops, and dairy animals are also raised in large numbers. It is therefore extremely important to be able to control diseases that pose a threat to agricultural production.

Phytophthora is a fungal disease of the potato which can enter a field through a variety of channels. Its development, and the associated attack on the crop, depend strongly on the climatological circumstances within the field; humidity and temperature are vital factors in the development of the disease. We want to monitor when the crop is at risk of developing the disease and let the farmer treat the field, or parts of it, with fungicide only when this is required. In the same way, monitoring grain temperature during storage provides a good management tool for quality control, as the temperature varies from one place to another within the storage. Temperature is useful for determining whether aeration is needed to control excess moisture, which causes microbial growth, sprouting and germination. It is also useful for determining the best aeration times to control insect population growth or to achieve insect mortality.

In the dairy sector, the cow is a vital animal. More than 50% of the milk and 80% of the meat consumed in Pakistan comes from cows, and they play a hugely significant role in the economy of smallholders in rural areas. According to the latest assessments in Pakistan, 25% of all new calves are affected by temperature between the ages of one and six months.


The reason is that there is no control of the temperature in the shed, which affects the new calves as they are sensitive to temperature changes.

Agricultural practice is based on detailed information about the status of crops and soil. Most of this information comes from techniques like soil mapping, yield mapping and remote sensing, which cover the spatial domain with more or less spatial resolution. The information is sampled only occasionally and is therefore valid at the time the observations are made. Some processes, like fertilization and especially crop protection, require frequent information updates, and sensor systems can deliver such information. Wireless sensor networks, if implemented adequately, have the potential to target diseases with limited resources such as pesticide and fertilizer. Such networks also have the potential of bridging the spatial data gap, thus empowering policy makers with more effective tools for risk assessment and decision making. For example, to control and monitor a potato crop, a number of sensor nodes appropriate to the crop area can be deployed to collect temperature and relative humidity readings. Based on the information gathered by these nodes on a PC, a decision support system can then suggest the appropriate action corresponding to the crop state.

2.2 Environmental Monitoring

The deployment of WSN sensor nodes [10] in agriculture and the food industry is still in its opening phase. Applications related to agriculture can be classified as follows:

Weather monitoring

A WSN application in a vineyard in British Columbia, Canada, was reported by the Discovery Channel (2003). Sixty-five sensor nodes were deployed over one acre of land to remotely report moisture, temperature and sunlight intensity to a PC every 5 minutes. The landlord could easily observe each area of the vineyard in real time to avoid frosting, manage irrigation, determine fertilizer applications and arrange the harvest schedule. A remote application server relayed data from the sensor network to local users via a WLAN and to remote users via the internet and cellular networks.

Fire detection

A large number of sensor nodes can be densely [10] deployed over a wide forest or agricultural area. The sensor nodes report data to a central station when they detect a fire. These applications need real-time communication before the fire spreads and becomes unmanageable. Each mote has adequate power, radio communication and processing capabilities to support location and thermal sensors as well as data handling.


2.3 Military applications

The earliest WSNs were intended for military applications. Since sensor [9] nodes are low-cost, the destruction of several nodes by hostile actions on the battlefield may not influence a military operation. Features such as robustness, fault tolerance and self-organization make sensor networks suitable for military use. Distributed sensing has the advantage of offering redundant and thus extremely dependable information [10], as well as the ability to locate threats by both coherent and incoherent processing among the distributed sensor nodes. Military applications of sensor networks include enemy surveillance and the monitoring of troops, equipment and ammunition.

2.3.1 Enemy surveillance

A WSN can be used to deploy a heterogeneous group of sensors capable of monitoring and reporting on different dynamic properties of [9] [10] critical terrain in a timely fashion. Data reports from areas of the sensor network will be aperiodic and varied, carrying a range of application-specific data.

2.3.2 Monitoring army, equipment and ammunition

Every troop, vehicle, piece of equipment and item of critical ammunition can be fitted with sensors. On the battlefield, commanders can monitor the status of [9] their troops, equipment and ammunition from the data reported frequently by the sensors.

2.4 Indoor applications

Indoor applications include, for example, home automation, open secure offices and environmental control in office buildings.

2.4.1 Home automation

The sensor nodes can be connected to electrical [9] appliances such as a refrigerator, an oven, a computer or an air-conditioner. These sensor nodes communicate with one another or with users outside the network. The users can manage these appliances through a satellite or internet connection.

2.4.2 Open secure office

Sensor nodes are attached to expensive equipment [9] such as printers, laptops, PDAs and so on. Each room has a local base station, and a major base station is installed at the entrance door. When somebody attempts to move a piece of equipment, the sensor on that equipment, which could be a vibration sensor, forwards an alarm to its local base station, which in turn commands a camera to take a picture of that room.


The major base station also detects the alarm and sends a command to lock the door automatically if the equipment is moved close to the entrance door.

2.4.3 Environmental control in office buildings

The heating and air conditioning of many buildings are centrally controlled. Thus, the temperature within a room can differ by a few degrees from one side to the other: there is only one control point in the room, and the air flow from the central system is not evenly distributed, which explains why one side might be warmer than the other. To control the air flow and temperature in different parts of the room, a distributed wireless sensor [9] network system can be installed.

2.5 Remote Surveillance Applications

2.5.1 Remote Surveillance of Facilities Awaiting D&D

A remote surveillance system was designed, tested and deployed at the Idaho National Engineering & Environmental Laboratory (INEEL) to observe the Test Area North (TAN-616) facility for water in sumps, tanks and on the floor. The presence of water is a sign that the facility is not [7] under control and that the risk of contamination is increased. This system replaces inspections that required sending inspectors and radiation control personnel into the facility to check for water; it thereby reduces radiation exposure and increases the safety of these personnel. The Supervisory Control and Data Acquisition (SCADA) approach was used in this remote surveillance system. SCADA consists of obtaining data from sensors, transmitting the data to a central site, displaying and communicating the data to the operator, and controlling the sensors, actuators and other instruments.

2.5.2 A Multi-sensor System for Remote Surveillance of a Motorway Overpass

The surveillance system must detect moving objects, localize and recognize them, and interpret their behavior in order to prevent dangerous situations such as, for example, a car stopped on the overpass road for a time exceeding a [8] fixed time-out, or one or more pedestrians moving in the overpass area of a motorway for a long time. A surveillance system for detecting dangerous situations on motorway overpasses has been presented. The objective of the system is to detect dangerous situations and advise a remote operator with an alarm signal. The system, which processes optical and infrared images, is not only able to detect, track and recognize mobile objects but also to understand the dynamic scene and detect certain dangerous events.

In the last five years, the problem of the visual inspection of outdoor environments such as airports, railway stations and roads has received growing attention. Conventionally, the vital tasks of surveillance and safety monitoring are based on human visual observation. However, an autonomous system that is able to detect irregular or dangerous situations can help a human operator even if it cannot completely replace his or her presence. In particular, such a system involves the real-time analysis and description of complex scenes from image sequences.


However, it is limited to cases where the scene structure, the objects and their expected behavior are known. The presented system, which combines spatio-temporal region and contour-based likelihood methods, shows good detection and tracking capabilities. However, the high computational complexity and the need for a-priori information about the number of objects present in the scene limit its application to real cases. The system uses an artificial neural network to perform both object classification and scene understanding.


3 Hardware Overview

This chapter describes the hardware components used in this thesis. We used the Mica2 mote, which was designed at the University of California, Berkeley [2] [12] and is manufactured and marketed by Crossbow Technology (XBOW). XBOW has developed several motes, such as Mica, Mica2, Micaz and Mica2DOT. The Mica2 mote is one of the most popular commercially available motes, and much of the research in the field is carried out on this type of hardware. Each mote is composed of a microcontroller, a radio transceiver, a sensor board and a power supply.

In a wireless sensor node the choice of microcontroller is important, and power consumption is the key factor in its selection. The selected microcontroller unit must be able to sleep whenever possible.

Figure 1: Hardware Connections and Communications


The Mica2 is based on an ATmega128L, a low-power 8-bit microcontroller. It consumes 8 mA in running mode and only 15 µA in sleep mode, and it contains 128 KB of on-board flash memory to store the mote's program. The Mica2 mote uses the ChipCon CC1000 single-chip radio transceiver, designed for low power, which can transmit at 315, 433 or 915 MHz. The Mica2 motes can be powered by two AA batteries with a capacity of 200 mA-hr. Figure 1 illustrates a simple WSN that consists of two sensor nodes (node 1 and node 2) and a base station connected through RS-232 to a PC that displays the data. The sensor nodes collect data (temperature, light, etc.) from their immediate surroundings and send it directly to the base station, which in turn forwards it to the PC.
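To see why the ability to sleep matters, consider an illustrative duty-cycle calculation based on the figures above (the 1% duty cycle, and hence the resulting lifetime, is an assumption for illustration, not a measurement from this thesis):

    I_avg    = 0.01 × 8 mA + 0.99 × 0.015 mA ≈ 0.095 mA
    lifetime = capacity / I_avg = 200 mA-hr / 0.095 mA ≈ 2100 hours ≈ 88 days

whereas a processor that never sleeps would drain the same batteries in 200 mA-hr / 8 mA = 25 hours.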

3.1 Sensor Board

A sensor is an electronic device that measures or detects real-world conditions such as motion, heat or light, converts the condition into an analogue or digital representation, and acts as a data source that reports the state of the environment. Figure 2 illustrates the basic components of the sensor board used in our thesis.

3.2 Programming Board

The Mica2 motes are usually programmed using the MIB510 programming and serial interface board provided by XBOW technology. MIB stands for Mote Interface Board.

Figure 2: Sensor Board [2]


The MIB510 is connected to a COM port on the PC at a data rate of 57600 bps and provides the communication link between the PC and the sensor network. Through it, a user can download a nesC application onto the MICA2 processor board. The basic components of the MIB510 programming board used in our work are shown in figure 3.
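With the MIB510 attached to the PC's serial port, a mote is typically programmed from the Cygwin shell roughly as follows (a sketch of the usual TinyOS 1.x make invocation; the node id and the port name are example values, and the exact programmer option can differ between installations):

    make mica2 install.1 mib510,/dev/ttyS0    # compile for the Mica2 and install as node id 1 via the MIB510 on COM1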

3.3 Sensor Node

A WSN consists of nodes that are capable of performing some processing, gathering sensory information and communicating with other connected nodes in the network. Each node in a sensor network is typically equipped with one or more sensors, a radio transceiver or other wireless communication device, a small microcontroller, and an energy source, usually a battery. Sensing units are generally composed of two subunits: sensors and ADCs. The sensors produce analog signals that correspond more or less linearly to the observed phenomenon; these are converted to digital signals by the ADCs and then fed into the processing unit. The processing unit, which is usually connected to a small storage unit, manages the procedures that make the sensor node work together with the other nodes to carry out the assigned tasks. A transceiver unit connects the node to the network. The power unit is the vital component of a sensor node. The fundamental parts of a sensor node are shown in figure 4.

Figure 3: Programming board

Node interface:

• Connector: 51-pin MICA2 interface


• Indicators: MICA2 LEDs (green, yellow, red)

Figure 4: Sensor node [2]

3.4 Base Station

The sensed data must be gathered and transmitted to the base station, which acts as a sink node. The location of the base station can influence the network performance. Forwarding data to the base station is possible using single-hop or multi-hop communication.

3.5 RS-232 Interface

• Connector: 9-pin "D"

• Baud rate (MICA2): user defined (57600 bps)


4 Implementation Strategies

This chapter describes the software components that can be used for a WSN. We used some of these components to design our application.

4.1 Sensor Node Operating System: TinyOS

TinyOS is an open source [13] development environment that combines sensing, communication and computation into a single architecture. TinyOS [5] is an event-based operating system designed to support small, resource-constrained embedded network nodes. It was initially developed by the University of California, Berkeley to explore software support for the emerging area of networked sensors. TinyOS is designed specifically for WSNs and is motivated by four main goals:

• Power management

• Limited resources

• Flexibility

• Concurrency

TinyOS provides components and interfaces for common abstractions such as routing, sensing and storage, and offers a number of system components that can be reused in numerous applications. The TinyOS [14] system, libraries and applications are written in a network-embedded systems language known as nesC, a new language for programming structured component-based applications. TinyOS maintains a two-level concurrency structure based on events and tasks. The task scheduler uses a simple non-preemptive FIFO policy; interrupts may preempt tasks (and each other), but not during atomic sections, which are implemented by disabling interrupts.
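As a minimal illustration of this two-level model (a generic sketch with an invented module name, not code from the thesis application), an asynchronous hardware event can store its result and post a task, deferring the longer processing to task context, with the shared variable protected by atomic sections:

module SampleM {
  provides interface StdControl;
  uses interface ADC;
}
implementation {
  uint16_t lastReading;            // shared between interrupt and task context

  task void processData() {
    uint16_t copy;
    atomic copy = lastReading;     // read the shared variable atomically
    // ... longer, preemptable processing of 'copy' would go here ...
  }

  async event result_t ADC.dataReady(uint16_t data) {
    atomic lastReading = data;     // interrupt context: keep the handler short
    post processData();            // defer the real work to the FIFO task scheduler
    return SUCCESS;
  }

  command result_t StdControl.init()  { return SUCCESS; }
  command result_t StdControl.start() { return SUCCESS; }
  command result_t StdControl.stop()  { return SUCCESS; }
}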

4.1.1 TinyOS versions

TinyOS follows a set of high-level programming conventions, similar in spirit to those used in Java, in which the TinyOS interfaces and components are checked for consistency to support OS reliability. TinyOS has two stable versions:

• TinyOS 1.x

• TinyOS 2.x


Difference between TinyOS 1.x and 2.x

The main difference discussed here is that TinyOS 2.x includes layer-2 source addresses in its packets, whereas 1.x does not. If the 1.x stack is modified to include source addresses (by setting the corresponding bits in the FCF field), the two should be compatible.
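For reference, the TinyOS 1.x active message structure looks roughly as follows (a simplified sketch; radio-specific metadata fields are omitted and field widths can vary between releases). It carries only a destination address, which is the point of the difference described above; the same header layout (destination address, handler id, group id and length) reappears in the packet dump of Table 1 in chapter 5:

typedef struct TOS_Msg {
  uint16_t addr;                    // destination address; 1.x has no source address field
  uint8_t  type;                    // active message handler id
  uint8_t  group;                   // group id
  uint8_t  length;                  // payload length in bytes
  int8_t   data[TOSH_DATA_LENGTH];  // application payload
  uint16_t crc;                     // checksum
} TOS_Msg;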

Types of TinyOS Files

• Interfaces (xyz.nc)

• Module (xyzM.nc)

• Configuration (xyzC.nc)

4.1.2 TOSSIM

It is convenient to be able to simulate applications on a computer to confirm that [15] the implemented applications work in the proper and expected way. It is easier to get an overview of and analyze the network in a simulator, and a good simulator can even be better than performing tests in reality with ordinary hardware nodes. It would also be almost impossible to test hundreds of nodes in reality because of the space required and the high cost. TOSSIM is such a simulator, intended to complement and be compatible with TinyOS version 1.x and the AVR ATmega128 microcontroller. Features have been pre-programmed into the simulator to act as a virtual hardware abstraction for the Micaz mote. TOSSIM has been designed to capture the following properties (example build commands are sketched after this list):

• Bridging: Micaz mote platforms are presently the only hardware abstraction that the current version of TOSSIM supports; application code is compiled for TOSSIM instead of the specific hardware, which makes deployment in the simulator easy.

• Scalability: TOSSIM has the capability to simulate a large number of nodes at the same time; otherwise it would be unfeasible to simulate a whole network.

• Completeness: the simulation covers as many system behaviors as is feasible.

• Fidelity: the simulator is able to capture the behavior of the nodes in detail.
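Concretely, under TinyOS 1.x an application is typically compiled for TOSSIM and started roughly as follows (a sketch; the available DBG output channels and command-line options depend on the installation):

    make pc                              # build the application for the TOSSIM ('pc') target
    DBG=led,adc ./build/pc/main.exe 4    # simulate 4 virtual motes, printing LED and ADC debug output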

4.1.3 Cygwin

Cygwin is a collection of free software tools, originally developed by Cygnus Solutions, that provides a Linux-like environment for Windows. It aims mostly at porting software that runs on POSIX systems, such as Linux, BSD and UNIX systems, to Windows with little more than a recompilation.


Programs ported with Cygwin work well on Windows NT, Windows 2000, Windows Server 2003 and Windows XP, and some might even run acceptably on Windows 95 and Windows 98. The installation is downloaded to a local folder, and the TinyOS 1.x and 2.x directory structures are installed and run on top of the Cygwin installation. Cygwin's environment variables have to be configured with all the TinyOS subdirectories included in order for the tools to function. The Cygwin interface, used here to determine the program memory of the applications, is shown in figure 5.

Figure 5: The Cygwin Interface of nesC application

4.2 nesC

nesC (network embedded systems C) is an event-driven programming language for WSNs derived from the C programming language. It is designed to embody the structuring concepts and execution model of TinyOS and to build applications [3] [4] for this platform.

A nesC application is composed of one or more components that are interconnected to form an executable. A component provides and uses interfaces: the provided interfaces represent the functionality that the component offers to its users, while the used interfaces represent the functionality the component requires to do its job. These interfaces are the only access points to the component and they are bi-directional; data flows both ways between components linked through the same interface. An interface specifies a set of functions called commands, implemented by the interface's provider, and another set of functions called events, implemented by the interface's user. To call the commands of an interface, a component must implement the events of that interface.


This permits a single interface to represent a complex interaction between components (e.g., the registration of interest in an event, followed by a callback when that event occurs). This is important because all long-running commands in TinyOS (e.g. sending a packet) are non-blocking; their completion is signaled through an event (sendDone). Because of the interface specification, a component cannot call the send command unless it provides an implementation of the sendDone event. A single component may use or provide many interfaces, as well as several instances of the same interface. In general, commands call downwards, i.e. from application components towards those closer to the hardware, while events call upwards. At the lowest level, events are bound to hardware interrupts.

In nesC there are two types of components, known as modules and configurations. Modules provide application code that implements one or more interfaces. Configurations assemble components by connecting the interfaces used by some components to the interfaces provided by others; this process is known as wiring. Every nesC application is described by a top-level configuration that wires its components together.

The concurrency model of nesC consists of run-to-completion tasks and hardware event handlers. Tasks are functions whose execution is deferred; once scheduled they run to completion and do not preempt each other. Hardware event handlers are executed in response to hardware interrupts and also run to completion, but they may preempt the execution of a task or of another hardware event handler. Commands and events that are executed as part of a hardware event handler must be declared with the async keyword. nesC programs are liable to certain race conditions because tasks and hardware event handlers may be preempted by other asynchronous code. Races are avoided either by accessing shared data exclusively within tasks, or by placing all accesses within atomic statements. At compile time the nesC compiler reports potential data races to the programmer. The compiler may report a false positive; in this case a variable can be declared with the norace keyword, which must be used with extreme caution.
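The following minimal sketch (with invented names such as Report, ReporterM and AppM; this is not the thesis application) illustrates how these pieces fit together: an interface with a command and an event, a module that provides it, a module that uses it, and a top-level configuration that wires them:

interface Report {                        // Report.nc
  command result_t send(uint16_t value);  // implemented by the provider
  event result_t sendDone();              // implemented by the user
}

module ReporterM {                        // ReporterM.nc
  provides interface Report;
}
implementation {
  command result_t Report.send(uint16_t value) {
    // ... hand the value to a lower layer here ...
    signal Report.sendDone();             // completion is reported back as an event
    return SUCCESS;
  }
}

module AppM {                             // AppM.nc
  uses interface Report;
}
implementation {
  event result_t Report.sendDone() {      // a user of Report must handle its events
    return SUCCESS;
  }
}

configuration App { }                     // App.nc, the top-level configuration
implementation {
  components AppM, ReporterM;
  AppM.Report -> ReporterM.Report;        // wiring: the user's interface to the provider's
}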

4.3 TinyDB

TinyDB is a query-processing system for extracting information from a network of TinyOS sensors. TinyDB frees you from the burden of writing embedded [16] nesC code for the sensor devices and presents a simple SQL-like interface through which you specify the data you want to extract, together with additional parameters such as the rate at which the data should be refreshed, much as you would pose queries against a traditional database. Given a query specifying your data of interest, TinyDB collects that data from motes in the surrounding environment, filters it, aggregates it, and routes it to a PC where the requesting application is hosted. TinyDB performs this task via power-efficient in-network processing algorithms.

To use TinyDB, you install its TinyOS components onto each node in your sensor network. TinyDB provides a simple Java API for writing PC applications that query and extract data from the network; it also comes with a simple graphical query builder and result display that use this API. The major goal of TinyDB is twofold: to make your life as a programmer considerably easier, and to permit data-driven applications to be developed and deployed much more quickly than is currently possible. Some of the main features of TinyDB include the following (a sample query is sketched after this list):


• Metadata Management: TinyDB provides a metadata catalogue describing the attributes and commands that are available for querying and invocation in the sensor network. Commands and attributes are created through the TinySchema components in TinyOS. Commands can range from parameter tuning to physical actuation; attributes can be sensor readings or internal hardware or software parameters.

• High-Level Queries: TinyDB uses a declarative query language through which you describe only the data you need, without specifying how to obtain it. This makes applications easier to write and helps ensure that they continue to run efficiently as the sensor network changes.

• Network Topology: TinyDB manages the underlying radio network by tracking neighbors, maintaining routing tables and ensuring that each mote in the network can efficiently and (relatively) reliably deliver its data to the user.

• Multiple Queries: TinyDB permits multiple queries to run on the same set of motes at the same time. Queries can have different sample rates and access different sensor types, and TinyDB shares work between queries when feasible.

• Incremental Deployment via Query Sharing: to extend a TinyDB sensor network, you simply download the standard TinyDB code to the new motes and TinyDB does the rest. Motes share queries with each other: when a mote hears a network message for a query that it is not yet running, it automatically asks the sender for a copy of the query and begins running it. No programming or configuration of the new motes is needed apart from installing TinyDB.
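A concrete query gives the flavour of this declarative interface. A request for light readings from every node roughly once per second could look as follows (a sketch; the attribute names actually available depend on the catalogue for the installed sensor boards):

SELECT nodeid, light
FROM sensors
SAMPLE PERIOD 1024

The SAMPLE PERIOD value is given in milliseconds and defines the epoch at which every node wakes up, samples its sensors and reports the result.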


5 Description of Implementation

This chapter focuses on the implementation part of the thesis and is divided into three sections. The first section deals with the designed setup protocol; the second describes the experiments under a practical scenario using TinyDB; and the third illustrates the differences between nesC and TinyDB as programming models.

5.1 Designed application using nesC

The first phase gives an overview of the system [1] [2] architecture and the designed setup protocol, using the node-level language nesC, and its implementation on Mica2 motes. With the developed protocol, the WSN is controlled remotely without human intervention at the nodes, and it is adaptive to the mission. The environment is monitored through event detection using a built-in sensor, such as light or temperature, and by assigning appropriate threshold values according to the mission requirements. The setup protocol closely observes the monitored environment through remotely controlled measurement and event-detection processes, which fulfils the requirements for the system to act as a surveillance system. In short, the system performs as a remote surveillance system.

5.1.1 System Architecture

The system architecture shown below gives the overall picture of the hardware and software and their interaction in the form of a block diagram. The blocks with solid lines depict hardware, while the blocks with dotted lines illustrate software; solid arrows represent wired links, while dotted arrows show wireless links. In the block diagram, two scenarios are combined to run and check the application (MyApp.nc). In the first scenario, one sensor node running MyApp.nc with a sensor board is attached to a programming board which is connected to the PC through a serial interface; there is no base station in this scenario. In the second scenario, one sensor node running TosBase.nc is connected to the programming board and acts as a base station, while the other sensor nodes running MyApp.nc with sensor boards are deployed in the monitored environment and connect wirelessly to the base station. In both scenarios we can visualize the sensor readings on the PC through built-in Java applications.


Figure 6: System architecture of designed application

5.1.2 Environmental monitoring

We used TinyOS 1.x and the nesC language to develop the application. In the first step, an application named 'MyApp.nc' was built that reads data (one reading at a time) from the photo sensor continuously, with a fixed sampling interval, and sends this data in packet form either towards the serial port (over the UART) or towards the base station (over the RF link), depending on the test scenario. The application has two components: the configuration component MyApp.nc and the module component MyAppM.nc. For displaying the data coming from the serial port on a PC, the built-in Java tool 'Listen', located at opt/tinyos-1.x/tools/java/, is used. The Listen program simply prints the raw data of each packet received from the serial port in the Cygwin window in little-endian format. The application was run and checked on sensor nodes in both scenarios:

[Figure 6 block labels: parameters passed from the PC (start, stop, light/temp, threshold, sampling interval, group id, node id); sensor nodes running MyApp.nc over the UART or BCAST ADDR; base station running TosBase.nc; graphical user interface in Cygwin on the PC.]


• Over the UART (using a single mote)

• Over the RF (using two motes with one acting as a base station and the other as a network node)

The application's module component code is almost the same in both scenarios, with only a one-line difference. When testing the application using the first set-up, the code line is:

if (call DataMsg.send(TOS_UART_ADDR, sizeof(struct OscopeMsg), &msg[currentMsg]))

While testing the application using the second set-up, the code line is:

if (call DataMsg.send(TOS_BCAST_ADDR, sizeof(struct OscopeMsg), &msg[currentMsg]))

The first parameter of this command decides where the message packet should be sent. In the first case, the parameter is TOS_UART_ADDR, which tells the communication component to send the packet through the UART, while in the second case it is TOS_BCAST_ADDR, which broadcasts the packet over the radio. For the rest of this phase the second set-up is used wherever it is needed. For more details see appendix A.

5.1.3 Event detection

The next step is to make the application detect an event, which means that data is needed only when an event occurs in the surrounding environment. Light values are still continuously measured in the monitoring area, but in the event-detection case data is reported towards the base station only when the light level goes above or below (depending on the requirement) a fixed set value called the threshold. To achieve this, an 'if' statement is used after the data value is obtained, as listed below:

1) async event result_t ADC.dataReady(uint16_t data) {
2)   if (data > 0x0258) {

       // stuffs the sensor reading into the packet and then sends off the packet
     }
   }

Line 1 corresponds to the event of the ADC interface that is signaled when data is ready after the light sensor has been sampled. Line 2 is an 'if' statement checking whether the data is above or below (by using a greater-than or less-than sign) a given threshold value, which can be assigned according to the requirements; after a positive decision the data is sent towards the base station in packet form, otherwise it is not. Testing with different threshold values showed that our application can detect the light event.


As an example, in line 2 the 'if' statement has a fixed threshold value of 600 (decimal), i.e. 0x0258 (hexadecimal), with a greater-than sign.
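The elided "stuffs the sensor reading into the packet" step can be pictured roughly as follows. This is a sketch only: the field names follow the standard TinyOS 1.x OscopeMsg structure and the buffer variables follow the Oscilloscope example, so they may differ from the actual MyAppM.nc code:

async event result_t ADC.dataReady(uint16_t data) {
  struct OscopeMsg *pack;
  if (data > 0x0258) {                              // the fixed threshold from line 2 above
    atomic {
      pack = (struct OscopeMsg *) msg[currentMsg].data;
      pack->sourceMoteID = TOS_LOCAL_ADDRESS;       // this node's id
      pack->channel = 1;                            // e.g. the photo channel
      pack->lastSampleNumber = readingNumber++;     // running sample counter
      pack->data[0] = data;                         // the sensor reading itself
    }
    if (call DataMsg.send(TOS_BCAST_ADDR, sizeof(struct OscopeMsg),
                          &msg[currentMsg]))
      atomic currentMsg ^= 0x1;                     // switch to the other send buffer
  }
  return SUCCESS;
}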

5.1.4 Remotely Controllable

To control the WSN remotely there are two parameters in the 'MyApp' component: the threshold value and the sampling interval. Up to this point these two parameters had to be set manually in the application every time they needed to change, and after each change the motes had to be reprogrammed and redeployed in the monitoring environment. It was also not possible to start and stop the application programmed into the motes from the PC; these tasks had to be performed manually. The next step is therefore to control the developed application remotely, so that it can be started and stopped and the two parameters can be passed from the PC. To start the application and pass the measurement setup parameters (threshold and sampling interval) remotely, the 'Sensing' interface is used. Its coding structure is given below:

1 interface Sensing {
2   command result_t start(int nsamples, int interval_ms);
3   event result_t done(); }

This interface can be used by a component that senses data at a certain sampling interval; it provides the start() command (line 2) to initiate a series of sensor readings and signals the done() event (line 3) when sensing has completed. The start() command accepts two parameters for initialization: nsamples, the number of samples to gather, and interval_ms, the interval (in ms) at which to gather samples. We embedded the 'Sensing' interface in the 'MyApp' application and used the first parameter of the start() command as the threshold value, while the second remains the sampling interval. We used the built-in Java application 'BcastInject' (tools/java/net/tinyos/tools/BcastInject) to inject a command packet for the 'Sensing' interface, with appropriate arguments, into the sensor network from the PC. The sensor nodes are programmed and deployed for testing after embedding the Sensing interface. However, even after being turned on, the nodes do not start sensing until BcastInject is run to inject a command packet with arguments in the Cygwin window as:

java net.tinyos.tools.BcastInject start_sensing <threshold> <interval>

where threshold is the value for event detection (say, 500 or 600) and interval is the time in milliseconds between samples (say, 125). For example:


java net.tinyos.tools.BcastInject start_sensing 600 125

This command carries three controlling instructions: start the nodes (start_sensing), the threshold value (600) and the sampling interval (125 ms). The nodes then start sensing and send data towards the base station when the threshold is crossed, and the data is displayed in the Cygwin window by running the 'Listen' tool. Now the application can be started remotely, but it cannot yet be stopped when we need to suspend it for some period of time according to the monitoring requirements. For this purpose we defined a stop argument in BcastInject. After this, the nodes can be stopped by running the following command in the Cygwin window:

java net.tinyos.tools.BcastInject stop

The motes then stop sensing, because this command activates the timer stop command in the 'MyApp' module component. The command handling code, after embedding the Sensing interface and the stop command, is outlined below:

1 switch (cmd->action) {
2   case START_SENSING:
3     call Sensing.start(cmd->args.ss_args.nsamples,
                         cmd->args.ss_args.interval);
4     break;
5   case STOP:
6     call Timer.stop();
7     signal Sensing.done();
8     break;
  }

Two types of command packets are injected towards the motes. The 'MyApp' application waits for a command, interprets it and performs the corresponding action through a switch statement (line 1). If the command packet START_SENSING is sent with its two arguments, lines 2 to 4 are activated; they invoke the start command of the Sensing interface, passing the two parameters (threshold value and sampling interval) as stated below:

command result_t Sensing.start(int samples, int interval_ms) {
  thresh = samples;
  call Timer.start(TIMER_REPEAT, interval_ms);
  return SUCCESS;
}

Executing this command starts the timer with the given sampling period (interval_ms) in order to generate periodic events. Each time the timer fires, the event stated below is signalled:

event result_t Timer.fired() {
  return call ADC.getData();
}


When this timer event fires, ADC.getData() is invoked to acquire a light sensor reading, which in turn signals the ADC.dataReady() event once the data is ready, as stated below:

async event result_t ADC.dataReady(uint16_t data) {
  if (data > threshold) {
    // stuff the sensor reading into the packet and send the packet off
  }
}

Here the data value is compared with the threshold value (threshold) that was passed from the PC in the start-sensing command; on a positive decision the reading is sent off in packet format. When the command packet STOP is sent instead, Line 5 of the switch statement is activated, which invokes the timer stop command; this stops the timer and prevents it from firing again.

5.1.5 Adaptiveness to Mission

The next and final step is to embed all the available sensors in the application to make it mission adaptive. So far the application can perform only one task, using the light sensor; it must become flexible enough that any other sensor can be selected at run time through remote control, according to the task requirement. We implemented only the temperature sensor in our application, but other sensors can be added in the same manner. To achieve this, two instances of the provided ‘Sensing’ interface are used in the developed application: one for the photo sensor and one for the temperature sensor. Corresponding changes are made in the BcastInject Java tool for both sensors. After embedding the temperature sensor, the sensor nodes are programmed and deployed in the monitoring environment, but no sensor node starts sensing until a suitable command packet is sent towards the nodes. Suppose someone wants to acquire light readings greater than some threshold: a command packet, acting as an instruction, is passed to the sensor nodes to start the application for this task using the light sensor, with the given threshold value and time interval respectively, as stated below:

java net.tinyos.tools.BcastInject photo_sensing 500 500

The nodes start sensing with the light sensor, and values are sent towards the base station whenever an event occurs, i.e., when the light value goes above the passed threshold value. The values are displayed in the Cygwin window using the ‘Listen’ tool, as shown below:


Figure 7: Packet description including light reading

The description of a packet coming from a node, carrying a light reading, is given below:

Destination Address | Handler ID | Group ID | Message Length | Source Address | Counter | Epoch | Readings
FF FF               | 01         | 7D       | 08             | 01 00          | 01 00   | 36 00 | 74 03

Table 1: Packet details [1]

Data received from the sensor node is in little-endian format; here the two data bytes 74 03 represent a sensor reading with most significant byte 0x03 and least significant byte 0x74, i.e., 0x0374 or 884 decimal, which is greater than the threshold value of 500. This confirms that a node sends data only when the reading is above the threshold. Now suppose someone needs temperature readings instead. To perform this task, the command stated below changes the nodes’ mission from light to temperature at run time:

java net.tinyos.tools.BcastInject temp_sensing 500 500

The values are displayed in the Cygwin window using the ‘Listen’ tool, as pictured below:


Figure 8: Packet description including Temp reading

In the first row the data reading is 0x01F9, or 505 decimal, which again shows that data is sent only when it is above the threshold. Whenever the WSN has to be stopped, this is done by injecting a command packet with the instruction given below:

java net.tinyos.tools.BcastInject stop

After this command the nodes stop sensing, which can be verified with the ‘Listen’ tool: no further data is received.
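To make the byte-order handling described above concrete, the following is a minimal Java sketch (not part of the thesis tool chain; the class name ReadingDecoder is illustrative) of how the two little-endian data bytes printed by the Listen tool can be converted into a decimal reading:

public class ReadingDecoder {
    /**
     * Combines two bytes received in little-endian order
     * (least significant byte first) into an unsigned 16-bit value.
     */
    public static int decode(int lsb, int msb) {
        return ((msb & 0xFF) << 8) | (lsb & 0xFF);
    }

    public static void main(String[] args) {
        // The data bytes "74 03" from Table 1: LSB = 0x74, MSB = 0x03.
        int reading = decode(0x74, 0x03);
        System.out.println(reading); // prints 884, which is above the threshold of 500
    }
}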

5.1.6 Summarization of Designed Set-up Protocol

In the designed setup protocol, only two sensors (light and temperature) are embedded at this stage, so a task can be performed with either of these two sensors; following the same approach, other simple sensors can also be added. The parameters that can be passed wirelessly from the PC are the threshold value and the sampling period. To start the network on a task, the required sensor and appropriate parameters have to be selected. So far there are three command packets that can be passed in order to control the network remotely:

1) photo_sensing <Threshold> <Sampling Interval>
2) temp_sensing <Threshold> <Sampling Interval>
3) stop


The first two command packets start the network; they are injected after selecting the sensor, threshold value and sampling interval. The third command packet is injected to stop the network. All of these command packets are passed towards the network through the Cygwin window. For the evaluation of the setup protocol, three motes are programmed and deployed in the monitoring area as shown in figure 9. Suppose someone wants to perform a task using the light sensor with threshold value ‘500’ and sampling interval ‘500’. After passing the command packet (photo_sensing 500 500), the nodes start sampling the light values, and data is obtained on the PC only when the threshold condition near a node becomes true. If someone then wishes to execute a task using temperature, with the same or a different threshold value and sampling interval, the command packet (temp_sensing 500 300) is passed with the required parameters at run time; the network then starts sensing with the temperature sensor and the new parameter values. When the network has to be stopped, the command packet (stop) is passed and, in response, the nodes stop sensing.

Figure 9: Experimental Setup for Designed Protocol
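As a usage illustration, the evaluation sequence above could be scripted from the PC side roughly as follows. This is only a sketch and not part of the thesis tool chain: it simply shells out to the BcastInject tool described earlier, it assumes the TinyOS Java tools are on the classpath and that SerialForwarder is running while commands are injected, and the class name SetupProtocolDemo is illustrative.

import java.io.IOException;

public class SetupProtocolDemo {
    /** Runs one BcastInject invocation, e.g. sendCommand("photo_sensing", "500", "500"). */
    static void sendCommand(String... args) throws IOException, InterruptedException {
        String[] base = {"java", "net.tinyos.tools.BcastInject"};
        String[] cmd = new String[base.length + args.length];
        System.arraycopy(base, 0, cmd, 0, base.length);
        System.arraycopy(args, 0, cmd, base.length, args.length);
        // SerialForwarder must be running while the command packet is injected (see Appendix A).
        new ProcessBuilder(cmd).inheritIO().start().waitFor();
    }

    public static void main(String[] args) throws Exception {
        sendCommand("photo_sensing", "500", "500");  // start light sensing: threshold 500, interval 500 ms
        Thread.sleep(60_000);                        // let the light mission run for a while
        sendCommand("temp_sensing", "500", "300");   // switch the mission to temperature at run time
        Thread.sleep(60_000);
        sendCommand("stop");                         // stop the network
    }
}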

5.1.7 Experimental Results

This section presents the results of the application developed in nesC and described in sections 5.1.1 to 5.1.6; the results are shown graphically using Matlab. The nodes are first programmed and run for light sensor sampling by passing the threshold value ‘500’ and sampling interval ‘500’, which produced the values shown in figure 7. The logical operator used in the programmed application is the greater-than (>) sign. The graph of light level versus epoch is plotted in the Matlab figure below.


Figure 10: Light level vs. Epoch

The graph above confirms the event detection: all obtained values (squares) are above the threshold value. The epoch is the period of time between the start of each sample period, so the event-detection time relative to the application start can be determined by multiplying the epoch number by the sampling period (a short worked sketch is given at the end of this section). The nodes are then run for temperature sensor sampling by passing the threshold value ‘500’ and sampling interval ‘300’, which produced the values shown in figure 8. The plot of temperature value versus epoch is given below:

Figure 11: Temperature vs. Epoch


Figure 11 also confirms the event detection: all values (squares) are above the given threshold. The temperature variation at the sensor node was produced artificially. The figure shows that no value is obtained before epoch 75, meaning the temperature is below the threshold up to that point; the values at and after epoch 75 increase gradually because the temperature is raised artificially.
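As a small worked illustration of the timing relation mentioned for Figure 10, the event time relative to the application start can be estimated by multiplying the epoch number by the sampling interval. The helper below is only a sketch and not part of the thesis code; the class name EventTime is illustrative, and the 300 ms interval and epoch 75 are the values from the temperature run above.

public class EventTime {
    /** Approximate time of an event (in seconds) relative to the application start. */
    public static double eventTimeSeconds(int epoch, int samplingIntervalMs) {
        return (epoch * samplingIntervalMs) / 1000.0;
    }

    public static void main(String[] args) {
        // Temperature run: sampling interval 300 ms, first detection at epoch 75 (Figure 11).
        System.out.println(eventTimeSeconds(75, 300)); // prints 22.5 (seconds after start)
    }
}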


5.2 Experiments using TinyDB

This phase demonstrates the use of TinyDB in a remote surveillance system. The scenario is to count the total number of persons present at any time in a given room and to view this information remotely on a PC. The main complication is the use of the ordinary built-in sensors (light, sounder, etc.) on the MICA2 motes. The number of room doors is not essential: if the approach works for one door, it can be extended to cover the other doors as well. We carried out experimental work on this scenario and succeeded to some extent; following the same lines of thought, a complete, deployable application could be developed. The system architecture for the experimental work is given below.

Figure 12: System architecture using TinyDB

The first step in this surveillance scenario is to detect a person entering or exiting through the door. Normally passive infrared (PIR) sensors are used to detect persons; that is the easy way and has already been done. Using a mote equipped only with its built-in sensors is, on the other hand, more complicated. To achieve this task, a photo sensor and a light source shining towards the mote are used: a sensor node is placed on one side of the door while the light source is on the other side, as shown in figure 13 below.


In this setup, a mote programmed with TinyDB continuously samples the light level, via a query with a given sampling interval, and the measured light level drops when a person enters the room. The drop occurs during the instant when the person is between the mote and the light source. If the lower threshold light value and a proper sampling interval are known, a person entering the room can be detected; if the threshold value is not chosen properly, the person will pass undetected, and the same happens if the sampling period is too long or too short. The next step is therefore to find these two parameters.

To find these parameters, one mote is connected to the programming board, which in turn is connected to the PC through the UART. The mote is programmed with ‘TinyDBApp’ (opt/tinyos-1.x/apps) and the TinyDBMain Java application (opt/tinyos-1.x/tools/java) is run in a Cygwin window as:

java net.tinyos.tinydb.TinyDBMain

This provides a graphical interface for distributing queries over the motes and collecting data from them. After running this application a query window is opened and a query is posed as:

SELECT nodeid, light FROM sensors SAMPLE PERIOD 2048ms

Figure 13: Experimental Scenario using TinyDB


This query continuously collects the node id and the light reading at the specified sample period, and the results can be visualized in the query window (for more details, see Appendix B). To find the threshold value, the light level is noted while a person stands between the mote and the light source; this value, 800, is taken as the threshold and set in the query:

SELECT nodeid, light FROM sensors WHERE light < 800 SAMPLE PERIOD 2048ms

After running this query we tested the threshold value by repeatedly passing a person between the mote and the light source. The light value and node id are obtained only when the threshold condition occurs. However, the value was frequently missed because the sampling period was still too long. To find an appropriate sampling period, the same experiment was repeated with different sampling periods, and the final sampling period was noted: 1024 ms. With the finalized threshold value and sampling interval, the test setup is assembled and the query is injected to detect persons; the corresponding results are shown in figure 14 as data and as a graph. The figure depicts three attributes, the epoch, the node id and the light level, and shows that the first person is detected at epoch 8, the second at 18 and the third at 34. All of these light values correspond to a single person passing between the node and the light source.

Figure 14: Single Person detection in room

There is, however, another problem: depending on the door width, two persons can easily enter side by side, in which case only one value is obtained and it would be counted as one person. To address this, the same experimental scenario is set up and the query is run without a threshold, and the value is checked while two persons are between the mote and the light source; this value turns out to be roughly half of the light value obtained with one person between the mote and the light source. The query is then injected again, under the same test scenario, with the proper threshold value (800) and sampling interval (1024 ms) for the detection of two persons, and the results obtained are pictured below:

Figure 15: Persons Detection in room

In figure 15, each row shows the detection of two persons. This is made clear by the fact that the obtained light values are roughly half of the values obtained when a single person is detected. From these results it is concluded that persons can be detected with the query above, and that if a Java application corresponding to this query is developed, the decision rule can be: a light value within about 15% below the first threshold of ‘800’ (the upper window) means that one person enters, while a light value within about 15% above half of the threshold (the lower window) means that two persons enter. This could also be achieved by suitable amendments to the TinyDBMain Java application. The upper and lower windows would have to be adjusted to the deployment area, accounting for noise and other communication impairments. Finally, the remaining question is how to decide, once a person is detected, whether that person enters or exits, so that the count can be incremented or decremented accordingly. Our idea for resolving this is to mount a second pair of mote and light source on the door next to the first one. In this setup, if node id ‘1’ is passed first at the entrance and node id ‘2’ after it, then obtaining a light value first from node 1 and then from node 2 means that a person enters, while the reverse order means that a person exits. A small sketch of this decision logic is given below.
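The following is a minimal Java sketch of the decision logic just described; it is an assumption-laden illustration rather than the actual TinyDBMain amendment. The class name RoomCounter is hypothetical, the threshold of 800, the roughly 15% windows and the node ids 1 and 2 are taken from the discussion above, and the handling of readings that fall between the two windows is an arbitrary choice.

public class RoomCounter {
    private static final int THRESHOLD = 800;      // light threshold used in the TinyDB query
    private static final double WINDOW = 0.15;     // approximately 15% tolerance windows from section 5.2

    private int personsInRoom = 0;
    private int lastNodeSeen = -1;                 // node id of the previous detection

    /** Classifies a below-threshold light reading as one or two persons blocking the light. */
    public static int personsFromReading(int light) {
        if (light >= THRESHOLD * (1 - WINDOW)) {
            return 1;                              // upper window, roughly 680..800: one person
        } else if (light <= (THRESHOLD / 2) * (1 + WINDOW)) {
            return 2;                              // lower window, up to roughly 460: two persons
        }
        return 1;                                  // readings in between are ambiguous; assume one person
    }

    /**
     * Handles one detection (nodeid, light) coming from the query results.
     * Node 1 followed by node 2 is treated as entering, node 2 followed by node 1 as exiting.
     */
    public void onDetection(int nodeId, int light) {
        if (lastNodeSeen == -1) {
            lastNodeSeen = nodeId;                 // first beam crossed; wait for the second one
            return;
        }
        int persons = personsFromReading(light);
        if (lastNodeSeen == 1 && nodeId == 2) {
            personsInRoom += persons;              // entered the room
        } else if (lastNodeSeen == 2 && nodeId == 1) {
            personsInRoom -= persons;              // left the room
        }
        lastNodeSeen = -1;                         // crossing completed; reset for the next one
    }

    public int getPersonsInRoom() {
        return personsInRoom;
    }

    public static void main(String[] args) {
        RoomCounter counter = new RoomCounter();
        counter.onDetection(1, 750);               // beam broken at node 1 first
        counter.onDetection(2, 760);               // then at node 2: one person entered
        System.out.println(counter.getPersonsInRoom());  // prints 1
    }
}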

5.2.1 Uncertainty Problems

The experimental work carried out on this remote surveillance system is only at an initial stage, and more work is needed to complete it. There are some uncertainty problems that need to be handled when continuing the work.

• Suppose a person is about to enter or exit the room but suddenly changes his or her mind and turns back after having already crossed one mote. On returning, he or she crosses the same mote again, so two values from that single mote are obtained, which should result in neither an increment nor a decrement.

• Another uncertainty arises when a person enters or exits slowly or quickly relative to the sampling interval. With slow passing there is a chance of obtaining more than one value per mote crossing, while with fast passing there is a chance that no value is obtained at all.

• There is another interesting but real issue: if a person of large build passes through the door, the light level can drop as low as if two persons had passed, leading to an increment or decrement of two instead of one.

Parameters such as the sampling interval and the threshold value have to be determined for the specific deployment area, and the system will face more problems in areas where persons enter and exit in an irregular pattern.


5.3 Comparison between nesC and TinyDB

Figure 16: Programming Models for Sensor Networks[17]

Low-level programming models take a platform-centric view [17], focused on abstracting the hardware and permitting flexible control of the nodes. In this class, TinyOS with nesC is one of the earliest examples and has become the de facto standard software platform for sensor network programming. An attractive approach in this class is to run a virtual machine that offers an execution environment for scripts that are much smaller than the binary code for TinyOS; it is therefore suitable when the code on each node is to be dynamically reprogrammed after deployment over the wireless channel. Virtual machines are widely used in high-end servers as well as consumer PCs for purposes such as platform independence and isolation; in sensor network programming the focus is on reprogrammability, i.e., the ability to insert new code into each node dynamically after deployment. ASVM and Mate are stack-oriented virtual machines implemented on top of TinyOS.

High-level programming models take an application-centric view and address how easily application logic can be programmed, while providing flexibility in optimizing the system’s performance. More specifically, this class focuses on facilitating collaboration among sensors, which characterizes a major group of sensor network applications and is also one of the complicated challenges of sensor network programming. High-level programming models are further divided into two categories: group level and network level. Network-level models provide the sensor database approach, which allows the user to query sensor data with declarative SQL-like languages; TinyDB and Cougar, among others, fall under this category.

nesC is a language derived from C for networked embedded systems, featuring an event-driven execution engine, a flexible concurrency model and component-oriented applications. It is a real-time language closely linked to the underlying hardware. The benefits of nesC are exploited in TinyOS, such as concurrency, precise real-time performance and flexibility [18], allowing every parameter to be tuned for special application needs such as energy efficiency and power-saving control. Encapsulation by modules provides a unified interface that frees programmers from having to know whether some functionality is implemented in hardware or software. On the other hand, as the level of abstraction is very low, it is often difficult to implement even simple programs; moreover, the strictly event-based style and the exclusion of blocking operations are frequent sources of complexity. nesC can manage radio transmission, memory allocation, process scheduling, sensor input/output, etc., and is well suited to different sensor platforms such as Mica2, Telos and Rene2.

TinyDB’s major strength is its flexibility: it supports any type of sensor board and sensors, provided that the requisite TinyOS modules can be built. The latest versions of TinyDB support a different routing protocol, which may improve its ability to support multi-hop networking. TinyDB would be an adequate tool for agriculture-sensing experiments where programming is not an absolute limitation; nesC would be the better option where sensor networks for complex problems are to be deployed. In challenging environments where RF transmission is poor, there appear to be limits on the use of TinyDB, and the system also requires careful attention to ensure that it is not overloaded by data. An important issue is that the database paradigm used by TinyDB is not well suited to multiple different sensors on multiple motes with different sampling intervals. TinyDB has one major limitation, namely that queries involving complex data communication patterns are not allowed, because the data streams flow towards the sink node along the edges of the routing tree.

TinyDB has a polling-based mechanism for data acquisition and also supports events as a method for initiating data collection. In TinyDB [16], events are generated either by a lower-level part of the operating system (interrupt lines being raised on the processor) or by another query (sensor readings going above or below some threshold). An event-based query triggered by a second query that polls for some condition (a threshold value) to be true is known as a polling-driven event-based query; this is the type of query posed in the experimental work in section 5.2. With this type of query there is a continual polling mechanism between the base station and the sensor nodes to verify whether the sensor reading condition is true, and data is obtained on the PC only when the threshold on the sensor value is exceeded. In the case of nesC there is no continual polling mechanism for event-based applications; instead, a programmed application runs on the nodes and samples the sensor data continuously to check whether the condition or threshold value is true before sending the data, as explained in section 5.1.3.


6 Conclusions and Future Work

6.1 Conclusions

The intention of this thesis is to develop remote surveillance and measurement application experiments using the TinyOS/nesC and TinyDB platforms, and to make a comparative study between them. In the first phase of the thesis a surveillance application was built using TinyOS/nesC that measures and detects events in the environment (light, temperature, etc.). This was done using the Mica2 mote sensors, and a setup protocol was designed to make the WSN remotely controllable in order to achieve the functionalities event detection (section 5.1.3) and adaptiveness to the mission (section 5.1.5). In the second phase of the thesis, experimental work was carried out on developing a remote surveillance system using TinyDB that detects (using the light sensor) and counts the total number of persons present in a room (section 5.2). The same experiment could also be done with our developed application by passing the command packet ‘photo_sensing’ with the required arguments (sampling interval and threshold value) and a less-than (<) logical operator.

Comparing our application-based setup protocol (using the nesC/TinyOS platform) with TinyDB with regard to event detection, we see that TinyDB supports events as a method for initiating data collection, but only to a limited extent. Its current implementation does not provide a fully distributed event propagation system: events are only signalled on the local node, and queries started in response to a local event may be disseminated to other nodes. With the nesC/TinyOS platform, application-specific measurements can be made by applying more than one threshold value (level) and logical operator, and variable data packet lengths, carrying one or several values depending on the network status, can also be achieved. Another aspect is that a distributed actuating mechanism can be attached to every sensor node, or group of sensor nodes, according to the application requirements.

The capability of a WSN to perform more than one mission, and to be configured wirelessly for any task in order to adapt the network to that task, is known as mission adaptiveness. Through TinyDB, different missions can be performed using the built-in sensors on the motes, or by embedding external sensors, and injecting queries wirelessly towards the network as needed. Comparing TinyDB with our setup protocol with respect to mission adaptiveness, it is clear that the same tasks can also be performed through TinyDB, but missions in which nodes communicate with one another to reach a collective decision are not possible with TinyDB.

Comparing our developed application with TinyDB with regard to power utilization, we see that TinyDB uses a polling-based mechanism to measure and detect environmental events, which is an energy-consuming method. This factor becomes more serious for long-running applications, in which at each sampling period the query is forwarded from the base station towards the network and the data is obtained in response to that query. With the nesC/TinyOS platform, once the program is installed on each node, sampling occurs at each node and data is obtained according to the parameter settings and mission configuration.

6.2 Future Work

• To apply this application by deploying sensor nodes for a useful mission in a real agricultural environment, as a demonstration.

• With the new release of TinyOS 2.0, the same approach used in this thesis can be applied to rebuild the application and then implement it.

• To make a graphical user interface for the front end.

• To expand the operating region, this application can be extended to operate through multi-hop communication. The command packets would then be passed to sensor nodes outside the direct communication area via a sink node within the communication area, and in turn the data from these nodes would reach the base station through the sink node by multi-hop communication.


7 References

[1] TinyOS Tutorials, www.tinyos.net/tinyos-1.x/doc/tutorial [accessed 18 April 2007]
[2] Crossbow Technology Inc., http://www.xbow.com/ [accessed 18 April 2007]
[3] D. Gay, P. Levis, R. von Behren, et al., “The nesC Language: A Holistic Approach to Networked Embedded Systems,” in Proceedings of the ACM SIGPLAN Conference on Programming Language Design and Implementation (PLDI ’03), 2003.
[4] Philip Levis, “TinyOS Programming,” June 28, 2006, http://www.csl.stanford.edu/~pal/pubs/tinyos-programming.pdf [accessed 18 April 2007]
[5] Philip Levis and David Culler, “TinyOS: An Operating System for Sensor Networks,” http://www.sensorsmag.com/sensors/article/articleDetail.jsp?id=324975 [accessed 18 April 2007]
[6] Tatiana Bokareva, Wen Hu, et al., “Wireless Sensor Networks for Battlefield Surveillance,” The University of New South Wales, Sydney, Australia, October 2006, pp. 1-8.
[7] Hans Weger, Dave Roelant, Rodrigue Ade, “Remote Surveillance of Facilities Awaiting D&D,” Florida International University, USA, January 2001, pp. 1-27.
[8] G.L. Foresti, B. Pani, “A Multisensor System for Remote Surveillance of a Motorway Overpass,” University of Udine, Italy, pp. 1-8.
[9] I.F. Akyildiz, W. Su, Y. Sankarasubramaniam, and E. Cayirci, “Wireless Sensor Networks: A Survey,” Computer Networks Journal, Vol. 38, No. 4, pp. 393-422, March 2002.
[10] Lucas Wanner, “The EPOS System Supporting Wireless Sensor Networks Applications,” BSc thesis, Federal University of Santa Catarina, February 2002.
[11] Muhammad Ilyas and Imad Mahgoub, Handbook of Sensor Networks: Compact Wireless and Wired Sensing Systems. Boca Raton, FL, USA: CRC Press Inc., 2004.
[12] E. H. Callaway Jr., Wireless Sensor Networks: Architectures and Protocols, 1st ed. New York: CRC Press LLC, 2004, pp. 1-20.
[13] Christer Englund, Henrik Wallin, “RFID in Wireless Sensor Network,” Master’s thesis, Chalmers University of Technology, Sweden, April 2004.


[14] Mark Weiss, “TinyOS: An Operating System for Wireless Sensor Networks,” Department of Computer Science and Engineering, University of Nebraska-Lincoln.
[15] P. Levis, N. Lee, “TOSSIM: A Simulator for TinyOS Networks,” Computer Science Division, University of California, Berkeley, California, 17 September 2003.
[16] Sam Madden, Joe Hellerstein, “TinyDB: In-Network Query Processing in TinyOS,” September 2003, pp. 1-46.
[17] Ryo Sugihara, Rajesh K. Gupta, “Programming Models for Sensor Networks: A Survey,” University of California, San Diego, June 2007, pp. 1-27.
[18] Liangjie He, Cheng Li, “Implementation and Emulation of Distributed Clustering Protocols for Wireless Sensor Networks”; see also P. Levis, S. Madden, et al., “TinyOS: An Operating System for Wireless Sensor Networks,” in Ambient Intelligence, New York, 2004.


Appendix A: A Quick Introduction to Hardware (nesC)

This appendix describes the hardware (Mica2 motes) and software (nesC) in a way suitable for activity-driven teaching of sensor networks to those who are new to working with Mica2 motes. The material is derived from the TinyOS tutorial and Crossbow Technology papers, and we also describe our own experiences and problems during the work. We installed TinyOS version 1.x; the installation contains a Java environment, TinyOS, Cygwin and platform support for Mica2 and other motes.

• Instructions for motes

o Do not put a sensor node with batteries in it onto a powered programming board. This can cause the node to malfunction and potentially stop working altogether.

• To check that TinyOS 1.x has been installed and is working properly:

1) First, go to the following path in the Cygwin window:

C:\tinyos\cygwin\opt\tinyos-1.x\tools\scripts

2) Then run the command given below:

Toscheck

3) If the message listed below appears after running the ‘Toscheck’ command, TinyOS is installed and working properly:

Toscheck completed without error

• To configure the operating frequency of the MICA2 motes being used:

1) To compile and upload the built-in or designed applications, some files located in the directory below need to be configured:

C:\tinyos\cygwin\opt\tinyos-1.x\apps

2) A file named ‘Makelocal’ is present at the location stated above. One of the lines below has to be written in this file, according to the motes’ operating frequency:

PFLAGS += -DCC1K_DEF_FREQ=916700000

OR

# Makelocal File
PFLAGS += -I%T/../beta/MyBetaCode
PFLAGS += -DCC1K_DEF_FREQ=916700000


OR

PFLAGS+=-DDC1k_DEFAULT_FREQ=CC1k_915_998_MHZ
CFLAGS=-DCC1K_DEFAULT_FREQ=1

3) From some sources we learned that this file can also be configured with only the line stated below, but it did not work for us:

PFLAGS+=-DDC1k_DEFAULT_FREQ=CC1k_915_998_MHZ

4) To run any application, the ‘Makelocal’ file must always be present beside the application folder.

• The file named ‘Makefile’ present in the application folder must be configured with the lines explained below:

COMPONENT=Blink
include ../Makerules
SENSORBOARD = micasb

o Here ‘Blink’ is the name of the application and is changed according to the application being built, while ‘micasb’ is the type of sensor board that is used.

• Instructions to run a built-in application over the UART:

1) First make the following hardware configuration:

o Connect the programming board (MIB510) to power and to the serial port of the PC.
o Attach the sensor node onto the programming board (MIB510).
o Attach the sensor board if a sensor-configured application is used.

2) To compile and upload the application, follow the given instructions:

o First, go to the directory path listed below in the Cygwin window:

opt\tinyos-1.x\apps

o Now, to run an application (code), the command in the Cygwin window can be:

make mica2 install mib510,/dev/ttyS0

OR

o MIB510=COM1 make mica2 install.nodeid, e.g.:

MIB510=COM1 make mica2 install.1


• Instructions to run the developed application ‘MyApp.nc’ over RF:

1) First make the following hardware configuration, and compile and upload the application:

o Attach the programming board (MIB510) to power and to the serial port of the PC.
o Connect a sensor node (Mica2) and a sensor board onto the programming board.
o Compile and upload the ‘MyApp.nc’ code to the node as mentioned above.
o Remove the node and sensor board from the programming board, assemble them together and install batteries.
o Put another sensor node on the programming board, then compile and upload the TOSBase code to this second node (with nodeid 0). ‘TOSBase’ is an application that simply forwards packets between the radio and the UART (in both directions).

2) Open a new Cygwin window to set the baud rate (Mica2).

o Go to the path where the baud rate is set:

opt/tinyos-1.x/tools/java

o Run the command given below:

export MOTECOM=serial@COM1:57600 (for Mica2 motes)

3) After setting the baud rate, run the ‘Listen.java’ tool with the command below in the Cygwin window:

java net.tinyos.tools.Listen

o In our scenario no values are received yet, because the network has to be started first. Quit the Listen tool by pressing ‘Ctrl+C’.

4) To start the network, the SerialForwarder Java application has to be run.

o Open a new Cygwin window and start the SerialForwarder application by running the command given below:

java net.tinyos.sf.SerialForwarder -comm serial@<serialport>:<baudrate>

For example:

java net.tinyos.sf.SerialForwarder -comm serial@COM1:57600

o Do not terminate the SerialForwarder program, because it is essential for the next step.


5) Now open a new Cygwin window to pass the command packets through the ‘BcastInject.java’ application:

o To start the network, pass a command packet towards the sensor nodes by selecting the appropriate sensor (photo or temperature), threshold value and sampling interval, and running the following command:

java net.tinyos.tools.BcastInject <photo_sensing | temp_sensing> <threshold> <interval>

o In case of the light sensor, run the command:

java net.tinyos.tools.BcastInject photo_sensing 600 500

o In case of the temperature sensor, run the command:

java net.tinyos.tools.BcastInject temp_sensing 600 500

6) After starting the sensor node with the selected sensor (photo or temp), to check the readings, first quit the ‘SerialForwarder’ and go back to the Cygwin window where the ‘Listen’ tool was already run.

o Run the ‘Listen’ application again to view the readings, using the same command as before:

java net.tinyos.tools.Listen

7) To stop the network, quit the ‘Listen’ tool and run the ‘SerialForwarder’ tool again.

o Run the following command in the Cygwin window that was opened for passing command packets:

java net.tinyos.tools.BcastInject stop

Note: 1) To pass command packets, the ‘Listen’ tool must be quit and the ‘SerialForwarder’ tool must be running. 2) To view readings with the ‘Listen’ tool, the ‘SerialForwarder’ tool must be quit.


The Designed Application Code

• The designed application ‘MyApp’ folder contains the following files:

1) Makefile
2) MyApp.nc
3) MyAppM.nc
4) Sensing.nc
5) SimpleCmd.nc
6) OscopeMsg.h
7) SimpleCmdMsg.h

o Configuration of ‘Makefile’

COMPONENT=MyApp
#COMPONENT=Bcast
SENSORBOARD=micasb
include ../Makerules

o Configuration File ‘MyApp.nc’

includes OscopeMsg;
includes SimpleCmdMsg;

/**
 * This configuration describes the MyApp application,
 * a simple TinyOS app that periodically takes a sensor reading
 * (light or temperature) and sends the reading over the UART
 * only when a given threshold occurs.
 */
configuration MyApp {
  provides interface ProcessCmd;
  provides interface Sensing;
}
implementation {
  components Main, MyAppM, GenericComm as Comm, PotC, TimerC, LedsC,
             Photo as PhotoSensor, Temp as TempSensor;

  Main.StdControl -> MyAppM;
  Main.StdControl -> TimerC;

  MyAppM.phototimer -> TimerC.Timer[unique("Timer")];
  MyAppM.temptimer -> TimerC.Timer[unique("Timer")];
  MyAppM.Pot -> PotC;
  MyAppM.Leds -> LedsC;

  MyAppM.SensorControl -> PhotoSensor;
  MyAppM.SensorControl -> TempSensor;
  MyAppM.photoADC -> PhotoSensor;
  MyAppM.tempADC -> TempSensor;

  MyAppM.CommControl -> Comm;

  ProcessCmd = MyAppM.ProcessCmd;
  Sensing = MyAppM.Sensing;

  MyAppM.DataMsg -> Comm.SendMsg[AM_OSCOPEMSG];
  MyAppM.ResetCounterMsg -> Comm.ReceiveMsg[AM_OSCOPERESETMSG];
  MyAppM.ReceiveCmdMsg -> Comm.ReceiveMsg[AM_SIMPLECMDMSG];
}

o Module File ‘MyAppM.nc’

includes OscopeMsg;
includes SimpleCmdMsg;

/**
 * This module implements the MyAppM component, which periodically
 * takes a sensor reading (light or temperature) and sends the
 * reading over the UART only when a given threshold occurs.
 * The yellow LED is toggled whenever a data packet is sent.
 */
module MyAppM {
  provides {
    interface StdControl;
    interface ProcessCmd;
    interface Sensing;
  }
  uses {
    interface Timer as phototimer;
    interface Timer as temptimer;
    interface Pot;
    interface Leds;
    interface ADC as photoADC;
    interface ADC as tempADC;
    interface StdControl as SensorControl;
    interface StdControl as CommControl;
    interface SendMsg as DataMsg;
    interface ReceiveMsg as ReceiveCmdMsg;
    interface ReceiveMsg as ResetCounterMsg;
  }
}
implementation {
  uint8_t packetReadingNumber;
  uint16_t readingNumber;
  uint16_t epochNumber;
  TOS_Msg msg[2];
  uint8_t currentMsg;
  TOS_MsgPtr msgg;
  TOS_Msg buf;
  short threshold;
  short number;

  /**
   * Used to initialize this component.
   */
  command result_t StdControl.init() {
    msgg = &buf;
    call Leds.init();
    call Leds.yellowOff();
    call Leds.redOff();
    call Leds.greenOff();

    // turn on the sensors so that they can be read
    call SensorControl.init();
    call CommControl.init();
    atomic {
      currentMsg = 0;
      packetReadingNumber = 0;
      readingNumber = 0;
      epochNumber = 0;
    }
    dbg(DBG_BOOT, "OSCOPE initialized\n");
    return SUCCESS;
  }

  /**
   * Starts the SensorControl and CommControl components.
   * @return Always returns SUCCESS.
   */
  command result_t StdControl.start() {
    call SensorControl.start();
    call CommControl.start();
    return SUCCESS;
  }

  /**
   * Stops the SensorControl and CommControl components.
   * @return Always returns SUCCESS.
   */
  command result_t StdControl.stop() {
    call SensorControl.stop();
    call CommControl.stop();
    return SUCCESS;
  }

  /**
   * This task evaluates a command and executes it if it is a supported
   * command. The protocol for the command interpreter is that it operates
   * on the message and returns a (potentially modified) message to the
   * calling layer, as well as a status word for whether the message
   * should be further processed.
   */
  task void cmdInterpret() {
    struct SimpleCmdMsg *cmd = (struct SimpleCmdMsg *) msgg->data;

    // do local packet modifications: update the hop count and packet source
    cmd->hop_count++;
    cmd->source = TOS_LOCAL_ADDRESS;

    // execute the command
    switch (cmd->action) {
      case PHOTO_SENSING:
        call Sensing.start(cmd->args.ss_args.nsamples, cmd->args.ss_args.interval);
        number = 1;
        break;
      case TEMP_SENSING:
        call Sensing.start(cmd->args.ss_args.nsamples, cmd->args.ss_args.interval);
        number = 2;
        break;
      case STOP:
        call phototimer.stop();
        call temptimer.stop();
        signal Sensing.done();
        break;
    }
    signal ProcessCmd.done(msg, SUCCESS);
  }

  /**
   * This command belongs to the <code>Sensing</code> interface.
   * It starts the timer to generate periodic events.
   * @return Always returns <code>SUCCESS</code>
   */
  command result_t Sensing.start(int samples, int interval_ms) {
    threshold = samples;
    if (number == 1) {
      //call Leds.yellowOff();
      call phototimer.start(TIMER_REPEAT, interval_ms);
      return SUCCESS;
    }
    else
      call temptimer.start(TIMER_REPEAT, interval_ms);
    return SUCCESS;
  }

  /**
   * Default event handler for the <code>Sensing.done</code> event.
   * @return Always returns <code>SUCCESS</code>
   */
  default event result_t Sensing.done() {
    return SUCCESS;
  }

  task void dataTask() {
    struct OscopeMsg *pack;
    atomic {
      pack = (struct OscopeMsg *)msg[currentMsg].data;
      packetReadingNumber = 0;
      pack->lastSampleNumber = readingNumber;
    }
    pack->channel = epochNumber;
    pack->sourceMoteID = TOS_LOCAL_ADDRESS;

    /* Try to send the packet. Note that this will return
     * failure immediately if the packet could not be queued for
     * transmission. */
    if (call DataMsg.send(TOS_UART_ADDR, sizeof(struct OscopeMsg), &msg[currentMsg]))
    {
      atomic {
        currentMsg ^= 0x1;
      }
      call Leds.yellowToggle();
    }
  }

  /**
   * Signalled when data is ready from the photoADC. Stuffs the sensor
   * reading into the current packet, and sends off the packet when
   * the given threshold occurs.
   * @return Always returns SUCCESS.
   */
  async event result_t photoADC.dataReady(uint16_t data) {
    if (data > threshold) {
      struct OscopeMsg *pack;
      atomic {
        pack = (struct OscopeMsg *)msg[currentMsg].data;
        pack->data[packetReadingNumber++] = data;
        readingNumber++;
        dbg(DBG_USR1, "data_event\n");
        if (packetReadingNumber == BUFFER_SIZE) {
          post dataTask();
        }
      }
    }
    return SUCCESS;
  }

  /**
   * Signalled when data is ready from the tempADC. Stuffs the sensor
   * reading into the current packet, and sends off the packet when
   * the given threshold occurs.
   * @return Always returns SUCCESS.
   */
  async event result_t tempADC.dataReady(uint16_t data) {
    if (data > threshold) {
      struct OscopeMsg *pack;
      atomic {
        pack = (struct OscopeMsg *)msg[currentMsg].data;
        pack->data[packetReadingNumber++] = data;
        readingNumber++;
        dbg(DBG_USR1, "data_event\n");
        if (packetReadingNumber == BUFFER_SIZE) {
          post dataTask();
        }
      }
    }
    return SUCCESS;
  }

  /**
   * Signalled when the previous packet has been sent.
   * @return Always returns SUCCESS.
   */
  event result_t DataMsg.sendDone(TOS_MsgPtr sent, result_t success) {
    return SUCCESS;
  }

  /**
   * Signalled when the clock ticks.
   * @return The result of calling photoADC.getData().
   */
  event result_t phototimer.fired() {
    epochNumber++;
    return call photoADC.getData();
  }

  /**
   * Signalled when the clock ticks.
   * @return The result of calling tempADC.getData().
   */
  event result_t temptimer.fired() {
    epochNumber++;
    return call tempADC.getData();
  }

  /**
   * Signalled when the reset message counter AM is received.
   * @return The free TOS_MsgPtr.
   */
  event TOS_MsgPtr ResetCounterMsg.receive(TOS_MsgPtr m) {
    atomic {
      readingNumber = 0;
    }
    return m;
  }

  /**
   * Posts the cmdInterpret() task to handle the received command.
   * @return Always returns <code>SUCCESS</code>
   */
  command result_t ProcessCmd.execute(TOS_MsgPtr pmsg) {
    msgg = pmsg;
    post cmdInterpret();
    return SUCCESS;
  }

  /**
   * Called upon message reception; invokes the
   * ProcessCmd.execute() command.
   * @return Returns a pointer to a TOS_Msg buffer
   */
  event TOS_MsgPtr ReceiveCmdMsg.receive(TOS_MsgPtr pmsg) {
    TOS_MsgPtr ret = msgg;
    result_t retval;
    //call Leds.greenToggle();
    retval = call ProcessCmd.execute(pmsg);
    if (retval == SUCCESS) {
      return ret;
    } else {
      return pmsg;
    }
  }

  /**
   * Called upon completion of command execution.
   * @return Always returns <code>SUCCESS</code>
   */
  default event result_t ProcessCmd.done(TOS_MsgPtr pmsg, result_t status) {
    return status;
  }
}

o ‘Sensing.nc’ Interface

/**
 * Defines an interface for a component that senses data at a certain
 * interval and scale.
 */
interface Sensing {
  /**
   * Start sensing data.
   * @param nsamples Used here as the threshold value.
   * @param interval_ms The interval (in msec) at which to gather samples.
   */
  command result_t start(int nsamples, int interval_ms);

  /**
   * Signalled when sensing has completed.
   */
  event result_t done();
}

o ProcessCmd.nc Interface

includes AM;

/**
 * This interface is provided by components that can handle a
 * command message arriving as a TOS_MsgPtr.
 */
interface ProcessCmd {
  /**
   * This command extracts the command from the message 'pmsg' and
   * executes the command.
   * @return Command execution result.
   */
  command result_t execute(TOS_MsgPtr pmsg);

  /**
   * Indicates that the command contained in 'pmsg' has
   * finished executing.
   * @param status The status of the command completion.
   * @return Always returns SUCCESS.
   */
  event result_t done(TOS_MsgPtr pmsg, result_t status);
}

o OscopeMsg.h header file

enum {
  BUFFER_SIZE = 1
};

struct OscopeMsg {
  uint16_t sourceMoteID;
  uint16_t lastSampleNumber;
  uint16_t channel;
  uint16_t data[BUFFER_SIZE];
};

struct OscopeResetMsg {
  /* Empty payload! */
};

enum {
  AM_OSCOPEMSG = 1,
  AM_OSCOPERESETMSG = 32
};

o SimpleCmdMsg.h header file

/*
 * This header file defines the AM_SIMPLECMDMSG message
 * type for the MyApp application.
 */
enum {
  AM_SIMPLECMDMSG = 8,
};

enum {
  // these action codes must match the values used in BcastInject.java
  STOP = 2,
  TEMP_SENSING = 4,
  PHOTO_SENSING = 5,
};

typedef struct {
  int nsamples;
  uint32_t interval;
} start_sense_args;

// SimpleCmd message structure
typedef struct SimpleCmdMsg {
  int8_t seqno;
  int8_t action;
  uint16_t source;
  uint8_t hop_count;
  union {
    start_sense_args ss_args;
    read_log_args rl_args;
    uint8_t untyped_args[0];
  } args;
} SimpleCmdMsg;

o BcastInject.java file

package net.tinyos.tools;

import net.tinyos.util.*;
import java.io.*;
import java.util.Properties;
import net.tinyos.message.*;

public class BcastInject implements MessageListener {

    static Properties p = new Properties();

    public static final byte STOP = 2;
    public static final byte TEMP_SENSING = 4;
    public static final byte PHOTO_SENSING = 5;
    public static final short TOS_BCAST_ADDR = (short) 0xffff;

    public static void usage() {
        System.err.println("Usage: java net.tinyos.tools.BcastInject" +
                           " <command> [arguments]");
        System.err.println("\twhere <command> and [arguments] can be one of the following:");
        System.err.println("\t\tstop");
        System.err.println("\t\ttemp_sensing [nsamples interval_ms]");
        System.err.println("\t\tphoto_sensing [nsamples interval_ms]");
    }

    public static void tempSensingUsage() {
        System.err.println("Usage: java net.tinyos.tools.BcastInject" +
                           " temp_sensing [num_samples interval_ms]");
    }

    public static void photoSensingUsage() {
        System.err.println("Usage: java net.tinyos.tools.BcastInject" +
                           " photo_sensing [num_samples interval_ms]");
    }

    public static byte restoreSequenceNo() {
        try {
            FileInputStream fis = new FileInputStream("bcast.properties");
            p.load(fis);
            byte i = (byte) Integer.parseInt(p.getProperty("sequenceNo", "1"));
            fis.close();
            return i;
        } catch (IOException e) {
            p.setProperty("sequenceNo", "1");
            return 1;
        }
    }

    public static void saveSequenceNo(int i) {
        try {
            FileOutputStream fos = new FileOutputStream("bcast.properties");
            p.setProperty("sequenceNo", Integer.toString(i));
            p.store(fos, "#Properties for BcastInject\n");
        } catch (IOException e) {
            System.err.println("Exception while saving sequence number " + e);
            e.printStackTrace();
        }
    }

    public static void main(String[] argv) throws IOException {
        String cmd;
        byte sequenceNo = 0;

        if (argv.length < 1) {
            usage();
            System.exit(-1);
        }
        cmd = argv[0];

        if (cmd.equals("photo_sensing") && argv.length != 3) {
            photoSensingUsage();
            System.exit(-1);
        } else if (cmd.equals("temp_sensing") && argv.length != 3) {
            tempSensingUsage();
            System.exit(-1);
        }

        SimpleCmdMsg packet = new SimpleCmdMsg();
        sequenceNo = restoreSequenceNo();
        packet.set_seqno(sequenceNo);
        packet.set_hop_count((short) 0);
        packet.set_source(0);

        if (cmd.equals("stop")) {
            packet.set_action(STOP);
        } else if (cmd.equals("temp_sensing")) {
            packet.set_action(TEMP_SENSING);
            short nsamples = (short) Integer.parseInt(argv[1]);
            long interval_ms = (long) Integer.parseInt(argv[2]);
            packet.set_args_ss_args_nsamples(nsamples);
            packet.set_args_ss_args_interval(interval_ms);
        } else if (cmd.equals("photo_sensing")) {
            packet.set_action(PHOTO_SENSING);
            short nsamples = (short) Integer.parseInt(argv[1]);
            long interval_ms = (long) Integer.parseInt(argv[2]);
            packet.set_args_ss_args_nsamples(nsamples);
            packet.set_args_ss_args_interval(interval_ms);
        } else {
            usage();
            System.exit(-1);
        }

        try {
            System.err.print("Sending payload: ");
            for (int i = 0; i < packet.dataLength(); i++) {
                System.err.print(Integer.toHexString(packet.dataGet()[i] & 0xff) + " ");
            }
            System.err.println();

            MoteIF mote = new MoteIF(PrintStreamMessenger.err);
            mote.send(TOS_BCAST_ADDR, packet);
            saveSequenceNo(sequenceNo + 1);
            System.exit(0);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}


Appendix B: A Quick Introduction to TinyDB

• To configure the file named ‘Makefile’ present in the TinyDBApp folder, the path is given below:

C:\tinyos\cygwin\opt\tinyos-1.x\apps\TinyDBApp

o Configure the ‘Makefile’ by putting the lines stated below:

PFLAGS+=-DDC1k_DEFAULT_FREQ=CC1k_915_998_MHZ
CFLAGS=-DCC1K_DEFAULT_FREQ=1

OR

PFLAGS += -DCC1K_DEF_FREQ=916700000

• Set the path for the Java tools in the Cygwin window as listed below:

export PATH="$PATH;c:/tinyos/cygwin/opt/tinyos-1.x/tools/java/jre1.6.0_03/bin;"

o This path has to be set only once, when you are going to run ‘TinyDB’.

• To compile and upload the ‘TinyDBApp.nc’ application, follow the given instructions:

o Make the hardware configuration as listed in Appendix A.

o Now go to the directory path listed below in the Cygwin window:

opt\tinyos-1.x\apps\TinyDBApp

o To run the application, the command in the Cygwin window can be:

o MIB510=COM1 make mica2 install.nodeid, e.g.:

MIB510=COM1 make mica2 install.0

• To run the ‘TinyDBMain.java’ tool for generating the queries:

o A Graphical User Interface (GUI) will be displayed on the PC by running the following command in the Cygwin window:

java net.tinyos.tinydb.TinyDBMain


o Through the GUI we can pose queries as needed. The queries in the experimental work (section 5.2) were also generated through this GUI, and the results were viewed in it as well.