Testing of Sensor Observation Services: A Performance Evaluation


M. E. Poorazizi, A. J. S. Hunter and S. H. L. Liang, Testing of Sensor Observation Services: A Performance Evaluation, In Proceedings of the First ACM SIGSPATIAL International Workshop on Sensor Web Enablement 2012 (SWE2012), Redondo Beach, CA, US, 6-9 November 2012.


Page 1: Testing of Sensor Observation Services: A Performance Evaluation

Testing of Sensor Observation Service Implementations: A Performance Evaluation

Ebrahim Poorazizi, Steve Liang, Andrew Hunter
{mepooraz, steve.liang, ahunter}@ucalgary.ca
Department of Geomatics Engineering, University of Calgary

Page 2

Outline

Introduction

What is SOS

SOS Packages Specifications

Test Scenarios

Data Simulation

Test Environment

Experimental Results

Conclusions

Page 3

A Sensor Network is a computer-accessible network of many spatially distributed devices that use sensors to monitor conditions at different locations, such as temperature, sound, vibration, pressure, motion, or pollutants.

A Sensor Web refers to web-accessible sensor networks and archived sensor data that can be discovered and accessed using standard protocols and APIs.

Introduction

Page 4

OGC Sensor Web Enablement Framework:

The models, encodings, and services of the SWE architecture enable implementation of interoperable and scalable service-oriented networks of heterogeneous sensor systems and client applications.

Introduction

SWE enables:

Discovery of Sensors

Access to Sensors

Retrieval of Sensor Observations

Tasking of Sensors

Subscription to and Publishing of Alerts

Page 5

Introduction

SWE Data Models and Service Interfaces, created by tagxedo.com

Page 6

What is SOS

A standard web service interface for requesting, filtering, and retrieving observations and sensor system information.

SOS 1.0.0 mandatory operations (Client <-> SOS):

GetCapabilities -> Capabilities
DescribeSensor -> Sensor Description (SensorML)
GetObservation -> Observations (O&M)
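For illustration, a GetObservation call can be issued as a KVP (GET) request. This is a minimal sketch: the endpoint URL, offering name, and property URN below are hypothetical, and parameter names follow the SOS 1.0.0 KVP encoding, which only some implementations expose over GET (see the DCP row in the package comparison).

```python
from urllib.parse import urlencode

# Hypothetical SOS endpoint -- replace with a real deployment URL.
SOS_ENDPOINT = "http://example.org/sos"

def get_observation_url(offering, observed_property, bbox, time_range):
    """Build a SOS 1.0.0 KVP GetObservation request URL.

    bbox is (min_lon, min_lat, max_lon, max_lat); time_range is a
    (start, end) pair of ISO 8601 timestamps.
    """
    params = {
        "service": "SOS",
        "request": "GetObservation",
        "version": "1.0.0",
        "offering": offering,
        "observedProperty": observed_property,
        "featureOfInterest": "BBOX:" + ",".join(map(str, bbox)),
        "eventTime": "{}/{}".format(*time_range),
        "responseFormat": 'text/xml;subtype="om/1.0.0"',
    }
    return SOS_ENDPOINT + "?" + urlencode(params)

# Example: temperature observations for a Calgary-sized BBOX over one month.
url = get_observation_url(
    "TEMPERATURE",                           # hypothetical offering id
    "urn:ogc:def:property:temperature",      # hypothetical property URN
    (-114.3, 50.8, -113.8, 51.2),
    ("2012-06-01T00:00:00Z", "2012-07-01T00:00:00Z"),
)
```

The same operation can also be sent as an XML document over POST, which is the binding all three tested packages support.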

Page 7

SOS Packages Specifications

                       52North SOS             Deegree SOS             MapServer SOS
Development Platform   Java                    Java                    C++
License                Open Source             Open Source             Open Source
SOS 1.0.0 - core       Supported               Supported               Supported
SOS 1.0.0 - optional   GetFeatureOfInterest,   GetFeatureOfInterest    -
                       GetResult,
                       RegisterSensor,
                       InsertObservation
Database               PostgreSQL/PostGIS      PostgreSQL/PostGIS,     PostgreSQL/PostGIS,
                                               MS SQL Server, MySQL,   MS SQL Server,
                                               ArcSDE, Oracle,         MySQL, Oracle
                                               ESRI Geodatabase
Service                Servlet                 Servlet                 CGI
DCP Request            GET/POST                GET/POST                GET

52North SOS: a 52°North Initiative (Geospatial Open Source Software GmbH) project
Deegree SOS: an Open Source Geospatial Foundation project
MapServer SOS: an Open Source Geospatial Foundation project

Page 8

Test Scenarios

Postal codes as sensors

Test parameters:

BBOX (1 ha, 47.8 ha, 264.7 ha, and 759.6 ha)

Time (1 hour, 1 day, 1 week, 1 month)

BBOX-Time (16 levels: 4 BBOX sizes x 4 time intervals)

Number of Sensors (1, 5, 20, 60, 101)

[Map of the study area showing the four test bounding boxes, BBOX #1 through BBOX #4]
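The 16 BBOX-Time levels are simply the cross product of the four spatial extents and the four temporal intervals; a quick enumeration sketch:

```python
from itertools import product

# Four BBOX extents (ha) and four time intervals from the test design.
bbox_sizes_ha = [1, 47.8, 264.7, 759.6]
time_intervals = ["1 hour", "1 day", "1 week", "1 month"]

# Every BBOX-Time combination: 4 x 4 = 16 test levels.
levels = list(product(bbox_sizes_ha, time_intervals))
print(len(levels))  # -> 16
```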

Page 9

Data Simulation

101 postal codes (in Calgary) were randomly selected as sensor locations

Sampling interval: one observation every 15 min over 1 month for each sensor

Total number of observations: 4 * 24 * 30 * 101 = 290,880

Database size:

188 MB for PostgreSQL (52North SOS)
46 MB for PostgreSQL (Deegree SOS)
39 MB for PostgreSQL (MapServer SOS)
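The observation total above follows directly from the sampling design:

```python
# One observation every 15 min = 4 per hour, over 24 h x 30 days,
# for each of the 101 simulated sensors.
per_sensor = 4 * 24 * 30       # 2880 observations per sensor per month
total = per_sensor * 101
print(total)  # -> 290880
```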

Page 10

Test Environment

Server specifications:

Hardware   Acer Aspire 3970 (Host)            VMware (VM)
CPU        Intel Core i7 3.4 GHz              4 Cores of 8
RAM        8 GB                               2 GB
HDD        1.5 TB                             20 GB
OS         Windows 7 Home Premium (64-bit)    Ubuntu 11.10 (64-bit)

Each SOS package was installed on a separate VM.

Client specifications:

Hardware   Dell OptiPlex 990
CPU        Intel Core i5 3.1 GHz
RAM        8 GB
HDD        500 GB
OS         Windows 7 Professional (64-bit)

Page 11

Test Environment

All tests were run from a remote client

Apache JMeter was used to issue the test requests

The average over 30 requests for each test case was measured

Server @ GeoSensor Web Lab; Client @ Trailer H, connected via the WWW
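The averaging step can be sketched as a small harness; `fetch` below is a placeholder for the HTTP request that JMeter issues in the actual tests:

```python
import time
from statistics import mean

def measure(fetch, n=30):
    """Issue a request n times and return the mean response time in ms.

    `fetch` is any zero-argument callable performing one request; here
    it stands in for the HTTP call made by JMeter in the real tests.
    """
    times = []
    for _ in range(n):
        start = time.perf_counter()
        fetch()
        times.append((time.perf_counter() - start) * 1000.0)
    return mean(times)

# Example with a stub in place of a live SOS request:
avg_ms = measure(lambda: time.sleep(0.001))
```

Averaging over repeated requests smooths out transient network and server-load effects, which is why each reported figure is a mean of 30 runs.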

Page 12

Experimental Results: GetObservation (No. of Sensors)

                1 Sensor   5 Sensors   20 Sensors   60 Sensors   101 Sensors
52North SOS        131.0       471.3      1,803.8      4,834.3      12,590.1
Deegree SOS        105.1       294.1        995.3      2,452.7       3,448.0
MapServer SOS      323.2     1,012.9      3,553.3     10,171.2      16,623.0

Page 13

Experimental Results: GetObservation (No. of Sensors)

                1 Sensor   5 Sensors   20 Sensors   60 Sensors   101 Sensors
52North SOS        104.5       518.5      2,071.0      6,210.8      10,454.1
Deegree SOS         94.5       469.7      1,876.9      5,629.2       9,475.3
MapServer SOS       69.9       344.1      1,372.3      4,114.2       6,924.7

Page 14

Experimental Results: GetObservation (Time Series)

                1 Hour   1 Day   1 Week    1 Month
52North SOS      213.1   482.5   2,230.9   11,444.3
Deegree SOS      126.7   257.2   1,023.8    2,897.7
MapServer SOS    270.6   444.9   1,983.5   16,032.8

Page 15

Experimental Results: GetObservation (Time Series)

                1 Hour   1 Day   1 Week    1 Month
52North SOS      293.3   618.5   2,653.5   10,369.3
Deegree SOS      165.6   463.5   2,328.1    9,475.3
MapServer SOS    168.1   384.3   1,737.5    6,924.7

Page 16

Experimental Results: GetObservation (BBOX)

                Level 1   Level 2   Level 3   Level 4
52North SOS       191.1   2,024.7   3,271.8   4,459.1
Deegree SOS       178.8     713.3     954.6   1,355.9
MapServer SOS     407.2   4,098.1   4,819.7   8,706.8

Page 17

Experimental Results: GetObservation (BBOX)

                Level 1   Level 2   Level 3   Level 4
52North SOS       180.4   2,422.9   4,127.1   5,713.9
Deegree SOS       191.8   2,280.1   3,432.5   4,993.7
MapServer SOS     111.0   1,692.3   2,016.6   3,730.5

Page 18

Experimental Results: GetObservation (Time-BBOX)

Page 19

Experimental Results: GetObservation (Time-BBOX)

Page 20

Experimental Results: GetObservation (Time-BBOX)

Page 21

Conclusions

Deegree SOS performs the best in terms of response time, especially when data requests cover larger areas and longer temporal intervals.

MapServer SOS returns the smallest volume of data to the client.

52North SOS showed competitive performance across all types of queries, but does not scale quite as well as Deegree SOS.

Response time could be improved by reducing the size of the response document.

Page 22

Thank You. Any Questions?