Learning-based Localization in Wireless and Sensor Networks
Transcript of Learning-based Localization in Wireless and Sensor Networks
Learning-based Localization in Wireless and Sensor Networks
Jeffrey Junfeng Pan
Advisor: Qiang Yang
Department of Computer Science and Engineering
The Hong Kong University of Science and Technology
HKUST
Signal-Strength-Based Localization: Where is the Mobile Device?
(Example signal-strength readings from three access points: -30 dBm, -70 dBm, -40 dBm)
Locations Support Many Applications: Guidance, Content Delivery, and User Behavior Analysis
Cheap and Ubiquitous: Received Signal Strength (RSS). But the signal is nonlinear and noisy.
Related Works
Propagation-based Models (rely on AP locations):
- Log-Normal model: Bayesian indoor positioning systems. Madigan et al., INFOCOM 2005.
- Multilateration: Dynamic fine-grained localization in ad-hoc networks of sensors. MOBICOM 2001.
Learning-based Models (do not need AP locations):
- KNN: LANDMARC: Indoor Location Sensing Using Active RFID. Ni et al., PerCom 2003; RADAR, etc.
- ML: Large-scale localization from wireless signal strength. Letchner et al., AAAI 2005.
- SVM: A kernel-based learning approach to ad hoc sensor network localization. Nguyen et al., ACM Transactions on Sensor Networks, 2005.
Related Works: Propagation-based Models
- Path loss [Goldsmith et al.]: power radiation
- Shadowing [Madigan et al.]: absorption, reflection, scattering, and diffraction
- Indoor attenuation factors [Bahl et al.]: floors and walls
- Multipath [Goldsmith et al.]: ray-tracing; needs more detail about the environment
Two steps: 1) Ranging: signal -> distance; 2) Multilateration: multiple distances -> (x, y)
Question One: How can we build an accurate mapping from signals to locations (bypassing ranging)?
It is not easy to parameterize an indoor environment (wall material, building structure, etc.).
Related Works: Learning-based Models
Two phases: 1) offline training and 2) online localization.
- Offline phase: collect data to build a mapping function F from signal space S (AP1, AP2, AP3) to location space L (x, y).
- Online phase: given a new signal s, estimate the most likely location l from function F. For example, for s = (-60, -49, -36) dBm, compute l = F(s) as the estimated location.
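The two-phase scheme can be sketched with a toy radio map; the fingerprints, coordinates, and plain-KNN estimator below are illustrative stand-ins, not the thesis's actual data or model:

```python
import math

# Hypothetical offline radio map: signal-strength fingerprints
# (AP1, AP2, AP3 in dBm) collected at known (x, y) grid locations.
radio_map = [
    ((-60, -49, -36), (0.0, 0.0)),
    ((-55, -52, -48), (1.5, 0.0)),
    ((-48, -60, -55), (3.0, 0.0)),
    ((-62, -45, -50), (0.0, 1.5)),
]

def localize(s, k=2):
    """Online phase: estimate the location of signal vector s by
    averaging the locations of its k nearest fingerprints (plain KNN)."""
    ranked = sorted((math.dist(s, fp), loc) for fp, loc in radio_map)
    neighbors = ranked[:k]
    x = sum(loc[0] for _, loc in neighbors) / k
    y = sum(loc[1] for _, loc in neighbors) / k
    return (x, y)

print(localize((-59, -50, -38)))  # falls between the two closest fingerprints
```

With k=1 this degenerates to a table lookup; larger k smooths over noisy readings.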
Learning-based Models (cont.)
- Manual setup (HORUS/RADAR): walk to several points and collect data manually. Costs time.
- Semi-automatic setup (LANDMARC): mark tag positions manually, collect data automatically. Costs money.
Question Two: How can we reduce calibration effort?
We need to collect a lot of data at many locations.
Question Three & Four: Can a learning-based model benefit from calibrated access points? Can a learning-based model work purely online for adaptation?
Propagation models use AP locations while learning models don't. Learning models usually function in two phases: offline/online.
Our Contribution
Localization Models (a general framework): a flexible model for localization and mapping
- Increase accuracy with known-location clients or APs
- Reduce calibration with unknown-location clients or APs
- Can work offline/online or purely online for adaptation
Localization Experiments (thorough study)
- Devices: WiFi, sensor networks, RFID networks
- Test-beds: hallways, indoor open space, 2D & 3D
- Mobility: static, moving persons, moving robots
Question One: Can we increase the accuracy when some labelled (calibrated) data are available?
A labelled (calibrated) example is an input/output pair, e.g. (-60 dBm, -50 dBm, -70 dBm) => (x, y).
Observation of Signal Strength
Characteristics (statistically): if two rows are similar, the two mobile devices are close (tA & tE).
However, when observing individual noisy data points:
- Similar signals may not come from nearby locations.
- Dissimilar signals may not come from far-away locations.
(Figure: a user with a mobile device walks through A, B, C, D, E, F; grid points are labelled (x1, y1) ... (x7, y7).)
Motivation of Our Approach
Idea: maximize the correlation between similarities in the signal space and similarities in the location space.
(Kernel) CCA
Canonical Correlation Analysis (CCA) [H. Hotelling, 1936]
- Two data sets X and Y
- Two linear canonical vectors wx, wy
- Maximize the correlation of the projections X wx and Y wy
Kernel CCA [D. R. Hardoon, S. Szedmak, and J. Shawe-Taylor, 2004]
- Two non-linear canonical vectors, where K is the kernel matrix
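A minimal linear-CCA sketch (the kernel variant replaces the covariance matrices with kernel matrices); the toy data, the regularizer, and the eigenproblem formulation below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: Y is a linear transform of X plus small noise, so the top
# canonical correlation should be close to 1.
X = rng.normal(size=(200, 3))                                 # e.g. signal-space features
Y = X @ rng.normal(size=(3, 2)) + 0.01 * rng.normal(size=(200, 2))

def cca_first_pair(X, Y, reg=1e-6):
    """Return the first pair of canonical vectors (wx, wy) and their
    correlation, via the standard CCA eigenproblem on regularized
    covariance matrices."""
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    Sxx = Xc.T @ Xc / len(X) + reg * np.eye(X.shape[1])
    Syy = Yc.T @ Yc / len(Y) + reg * np.eye(Y.shape[1])
    Sxy = Xc.T @ Yc / len(X)
    # Eigenvectors of Sxx^-1 Sxy Syy^-1 Syx give wx; wy follows from wx.
    M = np.linalg.solve(Sxx, Sxy) @ np.linalg.solve(Syy, Sxy.T)
    vals, vecs = np.linalg.eig(M)
    i = np.argmax(vals.real)
    wx = vecs[:, i].real
    wy = np.linalg.solve(Syy, Sxy.T @ wx)
    u, v = Xc @ wx, Yc @ wy
    corr = u @ v / (np.linalg.norm(u) * np.linalg.norm(v))
    return wx, wy, abs(corr)

wx, wy, corr = cca_first_pair(X, Y)
print(round(corr, 3))
```

The eigenvalues of M are the squared canonical correlations; the `reg` term plays the same role as the regularization mentioned for KCCA.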
LE-KCCA
Offline phase:
- Signal strengths are collected at various grid locations.
- KCCA is used to learn the mapping between signal and location spaces.
- The canonical vectors (the wx's and wy's) are obtained from a generalized eigen-problem, with a regularization term added.
- For each training pair (si, li), its projections onto the T canonical vectors are computed.
LE-KCCA (cont.)
Online phase:
- Given a new signal strength vector s, again project s onto the canonical vectors to obtain P(s).
- Find the K nearest neighbors of P(s) among the projections of the training set, using a weighted Euclidean distance.
- Interpolate these neighbors' locations to predict the location of s.
Essentially, we are performing weighted KNN in the feature space, with weights obtained from the feedback of location information.
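The online step can be sketched as inverse-distance-weighted KNN; the projections and locations below are hypothetical stand-ins for P(si) and li from the training set:

```python
import math

# Illustrative (projection, location) pairs in the learned feature space.
projected_train = [
    ((0.0, 0.0), (0.0, 0.0)),
    ((1.0, 0.0), (1.5, 0.0)),
    ((0.0, 1.0), (0.0, 1.5)),
    ((3.0, 3.0), (4.5, 4.5)),
]

def weighted_knn_locate(p, train, k=3, eps=1e-9):
    """Find the k projections nearest to p and interpolate their
    locations with inverse-distance weights."""
    nearest = sorted(train, key=lambda t: math.dist(p, t[0]))[:k]
    w = [1.0 / (math.dist(p, proj) + eps) for proj, _ in nearest]
    s = sum(w)
    x = sum(wi * loc[0] for wi, (_, loc) in zip(w, nearest)) / s
    y = sum(wi * loc[1] for wi, (_, loc) in zip(w, nearest)) / s
    return (x, y)

print(weighted_knn_locate((0.9, 0.1), projected_train, k=2))
```

Closer neighbors dominate the interpolation, so the estimate leans toward the fingerprint whose projection best matches the query.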
Experimental Setup
Test-bed: Department of Computer Science and Engineering, Hong Kong University of Science and Technology.
- 99 locations on a 1.5 x 1.5 meter grid
- 100 samples per location
- 65% for training, 35% for testing
- Each experiment is repeated 10 times
Experimental Result
Data usage: 65% training, 35% testing, 10 repetitions.
Accuracy within an error distance of 3.0 m:
- LE-KCCA: 91.6%
- SVM: 87.8%
- MLE: 86.1%
- RADAR: 78.8%
Question Two
Labelled data are expensive to get: (-60 dBm, -50 dBm, -70 dBm) => (x, y).
Unlabelled data are easy to obtain: (-60 dBm, -50 dBm, -70 dBm).
Can we reduce calibration effort by using additional unlabelled (uncalibrated) data?
Observation of Signal Strength
Characteristics (statistically):
- If two rows are similar, the two mobile devices are close (tA & tE).
- Neighboring rows are similar because the user trajectory is smooth (ti & ti+1).
Basic idea: bridge labelled and unlabelled data.
(Figure: a user with a mobile device walks through A, B, C, D, E, F.)
Manifold Regularization
Basic assumption: if two points are close in the intrinsic geometry (manifold*) of the marginal distribution, their conditional distributions are similar.
- Classification: similar labels
- Regression: similar values
Infer the unlabelled data by looking at the neighboring points.
*A manifold can be understood as a curved surface in a high-dimensional space.
Manifold Regularization (cont.)
The objective, in the standard manifold-regularization form, combines three terms: the empirical loss on the l labelled examples, an RKHS norm penalty ||f||_K^2 weighted by gamma_A, and a smoothness penalty f^T L f over the graph Laplacian L weighted by gamma_I.
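The role of the smoothness term f^T L f can be illustrated on a toy chain graph; the graph, the squared loss, and the lambda value below are assumptions for illustration, not the thesis's setup:

```python
import numpy as np

# A chain graph of 5 nodes: i -- i+1.
n = 5
W = np.zeros((n, n))
for i in range(n - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0
L = np.diag(W.sum(axis=1)) - W          # graph Laplacian L = D - W

y = np.array([0.0, 0.0, 10.0, 0.0, 0.0])  # one noisy spike
lam = 1.0
# Minimizing ||f - y||^2 + lam * f^T L f has the closed form below;
# the Laplacian term pulls neighboring values together.
f = np.linalg.solve(np.eye(n) + lam * L, y)
print(f.round(2))
```

Note that f^T L f equals the sum of squared differences over graph edges, which is exactly the "neighbors have similar values" assumption stated above.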
The LeMan Algorithm
Offline training phase:
- Collect l labelled and u unlabelled signal examples.
- Construct the graph Laplacian L and the kernel matrix K.
- Solve for the expansion coefficients of the kernel regressor in closed form.
Online localization phase:
- Time complexity O((l+u)*N), where N is the number of sensors.
Experimental Setup
Location Estimation Accuracy
- LeMan has the smallest mean error distance: 67 cm.
- LeMan is robust: the standard deviation of the error distance is 39 cm.
- LeMan needs more computation time: 0.242 ms per location on a 3.2 GHz CPU in Matlab.
Vary Labelled and Unlabelled Data
Question Three
Can we employ a further information source, e.g., access point locations?
Propagation-based models use access point locations; learning-based models do not.
What Kind of Data Do We Have?
The locations of mobile devices:
- Known when walking by landmarks (corners, doors)
- Unknown elsewhere
The locations of access points:
- Known for those deployed by us
- Unknown for those deployed by other people
Observation of Signal Strength
Characteristics / constraints:
- If two rows are similar, the two mobile devices are close (tA & tE).
- Neighboring rows are similar because the user trajectory is smooth (ti & ti+1).
- If two columns are similar, the two access points are close (AP1 & AP4).
- A strong cell means the mobile device and the access point are close (tD at AP3).
(Figure: a user with a mobile device walks through A, B, C, D, E, F.)
Dimension Reduction
Idea: Latent Semantic Indexing (LSI)
- n terms correspond to n access points; m documents correspond to m mobile devices; both sides are reduced by SVD.
- Term ~ access point; document ~ signal fingerprint.
Example term-document matrix (query: "vehicle"):

Doc\Term   moon   car   truck
Doc_1       1      0      0
Doc_2       0      2      1
Doc_3       0      1      2
Solution of Latent Semantic Indexing
1. Transform the signal matrix into a weight matrix.
2. Normalize the weight matrix.
3. Recover the relative coordinates by SVD.
Notation: m mobile devices, n access points, r = 2 dimensions.
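Step 3 can be sketched with a synthetic rank-2 matrix; the inner-product weighting below is a stand-in for the paper's actual weight matrix, and the layouts are random illustrative coordinates:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, r = 6, 4, 2
device_xy = rng.uniform(0, 10, size=(m, r))   # hypothetical true device layout
ap_xy = rng.uniform(0, 10, size=(n, r))       # hypothetical true AP layout
# A weight matrix whose structure reflects the device/AP geometry
# (inner products here, standing in for the normalized weight matrix).
A = device_xy @ ap_xy.T

U, S, Vt = np.linalg.svd(A, full_matrices=False)
devices_rel = U[:, :r] * S[:r]     # relative coordinates of mobile devices
aps_rel = Vt[:r].T                 # relative coordinates of access points

# The rank-r factorization reproduces the matrix, so the recovered
# coordinates are consistent with the geometry up to a linear transform.
err = np.linalg.norm(A - devices_rel @ aps_rel.T)
print(err < 1e-6)
```

This mirrors the LSI analogy: devices and APs are embedded jointly, which is why the method can recover AP locations as well.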
Illustration of Latent Semantic Indexing
- Retrieves relative coordinates and recovers AP locations as well.
- Good alignment between mobile devices and access points.
Dimension Reduction: Encode Labels by Manifold Learning
Are neighbors in locations also neighbors in signals?
Construct a K-neighborhood graph for the manifold.
Dimension Reduction: Encode Labels by Manifold Learning
Offline training phase (give labels to unlabeled data): solve for the optimal locations of mobile devices and access points on the signal manifold.
Online localization phase: use the property of harmonic functions (similar to weighted KNN) to encode labels.
Dimension Reduction: Encode Labels by Manifold Learning
W is the edge-weight (adjacency) matrix of the K-neighborhood graph; the graph Laplacian is L = D - W, where D is the diagonal degree matrix with D_ii = sum_j W_ij.
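Building W and L = D - W for a small K-neighborhood graph might look like this; the points, K, and the 0/1 edge weights are illustrative choices (a Gaussian kernel on distances is another common weighting):

```python
import math

points = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (10.0, 0.0)]
K = 2
n = len(points)

W = [[0.0] * n for _ in range(n)]
for i in range(n):
    # indices of the K nearest other points
    order = sorted((j for j in range(n) if j != i),
                   key=lambda j: math.dist(points[i], points[j]))
    for j in order[:K]:
        W[i][j] = W[j][i] = 1.0   # symmetrize the K-NN graph

D = [sum(row) for row in W]       # degree of each node
L = [[(D[i] if i == j else 0.0) - W[i][j] for j in range(n)]
     for i in range(n)]

# Each row of a graph Laplacian sums to zero.
print([round(sum(row), 6) for row in L])
```

The zero row sums are what make the constant vector harmonic, which underlies the harmonic-function localization step mentioned above.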
Co-Localization Example
- 802.11 WLAN test-bed
- Co-localization result
Experimental Setups
- 802.11 Wireless LAN (WLAN)
- Wireless Sensor Network (WSN)
- Radio-frequency identification (RFID)
Tests on WLAN / WSN / RFID
- Different test-beds
- Locate mobile devices
- Locate access points
Question Four
The previous models operate in a traditional offline/online manner.
How can we adapt to new data without re-training everything? Can we update the model online?
Online Co-Localization
Predict step: use weighted KNN as the initial estimate of the coordinate.
Update step: update the K-neighborhood in the manifold, then update the coordinate iteratively.
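A hedged sketch of the predict/update loop: start from a weighted-KNN estimate, then iterate a fixed-point update in which neighbors closer to the current estimate receive larger weights. The neighbor coordinates and the re-weighting rule are illustrative, not the paper's exact update:

```python
import math

neighbor_coords = [(0.0, 0.0), (2.0, 0.0), (0.0, 2.0)]  # hypothetical neighbors

def refine(est, coords, iters=50, eps=1e-6):
    """Iteratively move est toward an inverse-distance-weighted
    average of the neighbor coordinates."""
    for _ in range(iters):
        w = [1.0 / (math.dist(est, c) + eps) for c in coords]
        s = sum(w)
        est = tuple(sum(c[i] * wi for c, wi in zip(coords, w)) / s
                    for i in range(2))
    return est

# Predict step: plain average as the initial weighted-KNN estimate.
init = tuple(sum(c[i] for c in neighbor_coords) / 3 for i in range(2))
est = refine(init, neighbor_coords)
print(all(0.0 <= v <= 2.0 for v in est))  # prints True
```

Because each update is a convex combination of the neighbor coordinates, the estimate stays inside their bounding region; only the neighborhood touched by the new point is revisited, which is the intuition behind the speed-up over full re-training.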
Experimental Setup
- 802.11 Wireless LAN (WLAN)
Online Co-Localization [movie]
Model Update Speed
- 10 times faster than its two-phase counterpart.
- Accuracy is the same as the two-phase method.
Summary
Radio-Signal-Strength (RSS)-based tracking:
- Application scenario and radio characteristics
- Localization models:
  - Increase accuracy: LeKCCA (IJCAI 2005)
  - Reduce calibration: LeMan (AAAI 2006)
  - Encode further information sources: Co-Localization (IJCAI 2007)
  - Update model online: Online Co-Localization (AAAI 2007)
- Conclusion
Development of Our Models (increasing flexibility and power)
- LeKCCA [IJCAI05]: two-phase, supervised, mobile device
- LeMan [AAAI06]: two-phase, semi-supervised, mobile device
- Co-Localization [IJCAI07]: two-phase, semi-supervised, mobile devices and access points
- Online Co-Localization [AAAI07]: online, semi-supervised, mobile devices and access points
The End. Thank you. Questions?