Intelligent Database Systems Lab
A direct boosting algorithm for the k-nearest neighbor classifier via local warping of the distance metric
Authors: Toh Koon Charlie Neo, Dan Ventura
2012, Pattern Recognition Letters (PRL)
Presenter: Fen-Rou Ciou
Outline
• Motivation
• Objectives
• Methodology
• Experiments
• Conclusions
• Comments
Motivation
• The k-nearest neighbor pattern classifier is an effective learning algorithm, but it can result in large model sizes because the full training set must be retained.
Objectives
• The paper presents a direct boosting algorithm for the k-NN classifier that creates an ensemble of models with locally modified distance weighting, increasing accuracy while condensing the model size.
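The central idea, per-instance weights that locally warp the distance metric, can be sketched as follows. This is a minimal illustration, not the authors' exact formulation: the multiplicative warping rule `d * (1 + w)` and the weight vector `w` are assumptions for the sketch.

```python
import numpy as np

def warped_knn_predict(X_train, y_train, w, x, k=3):
    """Predict the label of x with a k-NN rule whose distances are
    locally warped by per-instance weights w: a larger w makes an
    instance appear farther away, so it has less influence."""
    d = np.linalg.norm(X_train - x, axis=1)   # plain Euclidean distances
    d_warped = d * (1.0 + w)                  # assumed warping rule
    nn = np.argsort(d_warped)[:k]             # k nearest (warped) neighbors
    labels, counts = np.unique(y_train[nn], return_counts=True)
    return labels[np.argmax(counts)]          # majority vote

# toy data: two clusters
X = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [1.1, 1.0]])
y = np.array([0, 0, 1, 1])
w = np.zeros(len(X))                          # all-zero weights = plain k-NN
print(warped_knn_predict(X, y, w, np.array([0.05, 0.0]), k=3))  # 0
```

With all weights at zero this reduces to ordinary k-NN; boosting then adjusts `w` per instance, which is what warps the decision surface locally.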
Methodology - Framework
[Framework figure: AdaBoost-style ensemble; votes v ∈ {+, −}; distances D between training instances x_i and a query x]
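A hedged sketch of an AdaBoost-style training loop for this framework. The update rule here, up-weighting the neighbors that cast wrong votes for a misclassified point, is a simplification for illustration, not the paper's exact algorithm:

```python
import numpy as np

def boost_knn(X, y, k=3, rounds=5, step=0.5):
    """AdaBoost-style loop (illustrative sketch): repeatedly classify
    each training point with a warped k-NN rule and, when it is
    misclassified, push away (up-weight) the neighbors that voted for
    the wrong class."""
    n = len(X)
    w = np.zeros(n)                                 # warping weights
    D = np.linalg.norm(X[:, None] - X[None, :], axis=2)
    np.fill_diagonal(D, np.inf)                     # leave-one-out distances
    for _ in range(rounds):
        errors = 0
        for i in range(n):
            nn = np.argsort(D[i] * (1.0 + w))[:k]   # warped neighborhood
            labels, counts = np.unique(y[nn], return_counts=True)
            pred = labels[np.argmax(counts)]
            if pred != y[i]:
                errors += 1
                for j in nn:                        # penalize wrong voters
                    if y[j] != y[i]:
                        w[j] += step
        if errors == 0:                             # training set is fit
            break
    return w
```

On cleanly separable data the loop exits after one pass with all weights still zero; the weights only grow where the local neighborhood structure causes misclassifications.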
Methodology
Sensitivity to data order:
• Randomize the data order
• Batch update
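Of the two remedies, batch update removes the order sensitivity directly: all weight changes are computed against the same starting weights and applied at once. A toy sketch, where the `deltas` array is a stand-in for whatever per-point updates the algorithm proposes:

```python
import numpy as np

def batch_update(w, deltas):
    """Batch update: accumulate every proposed weight change against the
    *same* starting weights, then apply them all at once, so the result
    does not depend on the order in which training points are visited."""
    return w + np.sum(deltas, axis=0)

rng = np.random.default_rng(0)
w = np.zeros(4)
deltas = rng.normal(size=(3, 4))          # per-point proposed updates
shuffled = deltas[rng.permutation(3)]     # a different visiting order
# same result regardless of visiting order
assert np.allclose(batch_update(w, deltas), batch_update(w, shuffled))
```

A sequential update, by contrast, recomputes each point's neighborhood with already-modified weights, so shuffling the data can change the outcome.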
Methodology
Voting mechanism:
• Simple voting
• Error-weighted voting
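Error-weighted voting can be illustrated with AdaBoost-style member weights α_t = ½·ln((1 − ε_t)/ε_t). Whether the paper uses exactly this weighting is an assumption, but the idea is the same: ensemble members with lower training error get a larger say than in simple majority voting.

```python
import numpy as np

def error_weighted_vote(preds, errs):
    """Combine ensemble predictions with AdaBoost-style weights
    alpha_t = 0.5 * ln((1 - err_t) / err_t): accurate members
    outvote inaccurate ones."""
    alphas = 0.5 * np.log((1.0 - errs) / errs)
    votes = {}
    for p, a in zip(preds, alphas):
        votes[p] = votes.get(p, 0.0) + a
    return max(votes, key=votes.get)

# three members: two weak ones say class 1, one strong one says class 0
print(error_weighted_vote([1, 1, 0], np.array([0.45, 0.45, 0.05])))  # 0
```

Simple voting would return class 1 here; error weighting lets the accurate member (ε = 0.05) overrule the two near-random ones.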
Methodology
Condensing the model size:
• Optimal weights
• Averaged weights
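Averaging the per-instance weights across ensemble members is one way to condense T models into a single warped k-NN model. This sketch assumes each ensemble member is represented by just a weight vector over the training set:

```python
import numpy as np

def condense_by_averaging(weight_sets):
    """Collapse an ensemble of warped k-NN models into one model by
    averaging each training instance's weight across ensemble members,
    so only a single weight vector (not T of them) must be stored."""
    return np.mean(np.asarray(weight_sets), axis=0)

W = [np.array([0.0, 1.0, 2.0]),
     np.array([2.0, 1.0, 0.0])]
print(condense_by_averaging(W))   # [1. 1. 1.]
```

The storage cost drops from T weight vectors plus voting bookkeeping to one vector alongside the training data.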
Experiments
Fig 8. Boosted k-NN with randomized data order.
Fig 9. Boosted k-NN with batch update.
Fig 10. Boosted k-NN with error-weighted voting.
Fig 11. Boosted k-NN with optimal weights.
Fig 12. Boosted k-NN with average weights.
Conclusions
• Boosted k-NN can improve the generalization accuracy of the k-nearest neighbor algorithm.
• The Boosted k-NN algorithm modifies the decision surface, producing a better solution.
Comments
• Advantages
  – The paper describes rich experiments.
• Applications
  – Classification