Mastergoal Machine Learning Environment
Phase III Presentation
Alejandro Alliana
CIS895 MSE Project – KSU
MMLE Project Overview
Provide an environment to create, repeat, and save experiments for developing Mastergoal-playing strategies using machine learning techniques.
Divided into 4 sub-projects: Mastergoal (MG), Mastergoal AI (MGAI), Mastergoal Machine Learning (MMLE), User Interface (UI).
Phase III Artifacts
User Documentation Component Design Source Code (and exe) Assessment Evaluation Project Evaluation References
User Documentation
MMLE UI user manual provided, covering installation, common use cases, data field types and formats, and examples.
MG, MGAI, and MMLE libraries: API documentation generated by Doxygen.
Component Design
Project divided into 4 main sub-projects; the UI uses MMLE, MGAI, and MG as libraries.
Component Design – Design Patterns Used
Factory Method: Board, Search Algorithm.
Prototype: Strategy.
Singleton: all factories, Terms, Fitness Functions, Selection Criteria, Termination Criteria.
Template Method: Terms, Strategy.
Strategy: Search Algorithms – Agents, Strategy – Search Algorithms.
Observer: GameSubject – GameListener, TrainSubject – TrainListener.
Proxy: TrainBridge, UIAgentProxy.
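As an illustration of the Observer pairing named above, here is a minimal C++ sketch of a GameSubject/GameListener relationship. Only the two class names come from the slide; the method names and the MoveRecorder listener are assumptions.

```cpp
#include <algorithm>
#include <string>
#include <vector>

// Hypothetical listener interface; the real project's callbacks may differ.
class GameListener {
public:
    virtual ~GameListener() = default;
    virtual void onMove(const std::string& move) = 0;
};

// Subject that keeps a list of listeners and notifies them of each move.
class GameSubject {
public:
    void attach(GameListener* l) { listeners_.push_back(l); }
    void detach(GameListener* l) {
        listeners_.erase(std::remove(listeners_.begin(), listeners_.end(), l),
                         listeners_.end());
    }
    void notifyMove(const std::string& move) {
        for (GameListener* l : listeners_) l->onMove(move);
    }
private:
    std::vector<GameListener*> listeners_;
};

// Example listener that records every move it is told about.
class MoveRecorder : public GameListener {
public:
    void onMove(const std::string& move) override { moves.push_back(move); }
    std::vector<std::string> moves;
};
```

The same shape applies to the TrainSubject – TrainListener pairing, with training-progress events in place of moves.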
Component Design
Deployment diagrams show packages and files.
Component Design
Deployment diagrams. Class diagrams. Sequence diagrams. Object diagrams. Short description of classes and a link to the online API documentation.
Source Code
Kept in an SVN repository (7 projects): 4 sub-projects and 3 test sub-projects.
Metrics were taken weekly and will be discussed later in the presentation.
Installer and executable: installer created with NSIS (Nullsoft Scriptable Install System).
UI created with the Windows Forms GUI API available in the .NET Framework.
All other sub-projects are coded in (unmanaged) C++ and are available as libraries.
MMLE Demonstration.
Assessment Evaluation
I used the CPPUnit framework to perform unit testing on the projects: MastergoalTest, MastergoalAiTest, Mmle-test.
Assertions used to test for pre- and post-conditions.
I used the Visual Leak Detector system to detect memory leaks.
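A minimal sketch of what assertion-checked pre- and post-conditions look like in C++, using the standard assert macro; the function and its contract here are hypothetical, and the project's actual checks live inside CPPUnit test cases rather than free functions like this.

```cpp
#include <cassert>

// Hypothetical example: clamp a fitness value into a range, with the
// contract guarded by assertions in the spirit of the checks described above.
int clampFitness(int fitness, int lo, int hi) {
    assert(lo <= hi);                      // pre-condition: valid range
    int result = fitness < lo ? lo : (fitness > hi ? hi : fitness);
    assert(result >= lo && result <= hi);  // post-condition: result in range
    return result;
}
```

In a CPPUnit fixture the same checks would appear as CPPUNIT_ASSERT calls inside a registered test method.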
Assessment Evaluation
Test Plan: all tests passed.*
CPPUnit regression bugs. Coding of test cases. Documenting and debugging test cases. Memory-leak bugs.
Assessment Evaluation – Defect Log.
Size of the test projects: overall, the three test projects have 1,125 lines of code.
[Chart: Size of each Project (LOC) – mastergoal, mastergoal-test, mastergoalAI, mastergoalAITest, MMLE, MMLE-test, UI; y-axis 0–5,000 LOC]
Assessment Evaluation - Metrics
[Chart: Sum of Delta_Min per workflow phase (Assessment) – Elaboration: 155; Implementation: 6,141]
Assessment Evaluation - Metrics
[Chart: Sum of Delta_Min per workflow task (Assessment) – Testing: 5,530; Test Plan: 155; Test Assessment: 136; Documentation: 475]
Project Evaluation
Source code metrics.
[Chart: weekly source-code metrics, 1/16/2008 to 4/30/2008 – series: NOM, LOC, COM; y-axis 0–14,000]
Project Evaluation
[Chart: Size of each Project (LOC) – mastergoal, mastergoal-test, mastergoalAI, mastergoalAITest, MMLE, MMLE-test, UI; y-axis 0–5,000 LOC]
Project Evaluation
Metrics: 533 hours (13.3 weeks, or 3 months over a period of 10 months) and 11 KLOC.
Estimations:
FP: time 10.79 months, 2.79 KLOC.
COCOMO: time 9.24 months, 7.5 KLOC.
COCOMO II: time 9.54 months, 7.5 KLOC.
Project Evaluation - FP
Real: 11 KLOC and 3 months. Function Point estimates: size 2.79 KLOC, time 10.79 months.
Sources of error: lack of experience using FP; some of the user interfaces were more complex than previously thought; no .NET conversion rates; a big part of the project is the user interface, which contains automatically generated code; algorithms not well represented.
Project Evaluation - COCOMO
Real: 11 KLOC and 3 months. COCOMO estimates: size 7.5 KLOC, time 9.25 months (the size input is arbitrary, based on experience).
Sources of error: inexperience in C++/.NET; conversion rates of the languages.
COCOMO II estimates: Application Composition model, 5.57 person-months (object points / productivity); Post-Architecture model, 9.54 months (7.5 KLOC).
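For reference, a sketch of the basic (organic-mode) COCOMO formulas behind estimates like those above. The constants a=2.4, b=1.05, c=2.5, d=0.38 are Boehm's published organic-mode coefficients; the slide's figures come from calibrated model variants with cost-driver settings not listed here, so these formulas will not reproduce them exactly.

```cpp
#include <cmath>

// Basic COCOMO, organic mode: effort in person-months from size in KLOC.
double cocomoEffortPM(double kloc) {
    return 2.4 * std::pow(kloc, 1.05);
}

// Basic COCOMO schedule: calendar months from effort.
double cocomoScheduleMonths(double kloc) {
    return 2.5 * std::pow(cocomoEffortPM(kloc), 0.38);
}
```

For the 7.5 KLOC size used on this slide, the basic organic model gives roughly 20 person-months of effort and a schedule of about 7.8 calendar months.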
Project Evaluation - Time spent at each phase.
[Chart: Sum of Delta_Min per phase – Elaboration: 4,814; Implementation: 21,473; Inception: 5,694]
Project Evaluation
[Chart: Sum of Delta_Min per workflow – Assessment: 6,296; Design: 4,144; Environment: 598; Implementation: 14,935; Infrastructure: 12; Management: 2,686; Planning: 384; Requirements: 645; Research: 2,267; (blank): 14]
Project Evaluation - Lessons Learned
Implementation: C++ language, memory management, implementation of design patterns.
Tools and libraries (NSIS, CPPUnit, VLD, Doxygen).
Design: design patterns.
Project Evaluation - Lessons Learned
Experience with various estimation models. Measurement tools (CCCC, Process Dashboard).
Testing: CPPUnit framework, VLD.
Process: iterative process, artifacts.
Project Evaluation - Future work
Improve performance of the search algorithm and add new algorithms.
Add more functionality to the game-playing library and UI.
Add more selection mechanisms to the GA experiments.
Add more learning algorithms.
Distributed computation to speed up training.
Refactoring of some classes.
Add test classes for each feature.
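As a hint of what an additional GA selection mechanism could look like, here is a tournament-selection sketch in C++; the fitness representation and parameters are assumptions for illustration, not the project's actual code.

```cpp
#include <cstddef>
#include <random>
#include <vector>

// Tournament selection: sample tournamentSize individuals uniformly at
// random (with replacement) and return the index of the fittest one.
std::size_t tournamentSelect(const std::vector<double>& fitness,
                             std::size_t tournamentSize,
                             std::mt19937& rng) {
    std::uniform_int_distribution<std::size_t> pick(0, fitness.size() - 1);
    std::size_t best = pick(rng);
    for (std::size_t i = 1; i < tournamentSize; ++i) {
        std::size_t challenger = pick(rng);
        if (fitness[challenger] > fitness[best]) best = challenger;
    }
    return best;
}
```

Larger tournament sizes raise selection pressure toward the fittest strategies, which is one knob such an experiment could expose.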
Project Evaluation Results
[Chart: Learning Level I – max and avg fitness over 25 generations; fitness axis 0–1.2]
Project Evaluation Results
[Chart: Learning Level II – max and avg fitness over 25 generations; fitness axis 0–0.7]
Project Evaluation Results
[Chart: Learning Level III – max and avg fitness over 7 generations; fitness axis 0–0.7]
Tools Used: MS Visual Studio 2005, BoUml, Rational Software Architect, NSIS (Nullsoft Scriptable Install System), CCCC (C and C++ Code Counter), TinyXML, Visual Leak Detector, Doxygen, Process Dashboard, TortoiseSVN.
References
Design Patterns: Elements of Reusable Object-Oriented Software. Gamma, Erich; Richard Helm; Ralph Johnson; John Vlissides (1995). Addison-Wesley. ISBN 0-201-63361-2.
Machine Learning. Tom Mitchell. McGraw Hill, 1997. ISBN 0-07-042807-7.
BoUML: http://bouml.free.fr/
Rational Software Architect: http://www-306.ibm.com/software/awdtools/architect/swarchitect/
TinyXML: http://www.grinninglizard.com/tinyxml/ and http://sourceforge.net/projects/tinyxml/
NSIS (Nullsoft Scriptable Install System): http://nsis.sourceforge.net/Main_Page
CCCC (C and C++ Code Counter): http://sourceforge.net/projects/cccc
CPPUnit: http://cppunit.sourceforge.net/doc/lastest/cppunit_cookbook.html
Visual Leak Detector: http://dmoulding.googlepages.com/vld
Doxygen: http://www.stack.nl/~dimitri/doxygen/
Process Dashboard: http://processdash.sourceforge.net/
TortoiseSVN: http://tortoisesvn.tigris.org/