Post on 08-Dec-2014
BWM Product Development Process
Target Process Model – Single Team

[Timeline, June–November: "Choosing, Planning, Building & Releasing" for three overlapping product cycles. Each product is chosen ("Choose Product 2", "Choose Product 3"), planned ("Plan Product 1–3"), built in planned sprints a–d with a demo at the end of each sprint ("Plan Sprint a–d", "Demo Sprint a–c"), and then released ("Release Product 1–3").]

• Each arrow represents a scheduled event
• Weekly status meeting for each team
• Executives must participate in the "Choose" meeting
Target Process Model – Multiple Teams

Weekly status meetings are held for each team:
• Email team
• Operations team
• Merchant/Backend team
• Partner services team
• Consumer web team

• Products developed in parallel
• Value focus built into the organizational structure (team value stream)
• Teams more focused on the value stream than on functional specialization
Product Focus Areas – (Map to Teams)

Partners
• Partner platform
• API
• Widgets
• Co-branded experience
• Portal
• Social media apps
• SEO platform
• Landing page testing platform

Consumers
• Core web experience
• Deal experience
• Account management
• Offer dashboard
• Secondary offer market
• Loyalty/retention programs
• Mobile/smartphone experience

Merchants
• Merchant portal
• ROI calculator & deal templates
• Deal execution
• Deal postmortem
• Redemption technologies
• POS integration

Operations
• Deal production platform
• Admin platform
• Sales support
• CS support
• Business intelligence
• Offer science
• Email communications

Teams
Teams & Emphasis
• Right now we have
  – Team 1 – core consumer experience
  – Team 2 – operations
  – Team 3 – merchant experience/backend
• As we grow our team size through March …
  – Team 1 – deal merchandising, experience, & fulfillment
  – Team 2 – operations
  – Team 3 – backend
  – Team 4 – consumer experience, social, & mobile access
  – Team 5 – merchant experience
• Teams consist of at least 1 engineer, ½ designer, and ½ product manager
• Lean, feedback-driven approach based on 2-week iterations
Target – team composition
•Teams are cross-functional
•Teams are not ‘owned’ by an individual
•Teams have the freedom to choose approaches, but not objectives.
•Teams are accountable for the results of their choices
• Product management – (opportunity & solution identification, product performance against KPIs)
• Visual and interactive design – (usability analysis, solution identification & definition)
• Software development – (solution definition, design, & construction)
• Quality assurance – (solution definition & correctness)
• Process facilitation – (process effectiveness oversight)
Target – governance
• Teams will publish a calendar with fixed product & sprint completion dates.
• Product release dates are refined as sprints are completed. If release dates are fixed, scope is managed appropriately.
• Each team will publish a calendar with fixed meeting dates.
• Weekly sprint progress will be published
• Executives will be invited to choice meetings
• Each team will conduct demonstrations of built functionality at the end of every sprint (~2 weeks) and prior to release
• Information published using SharePoint (or another document management system)
Target – portfolio & roadmap
• Manage product roadmaps & portfolio based on available teams
  – Each product category or value stream requires a roadmap
  – Each team requires a roadmap
  – If there are fewer teams than product categories, group product categories and combine roadmaps
• Mapping teams to value streams allows the team to drive against quantifiable success criteria associated with the value stream
  – e.g. visitor-to-subscriber conversion
  – e.g. # purchases per session per subscriber
  – e.g. # of purchased *offer science* recommended deals
  – e.g. # of subscribers per API partner
Partner Services & API
Objective
Provide the necessary platform to:
1. Extend the BWM marketplace into partner properties and communities
2. Syndicate deals
3. Customize the core consumer experience for partner community members

KPIs
• Maximize [# subscribers] per partner per unit time
• Maximize [ARPS ($ per sub)] per partner per unit time
• Minimize [integration time] per partner

Areas of influence/roadmap
• API
• Widgets & JavaScript/HTML objects
• Partner tools & analytics
• Automated partner signup
• Partner deal management capabilities

Team
• Product expert – <tbh>
• Interactive/visual designer – <tbh>
• Software development – <tbh>, <tbh>
• ½ Quality assurance – <tbh>
Consumer Web Experience
Objective
Create an amazing web-based experience for users that:
1. Showcases deals elegantly
2. Creates user opportunities for new life-style experiences
3. Puts the right deal in front of the right user at the right time

KPIs
• Maximize [# subscribers LTV]
• Minimize [# subscribers churn/month] ratio
• Maximize [# purchases/subscriber] ratio
• Maximize [csat/net promoter score per user] ratio

Areas of influence/roadmap
• Deal discovery, experience, & participation
• Community experience
• Account management
• Offer dashboard
• Secondary market

Team
• Product expert – <tbh>
• Interactive/visual designer – <tbh>
• Software development – <tbh>, <tbh>
• Quality assurance – <tbh>
Merchant Services
Objective
Create tools & services to provide merchants:
1. Easy ways to predict campaign value
2. Amazing campaign production tools
3. Analytics for assessing campaign performance
4. Consumer insight & marketing tools

KPIs
• Maximize [# merchants]
• Maximize [# deals/merchant] ratio
• Maximize [commission $/deal] ratio
• Maximize [merchant LTV]

Areas of influence/roadmap
• Merchant portal
• BoostYourBusiness and associated web collateral
• Referral features
• Conversion funnels

Team
• Product expert – <tbh>
• Interactive/visual designer – <tbh>
• Software development – <tbh>, <tbh>
• ½ Quality assurance – <tbh>
Backend & Admin Services
Objective
Provide the tools and services necessary to support effective and efficient BWM business operations

KPIs
• Minimize [time to produce deal]
• Minimize [time to respond to customer inquiry]

Areas of influence/roadmap
• Sales production tools
• Customer service tools
• Financial operations support

Team
• ½ Product expert – <tbh>
• Software development – <tbh>, <tbh>
• ½ Quality assurance – <tbh>
Business Intelligence
Objective
Develop and report on a collection of measures that provide meaningful insight into the health and performance of the business, and that support the marketing, merchandising, sales, product, & executive functions

KPIs
• Maximize [# positive-outcome decisions / # published KPIs] ratio

Areas of influence/roadmap
• Data warehouse
• Web analytics
• Other internal & external data sources relevant to our business operations

Team
• Database/report development – <tbh>, <tbh>
• Software development – <tbh>
• Business analytics – <tbh>
Email & Subscriber Communications
Objective
• Drive member engagement through active communication of relevant deals
• Maximize the value of each subscriber communication
• Leverage email to extend base service offerings

KPIs
• Maximize [commission $/1000 sends] ratio
• Maximize [# clicks/1000 sends] ratio
• Maximize [# opens/1000 sends] ratio
• Maximize [# inboxed/1000 sends]

Areas of influence/roadmap
• Transactional emails
• Email campaigns
• Email management tools
• Deliverability management tools
• Data extraction associated with email campaigns

Team
• CRM expert – <tbh>
• Visual designer – <tbh>
• Software development – <tbh>, <tbh>
• ½ Quality assurance – <tbh>
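The per-1000-sends KPIs above all reduce to one normalization. A minimal sketch in Python; the campaign numbers and field names are hypothetical, for illustration only:

```python
def per_1000_sends(count: float, sends: int) -> float:
    """Normalize a raw event count (or dollar amount) to a per-1000-sends rate."""
    if sends <= 0:
        raise ValueError("sends must be positive")
    return count / sends * 1000

# Hypothetical campaign totals (not from the deck).
sends, opens, clicks, commission = 250_000, 50_000, 7_500, 12_500.0

kpis = {
    "opens/1000 sends": per_1000_sends(opens, sends),          # 200.0
    "clicks/1000 sends": per_1000_sends(clicks, sends),        # 30.0
    "commission $/1000 sends": per_1000_sends(commission, sends),  # 50.0
}
print(kpis)
```

Tracking the same ratios per campaign lets the team compare campaigns of very different sizes on equal footing.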
Core Services
Objective
Support must-do and discretionary short-term product releases that support ongoing business operations on a repeating 2-week release cycle. Proactively recommend optimizations against key metrics across ALL business concerns.

KPIs
• Maximize [# releases/month]
• Maximize [# items/release]
• Minimize [# bugs/release]
• Maximize [$ value/release]

Areas of influence/roadmap
• Pixel and landing page optimizations
• Bug fixes & enhancements
• Admin app support
• Web production & publishing
• Payment system changes

Team
• Product expert – <tbh>
• Interactive/visual designer – <tbh>
• Software development – <tbh>, <tbh>
• ½ Quality assurance – <tbh>
Choosing & defining
• Ideas come from everywhere
  – Product roadmap
  – Technology roadmap
  – Business development
  – Customer feedback
  – Funnel optimization analysis
  – Metric review
  – Competitive research
  – Prototyping and brainstorming
• How do we decide which ones are the most valuable?
• How do we decide which ones to do?
Choosing what to do
• Release targets are fixed and occur on a recurring basis for each team
• This schedule puts pressure on the ‘ideation process’ (product identification and prioritization)
• Each team must have a “backlog” of product ideas ready to build
• The process of generating, capturing, and evaluating ideas must be as disciplined and normal as the process of implementing them
Choosing what to do
[Diagram: the sprint & release cycle (building) — sprints 1–4 producing releases 1–3 — runs in parallel with the identification & evaluation cycle (choosing) — plan release 2, plan release 3, plan release 4 — with a concept doc feeding each planning step.]
Choice Cycle
• Teams are empowered to work with executive management in order to pick the products or projects that will have the greatest impact on their value stream
• During a release cycle, the team’s product manager must identify and evaluate 2-5 items as candidates to be built during the next release
• Identification primarily involves picking opportunities to realize or problems to solve
Identifying
• Opportunities or problems to address can come from at least the following places
  – Roadmap
  – Internal business intelligence
  – Business development opportunity
  – Competitive research & analysis
  – Market study research & analysis
  – Customer feedback
  – *intuition*
• An opportunity is identified once a 'concept document' has been written
  – A 1–2 page document with an opportunity description (quantitative), an objective section, and a candidate solution (with cost) section
  – Should provide enough information to evaluate and prioritize the item
Identifying
• Choosing what to do is a critical decision process at the end of which a commitment is made to invest time and money into an endeavor with an uncertain outcome
• Consequently, the team must be joined by the appropriate set of executive stakeholders when choosing what items to tackle in the upcoming release
• Choosing and prioritization occurs at the “concept review meeting” which takes place at least 1 week prior to the end of the current release
Choosing
• In the concept review meeting the concept docs for the candidate initiatives are presented and discussed
• At the end of the concept review meeting, 1 or more projects are chosen to be built in the upcoming release.
  – If more than 1 project is chosen, a priority order is defined
• Multiple methods for choosing may be employed
  – Simple ROI
  – NPV
  – IRR
  – Scorecards
  – Qualitative analysis
Choosing
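Two of the quantitative choosing methods above, simple ROI and NPV, reduce to a few lines; a sketch in Python with hypothetical concept-doc cash flows (IRR would additionally need a root-finder):

```python
def simple_roi(gain: float, cost: float) -> float:
    """Simple return on investment: (gain - cost) / cost."""
    return (gain - cost) / cost

def npv(rate: float, cashflows: list[float]) -> float:
    """Net present value; cashflows[0] occurs now (usually the negative build cost)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical candidates: upfront build cost followed by expected monthly returns.
candidates = {
    "partner API v2": [-40_000, 15_000, 15_000, 15_000],
    "loyalty program": [-25_000, 5_000, 10_000, 12_000],
}

# Rank candidates by NPV at a 1% monthly discount rate, best first.
ranked = sorted(candidates, key=lambda n: npv(0.01, candidates[n]), reverse=True)
print(ranked)  # → ['partner API v2', 'loyalty program']
```

The ranking, not the absolute dollar figure, is what the concept review meeting consumes; scorecards and qualitative analysis can then adjust the ordering.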
• Once a project has been chosen it must be ‘minimally’ elaborated before the start of the next release.
• Elaborating a project is known as ‘release planning’ or creating the ‘product backlog’
• It is very different from creating a PRD; it is a simple list
• With a ‘product backlog’ the objective is not completeness
• Items in a product backlog are called ‘user stories’ or ‘features’
Defining
Defining
• User stories or features are simple descriptions of application behavior or user interactions
• They are intended to be place holders for further conversation
• Each must have a unique value and a time estimate
Sample product backlog columns:
Priority | Feature | Value points | Effort points | How to demonstrate | Notes
Defining
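The backlog columns above (priority, feature, value points, effort points, how to demonstrate, notes) map naturally onto a small record type. A sketch in Python; the story data is hypothetical:

```python
from dataclasses import dataclass

@dataclass
class UserStory:
    """One product-backlog row, mirroring the slide's columns."""
    priority: int
    feature: str
    value_points: int
    effort_points: int
    how_to_demonstrate: str = ""
    notes: str = ""

# Hypothetical backlog entries.
backlog = [
    UserStory(2, "Account history", 3, 5),
    UserStory(1, "Purchase flow", 8, 13, "Complete a test purchase"),
]
backlog.sort(key=lambda s: s.priority)  # highest priority (lowest number) first
print([s.feature for s in backlog])  # → ['Purchase flow', 'Account history']
```

Because the backlog is "a simple list", a flat record per story is enough; no PRD-style nesting is needed.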
Planning
• Releases are divided into sprints
• During sprints, the team decides how many features or user stories it believes it can accomplish in a fixed amount of time
• During sprints, the team elaborates the features covered in the sprint
Sample release plan: a release of 29 features split into Sprint 1 (12 features, 4 weeks) and Sprint 2 (17 features, 4 weeks).
Releases
Building
Process model

[Scrum diagram:
• Product backlog, ordered by priority (priority 1, 2, 3, …)
• Backlog features are assigned to an iteration; backlog items are expanded by the team
• The team pulls the top item, then defines, builds, and tests it; on failure it evaluates and pulls the next item
• A 30-day sprint, with a 15-minute daily scrum every 24 hours in which team members respond to the basics: What did you do since the last meeting? Any obstacles? What will you do before the next meeting?
• New functionality is demonstrated at the end of the iteration]
Scrum
• The heart beat of agile product development is the ‘sprint’
• A release consists of 1 or more sprints
• The team delivers an integrated, release ready set of features at the end of a sprint.
• Features and user stories built during a sprint are taken from the ‘product backlog’
• A sprint is composed of the highest priority items from the backlog that the team believes can be accomplished within the sprint time frame
Sprinting/Iterating
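The composition rule above — the highest-priority backlog items the team believes it can finish within the sprint time frame — can be sketched as a greedy selection. The backlog contents and capacity below are hypothetical:

```python
def plan_sprint(backlog, capacity_points):
    """Walk the backlog in priority order, taking each feature whose effort
    still fits under the team's capacity and skipping any that would not."""
    chosen, load = [], 0
    for priority, feature, effort in sorted(backlog):  # lowest number = highest priority
        if load + effort <= capacity_points:
            chosen.append(feature)
            load += effort
    return chosen, load

# Hypothetical backlog: (priority, feature, effort points).
backlog = [(1, "purchase flow", 13), (2, "account history", 5),
           (3, "offer dashboard", 8), (4, "secondary market", 20)]
print(plan_sprint(backlog, capacity_points=25))
# → (['purchase flow', 'account history'], 18)
```

In practice the "capacity" number is the team's measured velocity from previous sprints, and the team may prefer a strict cutoff (stop at the first item that does not fit) to preserve priority order exactly.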
• Sprints are always time boxed
• At the beginning of a sprint the team engages in ‘sprint planning’ and associates tasks (each task has ‘effort points’) with each user story or feature in the sprint
• The most important user stories or features are built first
• Features are defined, built, and tested by the team one at a time
• Once a sprint is fixed, only the team can add new features
Sprinting
• The daily standup or ‘scrum’ is the primary vehicle for team communication
• The goal is for the team to remove any obstacles preventing it from succeeding in delivering the sprint
  – Feature ambiguity or complexity
  – Technical ambiguity or complexity
  – Lack of critical information
  – External distractions
  – Momentum problems
• The daily standup meeting takes place every day and must be attended by all members on the team
Daily Stand-up
• Roles
  – Scrum facilitator – facilitates the scrum
  – Product expert – defines and refines application behavior
  – Team – realizes application behaviors
• The daily standup is a realization of 'empirical process control'
  – Visibility – intermediate process steps are visible
  – Inspection – intermediate results can be evaluated
  – Adaptation – changes can be made as a result of visibility & inspection
• In this way the product solution can be achieved through a series of small activities, each of which can be measured against objectives
Daily Stand-up
• Story boards and burn-down charts are the 'vehicles' upon which the sprint and daily stand-ups rely
• Story boards list user stories or features on index cards and categorize them as 'not started', 'in progress', & 'completed'
• Through the course of the sprint, index cards migrate from 'not started' to 'completed'
• The burn-down chart captures this progress in a (hopefully downward-sloping) two-dimensional graph
Story Boards & Burn-down Charts
Story boards

[Two story-board snapshots, each with columns Not Started / In Progress / Completed. On sprint day 1, cards such as 'Purchase flow' and 'Account History' sit in Not Started. By sprint day 22, the cards — Purchase flow >1 trk, Purchase flow 1 trk, Buy btn – visit mode, Mar com – splash pg, Omniture tracking, Buy btn – LFM mode, Account History — have migrated across the board.]
Story Boards
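The card migration described above is a tiny forward-only state machine. A minimal sketch in Python; the card names echo the example board, and the helper is hypothetical:

```python
# Story-board columns as an ordered list of states; index cards only
# move forward: 'not started' -> 'in progress' -> 'completed'.
STATES = ["not started", "in progress", "completed"]

def advance(card_states: dict, feature: str) -> None:
    """Move one card to the next column (no-op if already completed)."""
    i = STATES.index(card_states[feature])
    if i < len(STATES) - 1:
        card_states[feature] = STATES[i + 1]

board = {"Purchase flow": "not started", "Account History": "not started"}
advance(board, "Purchase flow")  # -> in progress
advance(board, "Purchase flow")  # -> completed
print(board)
```

Counting the effort points of cards not yet in 'completed' each day is exactly the series the burn-down chart plots.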
Burn down chart – Sprint 2

[Line chart: effort points remaining (y-axis, 0–80) vs. days in sprint (x-axis, 0–13). Remaining points by day: 63, 74, 68, 64, 56, 49, 41, 31, 29, 32, 32, 32, 32, 32.]

Sprint 2 is a two-week sprint which started with 11 features and 63 effort points.
Do we have a problem?
Burn-down Charts
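The slide's own data suggests the answer: the line has flattened at 32 points with the sprint nearly over. A minimal sketch of detecting such a stall programmatically (the `burn_rate` helper is an assumption, not part of the deck's process):

```python
# Remaining effort points per day, taken from the Sprint 2 burn-down slide.
remaining = [63, 74, 68, 64, 56, 49, 41, 31, 29, 32, 32, 32, 32, 32]

def burn_rate(points, last_n=5):
    """Average points burned per day over the last `last_n` day-to-day changes."""
    deltas = [points[i] - points[i + 1] for i in range(len(points) - 1)]
    recent = deltas[-last_n:]
    return sum(recent) / len(recent)

rate = burn_rate(remaining)
# A sprint is stalled when work remains but nothing is burning down.
stalled = rate <= 0 and remaining[-1] > 0
print(f"recent burn rate: {rate:.1f} pts/day, stalled: {stalled}")
# → recent burn rate: -0.6 pts/day, stalled: True
```

A flat or rising tail like this is exactly the signal the daily stand-up exists to surface early, while there is still time to cut scope or remove the obstacle.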
Testing & releasing
• Traditionally, test and development organizations are separate
• Traditionally, development hands off large amounts of theoretically working, but largely untested code to a test organization
• With agile, testing and development are peas in a pod
• With agile, tests are developed concurrently with coding and requirements generation
Testing in Agile is Different
• Features or user-stories are validated as they move from ‘not started’ to ‘completed’
• The define/build/test cycle occurs on a feature by feature basis
• The embedding of a ‘test driven’ mentality throughout the whole ‘build’ process results in systems which are built to be verified
• This leads to higher overall quality
Testing in Agile is Different
• All code is tested code
• Teams get no credit for delivering functionality that has been coded but not tested
• Tests are written before, or concurrently with, the code itself
• Testing is a team effort; testers, developers, and product experts all write tests
• Automation is the rule, not the exception
Agile Testing Principles
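As a concrete flavor of "tests are written before, or concurrently with, the code itself", here is a self-checking example; the deal-pricing function and its rules are hypothetical, not taken from the deck:

```python
def deal_price(list_price: float, discount_pct: float) -> float:
    """Price a deal; the discount must be between 0 and 100 percent."""
    if not 0 <= discount_pct <= 100:
        raise ValueError("discount_pct out of range")
    return round(list_price * (1 - discount_pct / 100), 2)

# Unit tests written with the code; they pass or fail against this build.
assert deal_price(100.0, 50) == 50.0
assert deal_price(19.99, 0) == 19.99

# Boundary condition: a discount over 100% must be rejected.
try:
    deal_price(10.0, 120)
except ValueError:
    pass  # rejected, as the test expects
else:
    raise AssertionError("expected ValueError for discount > 100%")

print("all unit tests pass")
```

Under the principles above, the function earns no "credit" until these checks run green in the automated build.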
Agile testing strategy

Unit testing
• Developers write unit tests for every module and method
• Each unit test returns 'pass'/'fail' against a specific build
• All unit tests must pass before code is checked into source control
• Automated unit tests are run frequently against an integrated build

Component testing
• Developers and testers write component-level tests
• Component tests are run during the sprint to ensure the "system still runs"
• New component tests are linked into the automated regression test suite

Acceptance testing
• Testers/product experts write functional or acceptance tests for each feature
• Acceptance tests are elaborated and written during sprint planning and execution
• Acceptance tests are run during the sprint and serve as check points for features
• Acceptance tests are automated where possible

System & performance testing
• Automated builds assemble all system components into a daily system build
• Unit, component, acceptance, and regression tests are run against the daily build
• Developers and testers create performance and stress tests to identify boundary conditions
• These tests are run daily
Measuring Success
• Success in a world of value stream orientation is the creation of value
• Each team is successful as a unit if it positively impacts its KPIs
• Working software products that conform to their intended purpose is the first necessary condition for success
• Discovering the right products through frequent release and measurement is the other necessary condition for success
Success
• Every member of the team is exposed to the performance of the products or projects they are building through a team dashboard
• Success is NOT
  – Completing the step in a work order
  – A PRD or a technical specification
  – Adherence to internal milestone dates
  – A wireframe
  – A complete test plan
• Positive performance of products is the primary measure of success for the entire team
Success
• Although the primary measures of success are direct product performance metrics, other internal efficiency and effectiveness measures are valuable
• Project metrics allow the team to perform project kaizen (to get better at doing)
• There are two kinds of project metrics
  – Sprint assessment metrics
  – Release assessment metrics
Metrics
Sprint assessment metrics (per project team; tracked for each sprint)

Functionality
• # stories (loaded at beginning of sprint)
• # accepted (defined/built/tested & accepted)
• % accepted
• # pushed to next sprint
• # not accepted: deferred to later date
• # not accepted: deleted from backlog
• # added during sprint (should be 0)

Quality and test automation
• % stories with tests
• Defect count at start of iteration
• Defect count at end of iteration
• # new test cases
• # test cases automated
• Total automated tests
Release assessment metrics (per project team; tracked for each release)

Value delivery
• # features delivered in release
• # feature value points
• Planned release date
• Actual release date

Architecture and feature debt
• # refactorings completed
• Refactoring backlog (total identified refactor targets)
• Customer debt (promised features) delivered
• Total customer debt features
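The sprint-level functionality rows are simple counts and one ratio over them. A minimal sketch; the numbers are hypothetical:

```python
def sprint_metrics(loaded: int, accepted: int, pushed: int, added: int = 0):
    """Compute the functionality rows of the sprint assessment table."""
    return {
        "# stories": loaded,
        "# accepted": accepted,
        "% accepted": round(100 * accepted / loaded, 1) if loaded else 0.0,
        "# pushed to next sprint": pushed,
        "# added during sprint (should be 0)": added,
    }

# Hypothetical sprint: 12 stories loaded, 9 accepted, 2 pushed, 1 added mid-sprint.
print(sprint_metrics(loaded=12, accepted=9, pushed=2, added=1))
```

Tracked sprint over sprint, a falling "% accepted" or a non-zero "added during sprint" is the numeric trigger for project kaizen.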
• Project metrics help individual teams improve their own performance at the project level
• Process metrics provide global insight into process effectiveness across six dimensions
  – Product management capability
  – Release planning and tracking capability
  – Sprint planning and tracking
  – Team effectiveness
  – Testing practices
  – Development practices/infrastructure
Process Metrics
Software agility team self-assessment (score each item 0–5, with comments)

Product management
• Backlog prioritized & ranked by biz value
• Backlog estimated at gross level
• Product expert defines acceptance criteria for stories
• Product expert and stakeholders participate at sprint & release planning
• Product expert and stakeholders participate at sprint & release review
• Product expert collaboration with team is continuous
• Stories sufficiently elaborated prior to planning meetings
Total Product Management Score

Release planning & tracking
• Release theme established and communicated (concept doc)
• Release planning meeting attended and effective
• Product backlog defined
• Product backlog ranked by priority
• Team has small and frequent releases
• Team has common language and metaphor to describe release
• Release progress tracked by feature acceptance
• Team completes and product expert accepts by release date
• Release review meeting attended and effective
• Team inspects and adapts the release plan after each sprint
Total Release Planning & Tracking Score

Sprint planning & tracking
• Sprint theme established and communicated
• Sprint planning meeting attended and effective
• Team velocity measured and used for planning
• Sprint backlog defined & ranked by priority
• Team develops and manages sprint backlog
• Team defines, estimates, and selects its own work (stories and tasks)
• Team discusses acceptance criteria during sprint planning
• Team manages interdependencies and constraints
• Progress tracked with burn-down chart
• Work is not added by the product expert during the sprint
• Sprints are of a consistent fixed length
• Sprints are no more than 4 weeks in length
• Team inspects and adapts the sprint plan
Total Sprint Planning & Tracking Score

Team effectiveness
• Team is completely cross-functional
• Team is collocated
• Team is 100% dedicated to release
• Team is smaller than 15 people
• Team works at sustainable pace
• Team members complete commitments
• Daily stand-up is fully attended
• Team leads communication; communication not managed
• Team has effective channel for obstacle escalation
• Team self-polices and reinforces use of agile practices
Total Team Effectiveness Score

Testing practices
• All testing is done within the sprint and does not lag behind
• Sprint defects are fixed within that sprint
• Unit tests are written before development
• Acceptance tests are written before development
• Acceptance tests are automated
Total Testing Practices Score

Development practices/infrastructure
• Source control used effectively (branching & merging)
• Continuous build process automated
• Developers integrate code multiple times daily
• Team has administrative control over dev environment
• Team permitted to refactor code base
• Effective code review practices
• Coding standards exist and are applied
• Refactoring is continuous
• Pair programming practiced
• Identical build for developers' environments
Total Development Practices Score
Sharing Information
• <Discuss SharePoint>
  – <per team>
    • <performance metrics>
    • <competitive/market analysis & artifacts>
    • <Exploratory comps/wireframes/descriptions etc>
    • <backlog>
    • <release plan>
    • <weekly sprint progress – via backlog>
Publication
• <outline recurring meetings>
  – <commitment meeting>
  – <release planning meeting>
  – <sprint planning meeting>
  – <sprint delivery meeting>
  – <pre-release meeting>
  – <monthly open-forum area performance meeting>
Recurring Meetings
• <Define how teams bubble status to exec committee on weekly basis – pre operations meeting>
  – <Multi-tabbed spreadsheet (ala ru)>
Status
Governance
• <characterize appropriate involvement levels>
  – <examples of *good* involvement>
  – <examples of *bad* involvement>
  – <emphasize persuasion & coaxing & evangelizing as *good*>
  – <emphasize explicit direction & imperatives as *bad*>
• <discuss pathologies that arise as a result of this technique>
Stakeholder Involvement
• Iterative process
  – People arrive late to meetings
  – Meetings take too long; people become bored and devalue meetings
  – Scrum master dictates design decisions or micromanages
  – Teams are too large for daily scrum and sprint planning
  – Teams do not report task remaining time for burn-down analysis
• People practices
  – Individuals interrupted by non-sprint-based work
  – Team members isolated physically
  – Team members not accountable for sprint commitments
  – Individuals multiplexed across too many teams
Process Impediments
• Product engineering practices
  – Resources necessary for definition/design/implementation/testing not present on team
  – Sprints do not fully implement & test deployable increments of customer-valued features
  – Product owner not available/integral to team
  – System integration not forced at each sprint
  – Product owner won't split up large product backlogs
  – Features introduced into sprints after sprints begin
• Organizational issues
  – Software process police regulate to an ineffective process
  – Management assumes fixed-cost, fixed-scope delivery
  – Individual rather than team behavior rewarded
  – Rules or capitalization structures demand waterfall
  – Teams not collocated to maximum extent feasible
Process Impediments