Adaptive Development Methodology

Post on 11-May-2015


Salesforce.com deck describing the Adaptive Development Methodology (ADM).

Transcript of Adaptive Development Methodology

Adaptive Development Methodology

Overview Outline

History

ADM Overview

ADM Principles & Mechanics

The Backstory

Fast

Innovative

Successful

Growing

7 years later…

35,000+ customers

900,000+ subscribers

100+ Million transactions per day

200+ people in Technology!

But…uh oh…

It’s getting harder…

…to get things done…

…so what’s the deal?

Waterfall process

Unpredictable

Delayed releases

Velocity slowdown

No visibility

Late feedback

Technical Debt

Death march

Loss of cred

Over budget

Scope creep

…so what’s the deal?

Waterfall process

Team frustration


Not good…

We can do better…

ADM: Elegant…

…and a little messy

Overview Outline

History

ADM Overview

ADM Principles & Mechanics

Core Values

KISS

Listen to your customers

Iterate

What is ADM?

ADM is a modified Scrum/XP style of product development that is specific to Salesforce. It employs the Scrum project management framework and adopts certain XP practices.

What is ADM?

Re-factoring

Self-organizing

Predictable releases

Transparent

Ftest - Selenium

Continuous integration

Debt free

Just-in-time

Iterative

Always Potentially Releasable

Time-boxed

User stories

Agile

Lean

Early feedback

Code Reviews

Collective Code Ownership

Self-correcting

What is Scrum?

An agile project management framework for developing software

Simple

Prioritized work

Time-boxed, 30-day sprints

Self-organized, empowered teams

Daily, verbal communication

Potentially “production quality” every 30 days

What is Scrum?

Eliminates waste

Increases throughput

Provides transparency

What is Scrum?

Overview Outline

History

ADM Overview

ADM Principles & Mechanics

Scrum Lifecycle

Daily Scrum Meeting

Sprint Review: Demo Potentially Releasable New Functionality

Product Backlog

Sprint Backlog

Retrospective

24 Hours

2 - 4 Weeks

The Scrum Team

QE Engineer

Developer

Developer

QE Engineer

Developer

Tech Writer

UE Designer

Product Owner

Roles: Product Owner

Single throat to choke

Fully accountable for the success or failure of the scrum team

Roles: Product Owner

Owns and prioritizes Product Backlog

Leverages team to break down Product Backlog

Creates Release Backlog by targeting priority Product Backlog

Directly drives development

Fully engaged

Roles: ScrumMaster

Ensures Scrum Team lives by the principles and practices of Scrum

Removes obstacles

Coach

Roles: ScrumMaster

Protects team from external influences

Improves productivity of team so each user story is potentially releasable

Keeps progress information up-to-date and visible to all

Facilitates Daily Meetings

Roles: Scrum Team

Cross-functional team

Has tasks on the Sprint Backlog

Self-organizing, self-correcting: the team decides the best way to deliver

Makes their own commitment with the resources available, decides how best to distribute tasks to team members

Members are dedicated resources (as much as possible)

Optimally 6-10 people

Product Backlog

Key to success of Scrum

Master list of functional and non-functional items desired in the product (features, bugs, re-factoring)

Anyone can add to Product Backlog

Product Owner is the only person who prioritizes the Product Backlog

Includes relative estimate of size of features (design, code, test, automate, refactor, doc, fix bugs)
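As a rough sketch, a backlog item carries the attributes listed above: a type (feature, bug, refactoring), a priority set only by the Product Owner, and a relative size estimate. The field names and sample items below are hypothetical, not Scrumforce’s actual data model.

```python
from dataclasses import dataclass
from enum import Enum

class ItemType(Enum):
    FEATURE = "feature"
    BUG = "bug"
    REFACTORING = "refactoring"

@dataclass
class BacklogItem:
    title: str
    item_type: ItemType
    priority: int      # set only by the Product Owner; lower = higher priority
    story_points: int  # relative size estimate (design, code, test, doc, ...)

# Anyone can add items; only the Product Owner reorders them by priority.
backlog = [
    BacklogItem("Bulk lead import", ItemType.FEATURE, priority=2, story_points=8),
    BacklogItem("Fix report timeout", ItemType.BUG, priority=1, story_points=3),
    BacklogItem("Refactor sharing rules", ItemType.REFACTORING, priority=3, story_points=5),
]
backlog.sort(key=lambda item: item.priority)
print([item.title for item in backlog])
# → ['Fix report timeout', 'Bulk lead import', 'Refactor sharing rules']
```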

Product Backlog Sample

Release Planning

Communicate a common vision for the release

Initial Design

Align team on proposed functionality

Determine target functionality for the release

Release Planning

Groom User Stories small enough to be effective for sprint planning

Determine the relative size of the user stories in story points

Determine Release Functionality based on velocity

Identify Dependencies
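The velocity-based scoping step above amounts to simple arithmetic: walk the groomed stories in Product Owner priority order and stop adding once the team’s measured capacity is used up. Story names, point sizes, and the velocity figure below are invented for illustration.

```python
# Hypothetical groomed stories in priority order, with story-point sizes.
groomed_stories = [("Story A", 8), ("Story B", 5), ("Story C", 13),
                   ("Story D", 3), ("Story E", 20), ("Story F", 8)]
velocity_per_sprint = 15    # measured points/sprint (assumed)
sprints_in_release = 2
capacity = velocity_per_sprint * sprints_in_release  # 30 points

release, committed = [], 0
for name, points in groomed_stories:
    # Skip stories that no longer fit; a smaller lower-priority story
    # may still be pulled in to use the remaining capacity.
    if committed + points <= capacity:
        release.append(name)
        committed += points
print(release, committed)
# → ['Story A', 'Story B', 'Story C', 'Story D'] 29
```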

Sprint Backlog

Tasks necessary to complete user stories

Many-to-one relationship with user stories

Coding, testing, automation, specs, doc, design, etc.

Sprint Backlog

Team expands items on the Sprint Backlog into specific tasks, time estimates in hours, signs up for ownership

Critical that “The Team” selects items and size for Sprint Backlog

Managed through Scrumforce
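The many-to-one relationship between tasks and user stories, with per-task hour estimates and sign-ups, might look like the following sketch; the stories, tasks, hours, and owners are invented examples, not Scrumforce data.

```python
# Each user story expands into several tasks (many-to-one), each with an
# hour estimate and the team member who signed up for it.
sprint_backlog = {
    "Bulk lead import": [
        ("design import wizard", 6, "UE designer"),
        ("code CSV parser", 12, "developer"),
        ("write functional tests", 8, "QE engineer"),
        ("document import limits", 4, "tech writer"),
    ],
    "Fix report timeout": [
        ("profile slow query", 4, "developer"),
        ("add regression test", 3, "QE engineer"),
    ],
}
remaining_hours = sum(hours for tasks in sprint_backlog.values()
                      for _, hours, _ in tasks)
print(remaining_hours)  # → 37
```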

Sprint Planning

Determine the Sprint Goal

Determine work necessary to complete the goal (with time estimates)

Make commitments for the Sprint

Sprint Planning Meeting

Team “dog piles” on user stories

Team figures out how to deliver Sprint Goal even without a resource on the team who normally does a particular type of work

Product Owner may negotiate but Team always determines what they can complete during the sprint

Definition of “Done”

The standards by which we define “done” for sprint functionality are key to the success of iterative, incremental development. Functionality that meets these standards at the end of a sprint is considered potentially releasable and demoed at the Sprint Review.

User Stories: All defined Acceptance Criteria for a user story have been met.

Code: Code implementing the user story functionality is checked in and follows department standards. No open regressions (you break it, you own it), with automated tests written for all regressions. No open P1 & P2 bugs for the implemented functionality in the sprint.

Quality: Code coverage of 70%. Test plan, cases, and execution for sprint functionality; regression and cross-functional test cases related to sprint functionality need to be 100% executed, with all P1/P2 cases passing. All resolved bugs have been verified and closed for the sprint functionality.

Definition of “Done”

Performance/Scalability: Performance/scalability impact of sprint functionality understood and quantified, and sys testing scheduled, if required, with the sys test team.

User Experience: UE reviewed new features or significant changes in the UI, feedback incorporated, all resulting P1 and P2 UI bugs fixed. Usability testing completed, feedback incorporated into the backlog.

Localization: All UI components have labels ready for localization vendors.

Documentation: User doc describing all aspects of sprint functionality complete and checked in.
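The criteria above can be read as a boolean gate on each story: every check must hold before the story counts as done. The 70% coverage threshold comes from the deck; the field names and sample values below are assumptions for illustration.

```python
# Sketch of the "done" gate; criteria paraphrase the slides, field names
# are hypothetical.
def is_done(story):
    checks = [
        story["acceptance_criteria_met"],
        story["code_checked_in"],
        story["open_regressions"] == 0,
        story["open_p1_p2_bugs"] == 0,
        story["code_coverage"] >= 0.70,   # threshold from the deck
        story["p1_p2_test_cases_passing"],
        story["docs_checked_in"],
    ]
    return all(checks)

story = {"acceptance_criteria_met": True, "code_checked_in": True,
         "open_regressions": 0, "open_p1_p2_bugs": 1,
         "code_coverage": 0.82, "p1_p2_test_cases_passing": True,
         "docs_checked_in": True}
print(is_done(story))  # → False (one open P1/P2 bug blocks the demo)
```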

Definition of “Done”

Autobuild Page

Daily Standup Meeting: Pigs & Chickens

Two types of people attend Daily Standup: Pigs and Chickens

A chicken and a pig were walking down the street. The chicken said to the pig, “Let’s open a restaurant.” The pig said, “OK, what should we name it?” The chicken said, “How about ‘Bacon and Eggs’?” The pig said, “No way … I’d be committed but you would only be involved.”

Daily Standup Meeting

Re-connect, re-commit and share relevant information

Team members answer 3 questions (in 2 minutes):

– What did you do yesterday?

– What will you do today?

– Are there any obstacles in your way?

Daily Standup Meeting

15 minutes or less

All Pigs are required to attend Daily Standup

Pigs talk. Chickens listen.

Not a problem-solving meeting

Obstacles are removed ASAP by the ScrumMaster

Burndown Charts

[Burndown chart: 0.0–50.0 hours remaining plotted against Ideal Days; series: Actual, Baseline; annotations: production support, resolution of dev assumptions, added tasks, added March tasks.]
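A minimal sketch of how a burndown’s ideal line and behind/ahead status could be computed from daily remaining-hours totals; all numbers are illustrative, not taken from the chart.

```python
# Hours remaining at the end of each day vs. an ideal straight line
# from the sprint's starting total down to zero.
total_hours = 50.0
sprint_days = 10
actual_remaining = [50.0, 47.0, 45.0, 44.0, 38.0, 33.0,
                    30.0, 22.0, 15.0, 8.0, 0.0]  # days 0..10 (assumed data)

ideal = [total_hours * (1 - day / sprint_days) for day in range(sprint_days + 1)]
for day, (actual, target) in enumerate(zip(actual_remaining, ideal)):
    trend = "behind" if actual > target else "on/ahead"
    print(f"day {day:2d}: actual {actual:5.1f}  ideal {target:5.1f}  {trend}")
```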

Sprint Review

It’s all about feedback and visibility

All teams demo done functionality to all of Technology / Stakeholders

Takes place after the last day of the Sprint

Sprint Review

Only functionality that meets “Done” criteria is demoed

Team declares what they committed to doing in the Sprint and did not get done

Feedback from customers and stakeholders drives design changes for future sprints

Sprint Review: User Story Doneness Checklist

Done Criteria Handshake

[Checklist columns: POC, Setup Page, BT & Profile Perm]

Code checked in and follows department standards.

No open regressions. Automated tests written and reviewed for all regressions.

No open P1 & P2 bugs

Code Coverage of 70% (or as agreed with team)

100% of test cases logged in QA Tracker and executed in a QA environment, and all P1/P2 cases passing.

All resolved bugs verified and closed.

Performance/scalability impact ascertained and sys testing scheduled if required.

UE has reviewed any new features; P1 and P2 UI bugs fixed.

Usability testing scheduled when necessary, and feedback incorporated into backlog.

All UI labels ready for localization vendors.

User documentation complete and checked in.

Retrospective

Looks at “how” the product is built (process, tools, etc.)

Occurs after every Sprint

What went well?

What didn’t go well?

What will you do differently next time?