Performance Lab Services proposition


Transcript of Performance Lab Services proposition

Page 1: Performance Lab Services proposition

SERVICES

Page 2: Performance Lab Services proposition

2

CONTENTS

EXECUTIVE SUMMARY 3

BUSINESS PROFILE 4

COMPETITIVE ADVANTAGES 5

FUNCTIONAL TESTING 6

Our services 7

Case study. Retail Bank successfully implemented Core Banking System in 6 months 11

PERFORMANCE TESTING 13

Our services 14

Case study 1. Telecom operator supports more than 107 million mobile subscribers 18

Case study 2. The largest retail network fixed delay in EOD procedure 22

AUTOMATED TESTING 26

Project goals 27

Case study 1. Bank decreased acceptance testing time 20x 30

Case study 2. Software development company saved 50% of support service time with real-time monitoring 31

USABILITY TESTING 32

Our services 33

Case study. Social network increased site conversion 3x 39

QA CONSULTING 41

Our services 42

SECURITY TESTING 46

Our services 47

Page 3: Performance Lab Services proposition

3

Our offices: headquarters in Santa Clara, California, USA; off-shore testing centers in Russia (Moscow, Izhevsk, Tyumen).

We have been in the market since 2008. Our core activity is software testing and quality assurance. Prize winner of the Leadership Index 2013 employer rating. 350+ employees.

Annual turnover, $M: 2010: 1.5; 2011: 4.9; 2012: 8.6; 2013: 9.2; 2014: 10.5

Selected projects: manual and automated functional testing of mobile applications and services; functional and load testing and test automation of 10+ bank systems; load testing of EMC Documentum.

EXECUTIVE SUMMARY

Page 4: Performance Lab Services proposition

4

SOFTWARE TESTING AND QUALITY ASSURANCE

SECURITY TESTING

USABILITY TESTING

PERFORMANCE TESTING

▶ Mobile

▶ Web

▶ Desktop

▶ IPTV

▶ Web services and API

▶ Terminals and ATMs

FUNCTIONAL TESTING

AUTOMATED TESTING

QA CONSULTING

PARTNERS

BUSINESS PROFILE

Page 5: Performance Lab Services proposition

5

OUR RATES ARE:

▶ functional tester (manual):

$10+/hour

▶ performance engineer:

$40+/hour

▶ functional tester (automation):

$30+/hour

COMPETITIVE ADVANTAGES

DEEP EXPERTISE IN PERFORMANCE TESTING AND TEST AUTOMATION

WIDE USE OF VENDOR TOOLS (HP, IBM, ORACLE) AND FREE TOOLS (SELENIUM, JMETER, TESTLINK)

SELF-DEVELOPED TESTING TOOLS THAT REDUCE SERVICE COST

350+ FULL-TIME TESTING SPECIALISTS, 100+ PROJECTS A YEAR

FULL RANGE OF TESTING SERVICES

CORE ACTIVITY OF PERFORMANCE LAB IS SOFTWARE TESTING SERVICES

LOCAL TEAMS AND LOW-COST LOCATIONS AROUND THE WORLD

Page 6: Performance Lab Services proposition

6

FUNCTIONAL TESTING

Page 7: Performance Lab Services proposition

7

TYPE OF SERVICES

TOOLS

FUNCTIONAL TESTING: OUR SERVICES

FUNCTIONAL TESTING

REGRESSION TESTING

INTEGRATION TESTING

USER ACCEPTANCE TESTING

DOCUMENTATION TESTING

INSTALLATION TESTING

Page 8: Performance Lab Services proposition

8

FUNCTIONAL TESTING: PROJECT GOALS

REDUCE FINANCIAL AND REPUTATION RISKS RELATED TO DEFECTS

REDUCE EFFORTS SPENT ON TESTING

BY BUSINESS USERS

ENSURE COMPLIANCE OF SOURCE CODE WITH REQUIREMENTS

CUT DEVELOPMENT AND MAINTENANCE COSTS

AS WELL AS TIME TO MARKET

ENSURE THE OPERABILITY OF THE WHOLE SYSTEM

NOT INDIVIDUAL MODULES

PROJECT GOALS

Page 9: Performance Lab Services proposition

9

ANALYSIS PHASE

ANALYZE BUSINESS PROCESSES

DEVELOP TESTING METHODOLOGY

SET UP TEST MANAGEMENT TOOLS

DEVELOP TEST REQUIREMENTS

DEVELOP TEST CASES

EXECUTE FIRST TEST ITERATION

DOCUMENT AND FIX THE DEFECTS

EXECUTE SECOND TEST ITERATION

EXECUTION PHASE

PREPARE TEST REPORT

REPORT

FUNCTIONAL TESTING: PROJECT SCOPE

Page 10: Performance Lab Services proposition

10

FUNCTIONAL TESTING: PROJECT RESULT

Test report contents

▶ Number of errors that occur during installation and their severity

▶ Compliance of source code with requirements described in the specification

▶ Number of defects in new functionality and their severity

▶ Impact of the changes on system quality

▶ Assessment of the technical documentation's quality, relevance, completeness, and consistency

▶ Operability of business processes passing through several systems or modules

▶ Compliance of developed functionality with business requirements

INSTALLATION TESTING

DOCUMENTATION TESTING

UAT

INTEGRATION TESTING

FUNCTIONAL TESTING

REGRESSION TESTING

Page 11: Performance Lab Services proposition

11

FUNCTIONAL TESTING: CASE STUDY. RETAIL BANK SUCCESSFULLY IMPLEMENTED CORE BANKING SYSTEM IN 6 MONTHS

CUSTOMER PROFILE

Retail bank startup. Finance sector. Main line of business: consumer loans.

FUNCTIONAL AND REGRESSION TESTING: mitigate business development and financial risks related to defects

INSTALLATION TESTING: guarantee compliance of source code with the functional specification

INTEGRATION TESTING: ensure availability of all banking systems after integration

PROJECT GOALS

TEAM STRUCTURE

CUSTOMER

PROJECT MANAGER

BUSINESS

TEST LEAD TEST LEAD TEST LEAD

TEST ENGINEER

TEST ENGINEER

TEST ENGINEER

TEST ENGINEER

TEST ENGINEER

TEST ENGINEER

DEVELOPMENT ANALYSIS

Page 12: Performance Lab Services proposition

12

FUNCTIONAL TESTING: CASE STUDY. PROJECT RESULTS

Test report contents

[Chart] Defects by severity (Critical, High, Medium, Low). Functional testing defects: 1, 18, 23, 15. Regression testing defects: 326, 46, 27. Open defects statistics.

RETAIL BANK SUCCESSFULLY IMPLEMENTED CORE BANKING SYSTEM IN 6 MONTHS.

▶ 150 critical defects that could cause system failures and financial losses were found and fixed during the system implementation

▶ Implemented a Test Management System and test process regulations

▶ Developed ~950 test cases and 7 test plans

▶ Implemented an installation testing process for all system updates

[Chart] Open defects statistics, 04.05.2013–02.07.2013: defect count over time by severity (low, medium, high, critical).

Page 13: Performance Lab Services proposition

13

PERFORMANCE TESTING

Page 14: Performance Lab Services proposition

14

TYPE OF SERVICES

TOOLS

PERFORMANCE TESTING: OUR SERVICES

PERFORMANCE TESTING

LOAD TESTING

VOLUME TESTING

SYNTHETIC TESTING

RELIABILITY TESTING AND FAILOVER TESTING

Page 15: Performance Lab Services proposition

15

PERFORMANCE TESTING: PROJECT GOALS

REDUCE THE RISK OF SYSTEM FAILURE UNDER LOAD

LIST THE NECESSARY CHANGES IN SYSTEM

ARCHITECTURE OR INFRASTRUCTURE THAT CAN SOLVE

PERFORMANCE PROBLEMS

OPTIMIZE INFRASTRUCTURE COSTS

EXPLORE MAXIMUM PERFORMANCE OF SYSTEM VS

COMPANY BUSINESS FORECAST

PROJECT GOALS

Page 16: Performance Lab Services proposition

16

ANALYSIS PHASE

EXECUTION PHASE

REPORT

PERFORMANCE TESTING: PROJECT SCOPE

ANALYZE PRODUCTION ENVIRONMENT STATISTICS

ANALYZE TEST RESULTS

IDENTIFY BOTTLENECKS

DEVELOP RECOMMENDATIONS

PREPARE TEST REPORT

DEFINE BUSINESS PROCESSES AND TEST CASES

IDENTIFY PERFORMANCE ACCEPTANCE CRITERIA

DEVELOP LOAD SCRIPTS, DATA POOLS AND EMULATORS

DEPERSONALIZE DATABASE

SET UP MONITORING TOOLS

CONFIGURE TEST ENVIRONMENTS

EXECUTE TESTS
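The execution phase above boils down to driving the system with many concurrent virtual users while response times are recorded and summarized. A minimal Java sketch of that loop (the operation here is a stand-in stub; real projects would use tools such as HP LoadRunner or JMeter, as listed on the tools slide):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.*;

// Minimal sketch of a load-test driver: N virtual users each run the
// business operation repeatedly while latencies are collected, then a
// percentile is reported. The "operation" is a stub for illustration.
public class LoadSketch {
    static long timedCall(Runnable operation) {
        long start = System.nanoTime();
        operation.run();
        return (System.nanoTime() - start) / 1_000_000; // milliseconds
    }

    public static List<Long> run(int users, int iterations, Runnable operation)
            throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(users);
        List<Long> latencies = Collections.synchronizedList(new ArrayList<>());
        for (int u = 0; u < users; u++) {
            pool.submit(() -> {
                for (int i = 0; i < iterations; i++) {
                    latencies.add(timedCall(operation));
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
        return latencies;
    }

    // Nearest-rank percentile over the collected latencies.
    static long percentile(List<Long> xs, double p) {
        List<Long> sorted = new ArrayList<>(xs);
        Collections.sort(sorted);
        return sorted.get((int) Math.ceil(p * sorted.size()) - 1);
    }

    public static void main(String[] args) throws InterruptedException {
        // Stub operation; a real test would call the system under test here.
        Runnable op = () -> { try { Thread.sleep(5); } catch (InterruptedException ignored) {} };
        List<Long> latencies = run(10, 20, op);
        System.out.println("samples=" + latencies.size()
                + " p90=" + percentile(latencies, 0.90) + "ms");
    }
}
```

The same pattern scales out to load-generator agents; the percentile reported here is what the acceptance criteria defined in the analysis phase are checked against.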

Page 17: Performance Lab Services proposition

17

PERFORMANCE TESTING: PROJECT RESULTS

PERFORMANCE TESTING AND LOAD TESTING
▶ Maximum number of users who can work in the system simultaneously without crashes or performance degradation
▶ Performance of the new system version vs the previous one
▶ Performance characteristics of the IT system: run time of user operations, usage of server hardware resources (CPU, memory, I/O)

VOLUME TESTING
▶ How data growth impacts the system's performance
▶ System performance under continuous load with large amounts of data

SYNTHETIC TESTING
▶ Performance of different hardware configurations as well as their scaling features
▶ Compliance of actual hardware configuration performance with vendor commitments

RELIABILITY TESTING AND FAILOVER TESTING
▶ Number of business process failures after the failure of different system components
▶ System disaster recovery time and necessary conditions for recovery
▶ Changes in system performance after system recovery

Test report contents

Page 18: Performance Lab Services proposition

18

PERFORMANCE TESTING: CASE STUDY 1 TELECOM OPERATOR SUPPORTS MORE THAN 107 MILLION MOBILE SUBSCRIBERS

CUSTOMER PROFILE

EXCHANGE PROTOCOL DIAGRAM

SYSTEM TECHNICAL PROFILE

High-load, complex system:
▶ 100,000 transactions per second
▶ 7 main systems were load tested
▶ 40+ servers in the production environment
▶ Different load tools (HP LoadRunner, JMeter, Oracle) were used with a wide range of protocols (HTTP, SMPP, SOAP, IVR, MSMQ, Oracle 2-tier, Citrix)
▶ Distributed transaction model

Performance testing ▶ Reduce financial and reputation risks related to system performance degradation after new system release or update

PROJECT GOALS

LOADRUNNER C/JAVA

JMETER JAVA

ORACLE PL/SQL

▶ Large telecom operator in 6 countries

▶ One of the most highly loaded billing systems in the world

▶ More than 107 million mobile subscribers

M.A.R.T.I.

ESPP LISTENER

SUPS

MG

UPRSG

WEB

SOAP

FILES EXCHANGE

FILES EXCHANGE, SOAP, SMPP

MSMQ

SMPP

LOCKING MODULE

Our customer is a telecommunications operator offering mobile and fixed voice, broadband, pay TV, and content and entertainment services in one of the world's fastest growing regions.

Page 19: Performance Lab Services proposition

19

PERFORMANCE TESTING: CASE STUDY 1. SIMPLIFIED BILLING SYSTEM DIAGRAM OF TELECOM OPERATOR

[Diagram] Simplified billing system components: FORIS OSS, SMSC, IVR, Mail server, FAX server, xUSSD, HP IUM, SUPS, ESPP, MMSC, SPA, VMAIL, StateIN, Cash Register, MG, HLR, UMRS, MSCP, Doc, MARTI.Dealer, R&D, Stock, Credit Card Gateway, TelBill, Security, RI, Catalogues, SAS, Internet, Report, mass printing devices, cash registers, SIM-card files, MARTI.Selfcare, users, UCS, NCC.Export, I-MODE, RBT, IN Platform, MARTI, UPRSG, Bank, DMS, DSTKP, DPC, FORIS.Roaming, AM, NKK DB. Load distribution of tested systems: MARTI 50%, UPRSG 50%, MG 120%, SUPS 150%, TelCRM.LockUnlock (Blocking) 8%, ESPP Listener 11%.

Page 20: Performance Lab Services proposition

20

PERFORMANCE TESTING: CASE STUDY 1. PROJECT RESULT. TELECOM OPERATOR SUPPORTS MORE THAN 107 MILLION MOBILE SUBSCRIBERS

▶ 20 critical defects that could cause system failures under load were found and fixed during regular performance testing

▶ Implemented a regular performance testing process

▶ Created control points with performance characteristics of the IT system: run time of user operations, server hardware resources (CPU, memory, I/O)

Performance Lab

test methodology

READY TO INSTALL

CRITICAL ERRORS

Test Report.Release.4.6.2.1

Test Report.Release.4.6.2.2

Test Report.Release.4.6.2.3

Test Report.Release.N

RELEASE 4.6.1

RELEASE 4.6.2

RELEASE 4.6.3

RELEASE N

Page 21: Performance Lab Services proposition

21

PERFORMANCE TESTING: CASE STUDY 2 THE LARGEST RETAIL NETWORK FIXED DELAY IN EOD PROCEDURE

Compare performance of different hardware configurations and select optimal system‑specific configuration.

BWP ‑ DS8300

BW HANA

BW EXADATA

BWP IBM FLASHSTORAGE

These problems were business-critical: ▶ Insufficient system performance led to delays in calculations and reporting. ▶ The End of Day procedure took more than 8 hours, causing inventory imbalance.

CUSTOMER PROFILE

CUSTOMER PROBLEMS

PROJECT GOALS

Large retail network. The network runs 106 hypermarkets and 23 supermarkets in 60 cities and has 6 million active clients and 27,800 employees.

Page 22: Performance Lab Services proposition

22

WEB SERVICE

BEX

OLAP PROCESSOR

JMETER

SAP BW

ORACLE SAP HANA

EXADATA, ORACLE + FLASH STORAGE

▶ SAP BW is used for financial accounting, analytics and logistics.

▶ 490 active users, 1,000 logged-in users.

▶ The average number of reports generated weekly is 48,904

SYSTEM TECHNICAL PROFILE

PERFORMANCE TESTING: CASE STUDY 2 THE LARGEST RETAIL NETWORK FIXED DELAY IN EOD PROCEDURE

Page 23: Performance Lab Services proposition

23

PERFORMANCE TESTING: CASE STUDY 2. PROJECT RESULT, PART 1. THE LARGEST RETAIL NETWORK FIXED DELAY IN EOD PROCEDURE

▶ As a result of the tests, BW HANA and BW EXADATA were chosen as the optimal configurations, with the best scalability and performance headroom

▶These configurations showed the best performance results, when various

reports and EOD procedure were run

Average reporting time, %

EOD execution time, %

OOS and ABC rating calculation

Reserve availability calculation

End of Month StProv procedure calculation

End of Month Bonuses Allocation calculation

Average client processing time

Average transaction costs time

Average application server time

Average database server time

Page 24: Performance Lab Services proposition

24

▶The main performance constraint of BWP and BWP

FS configurations is database server performance

▶ CPU performance is the bottleneck

of database servers

▶ Application server performance, as well as hardware resources

utilization, does not depend on the database configuration

AVERAGE APPLICATION SERVERS CPU LOAD

AVERAGE DATABASE SERVERS CPU LOAD

Test duration, min Test duration, min

PERFORMANCE TESTING: CASE STUDY 2. PROJECT RESULT, PART 2. THE LARGEST RETAIL NETWORK FIXED DELAY IN EOD PROCEDURE

Page 25: Performance Lab Services proposition

25

AUTOMATED TESTING

Page 26: Performance Lab Services proposition

26

CUT TESTING ITERATION COST

REDUCE TESTING TIME

IMPROVE BASE SYSTEM QUALITY AND MINIMIZE

HUMAN FACTORS IMPACT

GET TEST REPORT QUICKLY AND AUTOMATICALLY

ABILITY TO TEST DURING

OFF‑HOURS

INCREASE THE TRANSPARENCY

AND ACCURACY OF SCHEDULING

PROJECT GOALS

AUTOMATED TESTING: PROJECT GOALS

Page 27: Performance Lab Services proposition

27

COMMERCIAL

AUTOMATED TESTING: TOOLS

DATABASE, WEB PORTAL, JENKINS

JAVA LIBRARIES

TEST SCENARIO

(JAVA, SELENIUM)

OUR FREE AUTOMATION TOOLS BASED ON:
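In a stack like the one above, a test scenario reduces to named steps executed against the system, with pass/fail results collected for the web portal and Jenkins. A framework-agnostic Java sketch of that reporting core (class and method names are illustrative, not Performance Lab's actual framework):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch of the reporting core of an automation framework: scenarios are
// registered by name, executed in order, and the results are collected into
// a report that a CI server (e.g. Jenkins) or web portal could publish.
public class AutoRunner {
    private final Map<String, Runnable> scenarios = new LinkedHashMap<>();

    public void register(String name, Runnable scenario) {
        scenarios.put(name, scenario);
    }

    /** Runs every scenario; a thrown exception marks the scenario failed. */
    public List<String> runAll() {
        List<String> report = new ArrayList<>();
        for (Map.Entry<String, Runnable> e : scenarios.entrySet()) {
            try {
                e.getValue().run();
                report.add(e.getKey() + ": PASS");
            } catch (RuntimeException ex) {
                report.add(e.getKey() + ": FAIL (" + ex.getMessage() + ")");
            }
        }
        return report;
    }

    public static void main(String[] args) {
        AutoRunner runner = new AutoRunner();
        runner.register("login form opens", () -> { /* Selenium steps would go here */ });
        runner.register("payment rejected without card", () -> {
            throw new RuntimeException("expected validation error missing");
        });
        runner.runAll().forEach(System.out::println);
    }
}
```

In practice the `Runnable` bodies would drive Selenium WebDriver steps, and the report lines would feed the database behind the web portal.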

Page 28: Performance Lab Services proposition

28

AUTOMATED TESTING: PROJECT SCOPE

ANALYZE BUSINESS PROCESSES

SELECT THE TOOL FOR AUTOMATED TESTING

SUPPORT AND IMPROVE AUTOMATED TESTING SYSTEM

INTEGRATE THE SOLUTION INTO THE DEVELOPMENT ENVIRONMENT

UPDATE AUTOMATED TESTS AND FRAMEWORK AT THE CUSTOMER'S REQUEST

PLAN EDUCATION OF CUSTOMER SPECIALISTS

DEVELOP THE INSTRUCTIONS AND SUPPORT DOCUMENTATION

DEMONSTRATE AND TRANSFER THE SOLUTION TO THE CUSTOMER

ANALYSIS PHASE

EXECUTION PHASE

SUPPORT *

TRANSFER

* Autotest support is an additional option

SETUP WORKPLACES AND INFRASTRUCTURE

DEVELOP AUTOMATED TESTING METHODOLOGY

DEVELOP DESIGN ARCHITECTURE SCHEME

DEVELOP FRAMEWORK

DEVELOP AUTOMATED TESTS

EXECUTE AUTOMATED TESTS AND SEND REPORT

Page 29: Performance Lab Services proposition

29

PROJECT RESULTS

AUTOMATED TESTING: CASE STUDY 1. BANK DECREASED ACCEPTANCE TESTING TIME 20X

▶ Bank with offices in 2000 cities

▶ 30.7 million clients

▶~ 9500 offices, 1500 ATMs

▶~96 000 sales points in retail partners.

CUSTOMER PROFILE

SYSTEM TECHNICAL PROFILE

PROJECT GOALS

Credit system has four interfaces: ▶Web

▶ .Net

▶ SharePoint

▶VB6

▶ Reduce the acceptance testing time of new system releases and updates by implementing automated testing for critical business processes.

ACCEPTANCE TESTING TIME REDUCED TO 2 HOURS

TEST DEVELOPMENT TIME REDUCED AS A FRAMEWORK WAS DEVELOPED

CREATED 341 AUTOTESTS

CONFIGURATION OF AUTOMATED TEST SYSTEM TAKES LESS THAN 1 MINUTE

TEST COMPLETE WAS INTEGRATED WITH SOAP UI FOR WORKING WITH OSB

+

REDUCED 20X

CRITICAL BUSINESS PROCESSES TEST COVERAGE 77%

EFFORTS REDUCED 3X

60 SEC

Page 30: Performance Lab Services proposition

30

Large software development company

Develop a solution that provides real‑time information about the system functioning and its services health

CUSTOMER PROFILE PROJECT RESULTS

PROJECT GOALS

DELIVERED RESULTS

MONITORING SYSTEM

SOLUTION FUNCTIONALITY

▶ Monitors services availability and operability

▶ Checks information validity

▶ Provides status report on a schedule or on demand.

▶ Shows latest status of online services and statistics

QUICK RESULT

STATISTICS

3360 AUTOMATED TESTS FOR 40 SERVICES IN 6 MONTHS

AVAILABLE 24/7 VIA INTERNET

MONITORING SYSTEM SENDS EMAIL WITH TEST RESULTS

UPLOADS TEST RESULT STATISTICS INTO EXCEL AT THE USER'S REQUEST

AUTOMATED TESTING: CASE STUDY 2. SOFTWARE DEVELOPMENT COMPANY SAVED 50% OF SUPPORT SERVICE TIME WITH REAL-TIME MONITORING
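A monitoring system like the one in this case study repeatedly probes each service's health and folds the results into availability statistics for the status page and e-mail reports. A minimal Java sketch of that aggregation step (the service names and probe interface are hypothetical, not the delivered solution):

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.BooleanSupplier;

// Sketch of the aggregation step of a service monitor: each service exposes
// a health probe; repeated probe results are folded into an availability
// percentage that a status page or scheduled e-mail report would show.
public class ServiceMonitor {
    // Per-service counters: index 0 = successful probes, index 1 = total probes.
    private final Map<String, int[]> stats = new LinkedHashMap<>();

    public void probe(String service, BooleanSupplier healthCheck) {
        int[] s = stats.computeIfAbsent(service, k -> new int[2]);
        boolean ok;
        try { ok = healthCheck.getAsBoolean(); } catch (RuntimeException e) { ok = false; }
        if (ok) s[0]++;
        s[1]++;
    }

    /** Availability as a percentage of successful probes; 0 if never probed. */
    public double availability(String service) {
        int[] s = stats.get(service);
        return (s == null || s[1] == 0) ? 0.0 : 100.0 * s[0] / s[1];
    }

    public static void main(String[] args) {
        ServiceMonitor m = new ServiceMonitor();
        for (int i = 0; i < 4; i++) m.probe("billing", () -> true);
        m.probe("billing", () -> false); // one failed check
        System.out.printf("billing availability: %.0f%%%n", m.availability("billing"));
    }
}
```

A real probe would issue an HTTP request or run one of the 3,360 automated tests; a scheduler would call `probe` around the clock to give the 24/7 coverage the slide describes.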

Page 31: Performance Lab Services proposition

31

USABILITY TESTING

Page 32: Performance Lab Services proposition

32

TYPE OF SERVICES

TOOLS

USABILITY TESTING: OUR SERVICES

USABILITY TESTING

USABILITY AUDIT

Page 33: Performance Lab Services proposition

33

MAKE RECOMMENDATIONS ON HOW TO IMPROVE SYSTEM INTERFACE USABILITY

INCREASE REVENUES BY A HIGHER WEBSITE CONVERSION RATE

INCREASE REVENUES BY INCREASING CUSTOMER/GUEST CONVERSION RATE

REDUCE OPERATIONAL COSTS FOR CLIENT CALL‑CENTERS

AND OFFLINE OFFICES

PROJECT GOALS

USABILITY TESTING: PROJECT GOALS

Page 34: Performance Lab Services proposition

34

USABILITY TESTING: PROJECT SCOPE

ANALYZE BUSINESS PROCESSES

DEVELOP RECOMMENDATION TO MEET USABILITY REQUIREMENTS

SELECT TESTING METHOD

REDESIGN SCREENS TO FIX USABILITY PROBLEMS

IDENTIFY TARGET AUDIENCE IN ACCORDANCE WITH THE SPECIFIC CLIENTS’ CRITERIA

PREPARE FINAL REPORT

RECRUIT TARGET AUDIENCE

DEVELOP TEST SCENARIO

PERFORM USER TESTS

DEFINE PASS CRITERIA FOR THE TASKS

ANALYZE EACH USABILITY ISSUE AND ITS SEVERITY

ANALYSIS PHASE

EXECUTION PHASE

REPORT

Page 35: Performance Lab Services proposition

35


USABILITY TESTING: PROJECT RESULT

Test report contents

EFFECTIVENESS EFFICIENCY SATISFACTION

Users’ EFFECTIVENESS in performing tasks

User ERRORS with their SEVERITIES and REASONS

Users’ EFFICIENCY (speed) in performing tasks

USABILITY RECOMMENDATIONS

Level of users’ SATISFACTION

Selected SCREENS PROTOTYPES

Page 36: Performance Lab Services proposition

36

USABILITY AUDIT: PROJECT SCOPE

ANALYZE BUSINESS PROCESSES

DEVELOP RECOMMENDATIONS TO MEET USABILITY REQUIREMENTS

REDESIGN SCREENS TO FIX USABILITY PROBLEMS

CREATE FINAL REPORT

ANALYZE SYSTEM INTERFACE CONFORMANCE TO INTERNATIONAL USABILITY STANDARDS (ISO, GOOGLE GUIDELINES, ETC.)

ANALYZE FOUND USABILITY ISSUES

ANALYSIS PHASE

EXECUTION PHASE

REPORT

Page 37: Performance Lab Services proposition

37

USABILITY AUDIT: PROJECT RESULT

BEFORE USABILITY EXPERTISE AFTER USABILITY EXPERTISE

Test report contents

INTERFACE PROBLEMS with their SEVERITIES

USABILITY RECOMMENDATIONS

Interface patterns and look don't match the Google mobile apps guidelines:

Photo: too small; users can't form a true notion of the place from this photo.

Icons: some can be clicked, some cannot.

Text: contains grammar errors.

Selected SCREENS PROTOTYPES

Page 38: Performance Lab Services proposition

38

USABILITY TESTING: CASE STUDY 1. SOCIAL NETWORK INCREASED SITE CONVERSION 3X

CUSTOMER PROFILE

Social network. Media sector. The system allows users to create goals and track achievements.

CUSTOMER PROBLEM

Visitors leave the site main page without registration.

PROJECT GOALS

Encourage new visitors to join the network by redesigning the main page

Develop recommendations to improve main page usability

Page 39: Performance Lab Services proposition

39

BEFORE USABILITY EXPERTISE AFTER USABILITY EXPERTISE

USABILITY TESTING: CASE STUDY 1. PROJECT RESULT. SMARTPROGRESS INCREASED SITE CONVERSION 3X

How we did it:

THE USER TASK WAS: "Open the main page and understand the service idea."

Usability testing revealed several problems on the main page:
1. Users liked the background image, but thought the site was scientific;
2. Users tried to register but did not succeed:
▶ registration was on the second page, but users didn't know about it
▶ the bright background drew users away from the join button

Usability recommendations were implemented and the main site page was redesigned

Site conversion increased 3x; visit depth increased by 30–40%

Page 40: Performance Lab Services proposition

40

QA CONSULTING

Page 41: Performance Lab Services proposition

41

QA CONSULTING: OUR SERVICES

PROCESS IMPROVEMENT

TECHNOLOGY IMPROVEMENT

TEST PROCESS IMPROVEMENT

TEST CENTER OF EXCELLENCE ORGANIZATION

RELEASE MANAGEMENT

PROCESS OPTIMIZATION

REQUIREMENTS MANAGEMENT

PROCESS ORGANIZATION

CONTINUOUS INTEGRATION SYSTEM IMPLEMENTATION

STATIC CODE ANALYSIS SYSTEM IMPLEMENTATION

TEST MANAGEMENT SYSTEM IMPLEMENTATION

TOOLS

Page 42: Performance Lab Services proposition

42

QA CONSULTING: PROJECT GOALS

MINIMIZE PRODUCTION AND SUPPORT COSTS

IMPROVE SOFTWARE QUALITY

MINIMIZE TIME TO MARKET FOR NEW FUNCTIONALITY

STANDARDIZE SOFTWARE TESTING PROCESS

PROJECT GOALS

Page 43: Performance Lab Services proposition

43

PROCESS IMPROVEMENT

Test Process Improvement

Requirements management

process

TCoE organization

Release management process optimization

▶ Maximize ROI from software testing through consolidation and standardization

▶ Ensure compliance of the designed system with customer expectations ▶ Find and fix potential issues at the design and requirements analysis stage ▶ Assess resources, schedule, and cost at earlier stages of the project

▶ Increase interaction efficiency of different customer units in the software release process

▶ Transparent planning of the testing budget ▶ Best QA practices ▶ Minimal participation of business units in non-core processes ▶ Accurate forecasting of dates when new products come to market

QA CONSULTING: PROJECT RESULT

Page 44: Performance Lab Services proposition

44

TECHNOLOGY IMPROVEMENT

QA CONSULTING: PROJECT RESULT

Continuous Integration

system implementation

Static code analysis system implementation

Test Management system

implementation

▶ Established process of continuous system integration ▶ Finalized and documented system delivered to the customer ▶ Created quality gates for each system release

▶ Reduced defect detection time ▶ Reduced number of defects during integration

▶ Improved transparency and manageability of the testing process ▶ Cut testing cost and time through reuse of testing artifacts and reduction of duplicate or unnecessary work

Team Version control

Continuous Integration

Quality gates Approval Production

Page 45: Performance Lab Services proposition

45

SECURITY TESTING

Page 46: Performance Lab Services proposition

46

TYPE OF SERVICES

PENETRATION TESTING SECURITY AUDIT

TOOLS

SECURITY TESTING: OUR SERVICES

INTERNAL PENETRATION

TEST

EXTERNAL PENETRATION

TEST

NETWORK SECURITY ANALYSIS

INFORMATION SECURITY

RISK ANALYSIS

Page 47: Performance Lab Services proposition

47

EXTERNAL PENETRATION TEST

PROJECT GOALS

Get unauthorized access to the IT system using technical vulnerabilities and social engineering techniques

PROJECT RESULTS

▶ Report about information systems vulnerabilities

▶ Recommendations to improve security level of information systems

▶ Compliance with the current security policy

1. ANALYSIS PHASE ▶ Collect information on the customer using search engines, registration services (DNS, Whois, etc.), and other public information sources

▶ Collect information on publicly available network resources (network services, operating systems, and applications)

▶ Identify critical data storage or processing areas, that are accessible externally

▶ Collect information on the customer’s employees

3. REPORT ▶ Create report and develop recommendations

2. EXECUTION PHASE ▶ Search for common and specific exploits in web applications (OWASP Top 10, etc.)

▶ Determine the external vulnerability of the network perimeter

▶ Develop penetration attack vectors and methods.

▶ Attempt penetration using the collected information

PROJECT SCOPE

Page 48: Performance Lab Services proposition

48

INTERNAL PENETRATION TEST

PROJECT GOALS

Get unauthorized access to the IT system using the technical vulnerabilities and social engineering techniques.

PROJECT RESULTS

▶ Report about information systems vulnerabilities

▶ Recommendations to improve security levels of information systems

1. ANALYSIS PHASE ▶ Collect information on available resources from the user segment of the local network (network services, operating systems, and applications)

▶ Identify critical data storage or processing areas ▶ Identify vulnerable resources that could allow unauthorized actions on them

▶ Develop penetration vectors and methods that can obtain unauthorized access to critical data

3. REPORT ▶ Create report and develop recommendations

2. EXECUTION PHASE ▶ Attack using collected information

▶ Try to obtain accounts and passwords by intercepting network traffic

▶ Try to gain unauthorized access to servers, databases, and users' computers using incorrect settings or vulnerabilities

PROJECT SCOPE

Page 49: Performance Lab Services proposition

49

INFORMATION SECURITY RISK ANALYSIS

PROJECT GOALS

Create a long-term security policy for the organization based on current information security threats and risks and on company assets ranked by importance.

PROJECT RESULTS

Risk mitigation plan that makes it possible to manage all potential security risks

1. ANALYSIS PHASE ▶ Inventory information assets and estimate their cost

▶ Choose risk assessment methodology for particular organization

3. REPORT ▶ Develop risk assessment report

▶ Develop risk mitigation plan

2. EXECUTION PHASE ▶ Identify vulnerabilities and potential threats

▶ Develop a risk registry

▶ Assess information security risk (qualitative and quantitative estimation)

▶ Analyze information security measures

PROJECT SCOPE
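The qualitative and quantitative estimation step above is commonly modeled as score = likelihood × impact over the risk registry. A Java sketch under that assumption (the 1–5 scales and the multiplicative model are illustrative, not the methodology chosen in a given engagement):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Sketch of the quantitative estimation step of a risk analysis: each entry
// in the risk registry gets score = likelihood * impact on a 1-5 scale, and
// the registry is sorted so the mitigation plan tackles the biggest risks
// first. Scales and the multiplicative model are illustrative assumptions.
public class RiskRegistry {
    static class Risk {
        final String name;
        final int likelihood; // 1 (rare) .. 5 (almost certain)
        final int impact;     // 1 (negligible) .. 5 (critical)
        Risk(String name, int likelihood, int impact) {
            this.name = name;
            this.likelihood = likelihood;
            this.impact = impact;
        }
        int score() { return likelihood * impact; }
    }

    private final List<Risk> risks = new ArrayList<>();

    public void add(String name, int likelihood, int impact) {
        if (likelihood < 1 || likelihood > 5 || impact < 1 || impact > 5)
            throw new IllegalArgumentException("scales are 1..5");
        risks.add(new Risk(name, likelihood, impact));
    }

    /** Risks ordered highest score first, for the mitigation plan. */
    public List<Risk> prioritized() {
        List<Risk> sorted = new ArrayList<>(risks);
        sorted.sort(Comparator.comparingInt(Risk::score).reversed());
        return sorted;
    }

    public static void main(String[] args) {
        RiskRegistry reg = new RiskRegistry();
        reg.add("unpatched web server", 4, 5);
        reg.add("weak password policy", 5, 3);
        reg.add("missing backups", 2, 5);
        for (Risk r : reg.prioritized())
            System.out.println(r.name + " -> " + r.score());
    }
}
```

The sorted registry is exactly the skeleton of the risk mitigation plan named in the project results: highest-score risks get treated first.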

Page 50: Performance Lab Services proposition

50

NETWORK SECURITY ANALYSIS

PROJECT GOALS

Reduce financial and reputation risks related to a low level of network security ▶ Assess the information security level of the network infrastructure ▶ Develop recommendations to improve network infrastructure security using best practices

PROJECT RESULTS

▶ Detailed report based on the network security analysis ▶ Recommendations for network infrastructure optimization in terms of information security ▶ Options for technical solutions for network upgrades ▶ Technical specification for network modernization

1. ANALYSIS PHASE ▶ Inventory the IT infrastructure

▶ Create network diagram (physical and logical)

▶ Define the relationship between logical and physical network levels

3. REPORT ▶ Create security analysis report and develop recommendations.

▶ Create technical solutions and specifications

2. EXECUTION PHASE ▶ Analyze network security

▶ Analyze wireless Infrastructure security

PROJECT SCOPE

Page 51: Performance Lab Services proposition

51

SECURITY TESTING: EXTERNAL PENETRATION TEST. CASE STUDY 1. MAJOR INSURANCE COMPANY PREVENTED A POSSIBLE LEAK OF CUSTOMERS' PERSONAL DATA

CUSTOMER PROFILE RESULTS

SYSTEM TECHNICAL PROFILE

The client's personal account is hosted on the company's web portal and is intended for: ▶ Managing subscribed services ▶ Paying for and activating new services and extending existing ones.

▶ Identified 120+ potential vulnerabilities, including 9 highly critical ones ▶ A phishing attack was successfully carried out, resulting in obtaining a login and password ▶ Privileged access was gained to the database with personal data of users' personal accounts, including name, address, and mobile phone number

PROJECT GOALS

Large insurance company ranked in the TOP 20 in a country with a population of 150+ million

Evaluate the possibility of unauthorized access to data through a personal account on the web portal

Page 52: Performance Lab Services proposition

52

SECURITY TESTING: EXTERNAL PENETRATION TEST. CASE STUDY 2. AN INDEPENDENT EVALUATION OF THE SYSTEM'S SECURITY LEVEL BEFORE RUNNING IN PRODUCTION

CUSTOMER PROFILE RESULTS

SYSTEM TECHNICAL PROFILE

▶ The web portal is intended for citizens' appeals to the city authorities.

▶ Identified a vulnerability that allows arbitrary replacement of page content. ▶ Identified the possibility of unimpeded password guessing at login. ▶ A number of XSS vulnerabilities allowed transmitting user-entered data to a third-party server. ▶ A CSRF vulnerability allowed performing actions on behalf of the attacked user.

PROJECT GOALS

The IT department of a governmental agency with an overall 2014 software support budget of $1 billion is about to launch a new system for interaction between citizens and authorities.

▶ Identify the vulnerabilities of the information system before it runs in production ▶ Evaluate the possibility of data substitution by internal users


Page 53: Performance Lab Services proposition

Headquarters: 4633 Old Ironsides Drive, Santa Clara, California, 95054, USA

Phone: +1 855 980 7587

www.performance-lab.com

[email protected]

Off-shore testing center: 511, 6-5 Barclaya str., Moscow, Russia, 121087

Phone: +7 495 989 6165

www.performance-lab.ru

[email protected]

Max Kutuzov, Managing partner

Cell: +7 909 904 1111

[email protected]

THANK YOU !

Performance Lab US