Objects vs. Images: Choosing the Right GUI Test Tool Architecture
Transcript of Objects vs. Images: Choosing the Right GUI Test Tool Architecture
W15 Session 10/26/2016 15:00
Presented by:
Chip Groder
Intervise Consultants Inc.
Brought to you by:
350 Corporate Way, Suite 400, Orange Park, FL 32073 · 888-268-8770 · 904-278-0524 · [email protected] · http://www.starcanada.techwell.com/
Chip Groder, Intervise Consultants Inc.
Senior test architect Chip Groder has more than thirty years of experience in software development, software testing, and software quality. Chip has helped multiple Fortune 1000 organizations achieve success with test automation by defining test automation architectures; assisting with the creation, selection, and implementation of test automation frameworks; and training and mentoring individuals and teams in the art and science of automating tests. Chip is an expert in keyword-based test design and has experience with dozens of test tools. In his leisure time he is usually in his shop restoring an old John Deere tractor or out riding his horse.
9/26/2016
1
Objects vs. Images: Choosing the Right GUI Test Tool Architecture
Chip Groder, Intervise Consultants Inc.
[email protected]
Context: Test Automation ‘Hierarchy of Needs’
Scale
Test Maintenance
Test Creation
Tools
The selection of the 'right' automation tool is only a small part of test automation cost, and of the resulting ROI.
Selected GUI Test Tool History
Tools:
• 1988 XRunner, Mercury Interactive
• c. 1992 QA Partner, Segue Software
• c. 1993 MS Test, Microsoft
• c. 1994 WinRunner, Mercury Interactive
• 1995 SQA Robot, SQA Inc.
• 2002 Eggplant, Redstone Software
• 2004 Selenium, ThoughtWorks → Open Source
• c. 2009 Sikuli, MIT → Open Source

Platforms:
• 1984 Apple Macintosh
• Late 80s Unix & X Windows
• 1991 World Wide Web
• 1991 Linux
• 1992 MS Windows 3.1
• 1993 Mosaic Browser
• 1995 Java AWT
• 1997 Java Swing
• 2000 Qt Project
… and so on
3
Object Recognition Test Tools
Object Recognition Based Tools
Tool Name Vendor Web Site
Rational Functional Tester IBM/Rational http://ibm.com
SilkTest * Micro Focus/Borland http://microfocus.com
Unified Functional Testing * HP/Mercury http://hp.com
Selenium Open Source http://selenium.org
Ranorex Studio Ranorex http://ranorex.com
TestComplete SmartBear http://smartbear.com
TestArchitect * LogiGear http://testarchitect.logigear.com
Jubula Bredex/Open Source http://eclipse.org/jubula
* Product also supports image recognition
Object Recognition Test Tools (1)
• Interact with the GUI subsystem to detect and manipulate GUI objects
• Web test tools (e.g. Selenium) access the browser's DOM data
• Typically non-intrusive to the AUT, but not the SUT
• Typically run on the same system as the AUT

[Diagram: the test tool and the Application Under Test both run on the System Under Test, on top of the GUI subsystem and the operating system.]
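As a sketch of the DOM access an object-based web tool relies on, the snippet below parses a small HTML page with Python's standard library and reads every option of a list, visible or not. The page content and class name are illustrative assumptions, not any tool's actual API.

```python
from html.parser import HTMLParser

# A tiny DOM walk in the spirit of an object-recognition web tool:
# instead of looking at pixels, it reads the markup the browser holds.
PAGE = """
<select id="font">
  <option>Arial</option>
  <option>Courier New</option>
  <option>Times New Roman</option>
</select>
"""

class OptionCollector(HTMLParser):
    """Collects the text of every <option> element, scrolled into view or not."""
    def __init__(self):
        super().__init__()
        self.in_option = False
        self.options = []

    def handle_starttag(self, tag, attrs):
        if tag == "option":
            self.in_option = True

    def handle_endtag(self, tag):
        if tag == "option":
            self.in_option = False

    def handle_data(self, data):
        if self.in_option and data.strip():
            self.options.append(data.strip())

parser = OptionCollector()
parser.feed(PAGE)
print(parser.options)  # every list value, regardless of visibility
```

Because the tool reads the object model rather than the screen, visibility simply does not enter into it.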
Object Recognition Test Tools (2)
• Many have the capability to do 'remote testing'
• Script on one host machine drives one or more remote machines
• Require a (licensed) agent to be installed on each SUT
[Diagram: the test tool engine and script run on one host; over the network the engine drives a test tool agent installed on each System Under Test, where the agent manipulates the Application Under Test through that system's GUI subsystem and operating system.]
Object Recognition Strengths & Weaknesses
• Strengths
  – They can quickly and easily manipulate standard GUI objects
  – Your script has access to most or all of the object's underlying properties
    • e.g. a button may be enabled, or disabled (grayed out)
  – Text values can be accessed directly
    • e.g. a button caption, window title, or the contents of an edit field
  – Execution is often faster than a human could type
• Weaknesses
  – They are intrusive to the SUT, requiring installation of the tool or an agent
  – Limited platform support (Windows & maybe Linux)
  – They don't understand custom objects
    • e.g. the map display in Google Maps
    • You must extend the functionality of the tool to handle custom objects
  – They are typically very weak in image recognition
    • Pixel-for-pixel comparison for regions of the screen
    • Most tools provide OCR text recognition as a backup
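The pixel-for-pixel region comparison mentioned above can be sketched in a few lines. Screens are modeled as 2D lists of integers standing in for pixel values; the function names are illustrative, not any tool's API.

```python
# Pixel-for-pixel comparison of a screen region: the (weak) kind of image
# check many object-recognition tools offer as their only image support.

def region(screen, top, left, height, width):
    """Extract a rectangular region from a 2D pixel grid."""
    return [row[left:left + width] for row in screen[top:top + height]]

def region_matches(screen, baseline, top, left):
    """True only if every pixel in the region equals the recorded baseline."""
    h, w = len(baseline), len(baseline[0])
    return region(screen, top, left, h, w) == baseline

screen = [
    [0, 0, 0, 0],
    [0, 7, 7, 0],
    [0, 7, 7, 0],
    [0, 0, 0, 0],
]
baseline = [[7, 7], [7, 7]]

print(region_matches(screen, baseline, 1, 1))  # True: exact match
print(region_matches(screen, baseline, 0, 0))  # False: any differing pixel fails the check
```

The brittleness is visible in the second call: a single pixel of difference (anti-aliasing, a theme change) fails the whole comparison.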
Example: Get the Contents of a List
QTP code:
mylist = Application("Notepad").Dialog("Font").WinComboBox("Font:").GetContent
• Returns an array containing the list values
• It doesn’t matter whether the list items are visible
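The chained lookup style of the QTP line above can be mimicked with a toy object tree. Everything here is an illustrative stand-in; a real tool resolves objects through the GUI subsystem, not a hand-built dictionary.

```python
# Toy model of the object hierarchy a tool like QTP navigates:
# Application -> Dialog -> ComboBox, addressed by recorded name, with
# direct access to properties the GUI subsystem holds.

class GuiObject:
    def __init__(self, name, **props):
        self.name = name
        self.props = props
        self.children = {}

    def add(self, child):
        self.children[child.name] = child
        return child

    def find(self, name):
        """Resolve a child by its recorded name, as a recognition tool would."""
        return self.children[name]

app = GuiObject("Notepad")
dialog = app.add(GuiObject("Font"))
combo = dialog.add(GuiObject("Font:",
                             items=["Arial", "Courier New", "Times New Roman"],
                             enabled=True))

# Mirrors Application("Notepad").Dialog("Font").WinComboBox("Font:").GetContent
mylist = app.find("Font").find("Font:").props["items"]
print(mylist)                   # every item, visible or not
print(combo.props["enabled"])   # object state is readable directly
```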
Example: Find Lake Superior
LakeSuperior.png
• Most tools cannot do this
• QTP, SilkTest, and TestArchitect have recently been given image recognition capabilities
Image Recognition Test Tools
Image Recognition Based Tools
Tool Name Vendor Web Site
Eggplant TestPlant Inc. http://testplant.com
Sikuli Open Source http://sikuli.org
ATRT Innovative Defense Tech. http://idtus.com
SilkTest * Micro Focus/Borland http://microfocus.com
Unified Functional Testing * HP/Mercury http://hp.com
TestArchitect * LogiGear http://testarchitect.logigear.com
* Product primarily supports object recognition
Image Recognition Test Tools (1)
• Deal exclusively with what is visible on the screen
• Objects are recognized by using a snapshot to find matching image(s) on the screen
• Some tools (Eggplant and ATRT) run on a different system than the AUT
  – Interact with the AUT via a remote desktop protocol (e.g. VNC or RDP)
  – Not intrusive to the SUT, since VNC and RDP are bundled with the OS

[Diagram: the test tool runs on a host system and connects over TCP/IP, via VNC or RDP, to the System Under Test, where the Application Under Test runs on the GUI subsystem and operating system.]
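The snapshot matching described above amounts to sliding a previously recorded image across the screen until a region matches. A minimal sketch, with integer grids standing in for pixels and no claim to match any tool's actual algorithm:

```python
# Snapshot-based recognition: scan the screen for an exact occurrence of a
# recorded sub-image and report where it was found.

def find_image(screen, snapshot):
    """Return (row, col) of the first exact match of snapshot, or None."""
    sh, sw = len(snapshot), len(snapshot[0])
    rows, cols = len(screen), len(screen[0])
    for r in range(rows - sh + 1):
        for c in range(cols - sw + 1):
            window = [row[c:c + sw] for row in screen[r:r + sh]]
            if window == snapshot:
                return (r, c)
    return None

screen = [
    [1, 1, 1, 1, 1],
    [1, 1, 5, 9, 1],
    [1, 1, 9, 5, 1],
    [1, 1, 1, 1, 1],
]
snapshot = [[5, 9], [9, 5]]

print(find_image(screen, snapshot))  # (1, 2)
```

Real tools add tolerance for color and scale differences, which is where their recognition capabilities diverge most.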
Image Recognition Test Tools (2)
• Eggplant and ATRT can also support a KVM connection for non-intrusive testing
• SUT/AUT must support standard PC KVM connections
[Diagram: the test tool on a host system connects over TCP/IP and VNC to a KVM-over-IP appliance, which is attached to the System Under Test by keyboard, video, and mouse cables.]
Image Recognition Test Tools (3)
• Sikuli, QTP, SilkTest, and TestArchitect run on the same system as the AUT
• With work, Sikuli can be made to run on a separate system
[Diagram: the test tool and the Application Under Test run together on the System Under Test, on top of the GUI and display subsystems and the operating system.]
Image Recognition Strengths & Weaknesses
• Strengths
  – (Some) are non-intrusive to the SUT
    • Unlimited platform support (if you can connect it, you can test it)
  – They see what the human sees; not constrained to standard objects
    • If you can see it, you can manipulate it
  – They have sophisticated image recognition capabilities
    • Capabilities vary widely between tools
    • Find a previously recorded image on the screen
• Weaknesses
  – They have no access to the underlying GUI subsystem
    • e.g. Getting the contents of a list box requires complex scripting logic
    • Can only access properties that are visible; if you can't see it, you're out of luck
  – Image recognition can require large collections of images
    • e.g. Button enabled image, button disabled image
  – Text recognition is via OCR only, relatively slow and unreliable
  – Execution on a different system makes some operations very difficult
    • e.g. Reading a log file on the SUT
  – Execution can be slow; requires additional work to tune for best performance
  – Different OS versions may require different image libraries (e.g. Win7 vs Win10)
Example: Get the Contents of a List
• You have to write code! (the slide shows example pseudocode)
• "If it isn't visible, it doesn't exist"
• You must explicitly control what is (or becomes) visible
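A runnable sketch of that pseudocode idea follows. The OCR step and the scrolling are simulated stubs, and it assumes item texts are unique; the point is the shape of the loop an image-based tool forces on you.

```python
# With an image-based tool you cannot ask the list for its contents:
# you read what is visible, scroll, and repeat until nothing new appears.

FULL_LIST = ["Arial", "Calibri", "Courier New", "Tahoma", "Times New Roman"]
VISIBLE_ROWS = 2  # how many items fit in the list box at once

def ocr_visible(offset):
    """Stand-in for OCR: returns the item texts currently scrolled into view."""
    return FULL_LIST[offset:offset + VISIBLE_ROWS]

def read_whole_list():
    items, offset = [], 0
    while True:
        new = [t for t in ocr_visible(offset) if t not in items]
        if not new:              # scrolled past the end: nothing new came into view
            return items
        items.extend(new)
        offset += VISIBLE_ROWS   # simulate pressing Page Down

print(read_whole_list())  # the full list, two visible rows at a time
```

Compare this loop to the one-line GetContent call on the object-recognition side: same result, much more machinery.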
Example: Find Lake Superior
LakeSuperior.png
// Eggplant code: Find Lake Superior, print a message.
If ImageFound(5.0, "LakeSuperior.png") Then
  put "Found the image at: " & FoundImageLocation()
End If
Testing Mobile Apps
• Mobile is the 'next big thing'
• Mobile applications are overtaking PC applications
• Tool vendors are racing to add mobile test capability to their tools
• Both object-based and image-based architectures apply to mobile testing
  – The test tool generally runs on a separate host
  – Object-based architectures rely on native or add-on software that exposes the GUI subsystem's objects
  – Image-based architectures use the remote desktop features bundled with the device OS
Which Tool Should I Choose?
SUT – System Under Test; AUT – Application Under Test

[Decision flowchart, shown here as a checklist:]
• Is there a requirement to test the AUT through the GUI? If no, consider a non-GUI tool first.
• Is there a requirement that the tool be non-intrusive to the SUT? If yes, consider the image architecture first.
• Is the AUT browser based (i.e. a web app)? If yes, consider the object architecture first.
• Does testing involve verifying large numbers of pictures/images? If yes, consider the image architecture first.
• Mobile platforms? If yes, consider either architecture.
• Windows OS only? If yes, consider the object architecture first; if no, consider the image architecture first.
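The decision flow can be encoded as a small function. The question order and routing below follow the flowchart as reconstructed here, so treat the result as a starting checklist rather than a verdict.

```python
# Tool-architecture checklist from the slide, as a function. The routing is
# a reconstruction of the flowchart, not an authoritative decision procedure.

def choose_architecture(gui_required, must_be_nonintrusive, browser_based,
                        many_images, mobile, windows_only):
    if not gui_required:
        return "non-GUI tool"
    if must_be_nonintrusive:
        return "image architecture"
    if browser_based:
        return "object architecture"
    if many_images:
        return "image architecture"
    if mobile:
        return "either architecture"
    return "object architecture" if windows_only else "image architecture"

# A desktop-only Windows AUT with no special constraints:
print(choose_architecture(True, False, False, False, False, True))  # object architecture
```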
THE END
Thank you!