
The rise of Chrome

Jonathan Tamary and Dror G. Feitelson
PeerJ Computer Science 1:e28, Oct 2015

The Question

• Since 2009 Chrome has come to dominate the desktop browser market

• How has it done this?

• Specifically, is it technically better than the competition?

• Implication: is its rise different from that of Internet Explorer?

“Technically Better”

• Better performance
– Measure using common industry benchmarks
– Add measurements of startup time

• Better conformance
– Measure using common industry benchmarks

• Better features
– Release features earlier
– Focus on features that users consider important

The Competition

• Microsoft Internet Explorer 8-11
– Previously the dominant browser
– Bundled with Windows 98; 76% share in 1999
– Antitrust case 1997-2002

• Mozilla Firefox 3-26
– Branched from original Netscape code
– Tried to compete with Explorer
– Reached 30% in 2009

• Google Chrome 1-31

PERFORMANCE

Performance Benchmarks

Benchmark       Content                       Response
SunSpider       Javascript tasks              Time
BrowserMark     General browser performance   Score
CanvasMark      <canvas> tag                  Score
PeaceKeeper     Javascript tasks              Score
Startup times   Cold startup time             Time

Methodology

• Download and measure all browser versions
– Some did not work with some benchmarks

• Perform measurements on a dedicated machine

• Use Win7 32-bit for versions until May 2011, 64-bit for later versions
– Results on the two systems were consistent

• Repeat measurements and calculate standard error
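
To make the aggregation concrete, here is a minimal sketch (not the authors' code) of turning repeated runs into a mean and a standard error:

```typescript
// Minimal sketch, not the authors' code: aggregate repeated benchmark
// runs into a mean and a standard error of the mean.
function meanAndStdErr(runs: number[]): { mean: number; stdErr: number } {
  const n = runs.length;
  const mean = runs.reduce((s, x) => s + x, 0) / n;
  // Sample variance with Bessel's correction (n - 1).
  const variance = runs.reduce((s, x) => s + (x - mean) ** 2, 0) / (n - 1);
  return { mean, stdErr: Math.sqrt(variance / n) };
}

// Hypothetical example: three repetitions of a timing benchmark (ms).
console.log(meanAndStdErr([212.4, 209.8, 214.1]));
```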

Data Collection

Benchmark       Repetitions   Missing versions
                              Explorer   Firefox        Chrome
SunSpider       3             --         --             30, 31
BrowserMark     3             8          3, 3.5, 3.6    1
CanvasMark      3             8          3              1, 2, 3
PeaceKeeper     3             --         --             1, 2, 3, 4
Startup times   20            --         --             --

SunSpider

• Developed by WebKit

• Measures core Javascript performance: tasks rather than microbenchmarks

• Debate over whether it is representative

• Explorer's improvement attributed to dead-code elimination, perhaps added specifically to improve SunSpider performance
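
A schematic illustration of the dead-code-elimination point (not actual SunSpider code): if a benchmark computes a value that nothing observes, an optimizing engine may legally skip the work, inflating the score.

```typescript
// Schematic only: the loop's result is never observed, so an engine
// that proves `sum` is dead can remove the loop entirely, and the
// "measured" time drops to near zero without any real speedup.
function benchmarkBody(): void {
  let sum = 0;
  for (let i = 0; i < 10_000_000; i++) {
    sum += Math.sqrt(i); // dead: nothing ever reads `sum`
  }
}

const t0 = performance.now();
benchmarkBody();
console.log(`elapsed: ${(performance.now() - t0).toFixed(2)} ms`);
```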

BrowserMark 2.0

• Developed by Rightware, originally for mobile and embedded

• Multiple aspects of general browser performance
– Page load/resize
– WebGL
– HTML5, CSS3
– Canvas

CanvasMark 2013

• Test HTML5 <canvas> tag: container for graphics drawn with Javascript

• Stress test using
– Bitmap operations
– Alpha blending
– Shadows
– Text

• Goal: stay > 30fps
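
A minimal sketch in the spirit of CanvasMark (not the real benchmark; sizes and counts are assumptions): draw many alpha-blended, shadowed shapes per frame and report the achieved frame rate.

```typescript
// Minimal CanvasMark-style stress loop: draw 500 alpha-blended,
// shadowed rectangles per frame for 5 seconds and report the achieved
// frame rate; the goal is to stay above 30 fps.
const canvas = document.createElement("canvas");
canvas.width = 800;
canvas.height = 600;
document.body.appendChild(canvas);
const ctx = canvas.getContext("2d")!;

let frames = 0;
const start = performance.now();

function frame(): void {
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  ctx.globalAlpha = 0.5; // alpha blending
  ctx.shadowBlur = 8;    // shadows
  ctx.shadowColor = "black";
  for (let i = 0; i < 500; i++) {
    ctx.fillStyle = `hsl(${(i * 7) % 360}, 70%, 50%)`;
    ctx.fillRect(Math.random() * 760, Math.random() * 560, 40, 40);
  }
  frames++;
  if (performance.now() - start < 5000) {
    requestAnimationFrame(frame);
  } else {
    console.log(`~${(frames / 5).toFixed(1)} fps`);
  }
}
requestAnimationFrame(frame);
```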

PeaceKeeper

• Developed by FutureMark

• Focus on Javascript use
– <canvas>
– DOM tree ops
– Parsing
– Video formats
– Multithreading

Startup Times

• At boot, run a script that launches the browser 1 minute later

• Take a timestamp just before launching and pass it to the browser via a URL parameter

• The browser loads a page that takes another timestamp and sends it to the server

• The difference between the timestamps is the startup time
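
A minimal sketch of the measurement page's script; the parameter name `t0` and the `/report` endpoint are assumptions, not the authors' actual names.

```typescript
// Minimal sketch, assumed names: `t0` is the URL parameter carrying
// the timestamp taken just before the browser was launched, and
// "/report" is a hypothetical endpoint on the measurement server.
const launchedAt = Number(new URLSearchParams(location.search).get("t0"));
const loadedAt = Date.now(); // second timestamp, taken by the loaded page
const startupMs = loadedAt - launchedAt; // the startup time

// Send the result to the server for collection.
fetch(`/report?startupMs=${startupMs}`);
```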

CONFORMANCE

Conformance Benchmarks

Benchmark              Content                       Response
HTML5 compliance       HTML standard                 Score
CSS3 test              CSS standard                  Score
Browserscope security  Security-enhancing features   Tests passed

HTML5 Compliance

• The language to describe web pages

• HTML5 was introduced in 2008 and approved in 2014

• Benchmark checks supported features
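
As a rough sketch of how such a benchmark works (the real test suite is far more extensive), feature detection probes for APIs and elements and tallies a score:

```typescript
// Rough sketch of HTML5 feature detection; the checks and weights
// in the actual benchmark are far more extensive.
const checks: Record<string, () => boolean> = {
  canvas: () => !!document.createElement("canvas").getContext,
  video: () => !!document.createElement("video").canPlayType,
  localStorage: () => "localStorage" in window,
  webWorkers: () => "Worker" in window,
};

let score = 0;
for (const [name, test] of Object.entries(checks)) {
  const ok = test();
  if (ok) score++;
  console.log(`${name}: ${ok ? "supported" : "missing"}`);
}
console.log(`score: ${score} / ${Object.keys(checks).length}`);
```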

CSS3 Test

• The language to describe web page style

• Checks which elements of the CSS spec are recognized

• Does not check quality of implementation

Browserscope Security

• Community-driven project to profile browsers

• Checks support for Javascript APIs for safe interactions

• Score is tests passed

Other Browsers

• We compared the 3 top browsers
• There are others
– Opera
– Safari

• Why did Chrome gain market share while they did not?

• If they are just as good technically, then Chrome’s advantage is only in marketing

Opera

• Main contender in Windows desktop market

• Opera is technically inferior to Chrome

FEATURES

Feature Selection

• Start with 43 features of modern browsers
• Remove 11 that were included in Chrome 1 and already existed in Firefox and Explorer
– Tabs, history manager, pop-up blocking, …
• Remove 7 that were introduced at about the same time by all 3 browsers
– Private browsing, full screen, …

• Use 25 remaining features

Add-ons manager                  Multiple users
Download manager                 Apps
Auto-updater                     Developer tools
Caret navigation                 Personalized new tab
Pinned sites                     Click-to-play
Sync                             Print preview
Session restore (automatically)  Per-site security configuration
Crash/security protection        Web translation
Malware protection               Spell checking
Outdated plugin detection        Built-in PDF viewer
Do not track                     Sandboxing
Themes                           RSS reader
Experimental features

Competitive Advantage

• Feature should be released ahead of competition by a meaningful margin

• Our definition: more than one release cycle

• Gives an advantage to browsers with slow release cycles (Internet Explorer)

Wins and Losses

• A browser wins if it releases a feature ahead of all the competition

• A browser loses if it releases a feature behind all the competition

• Each feature can have at most one winner and one loser

• 7 features with no winner or loser were removed
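
A minimal sketch of the win/loss rule under one plausible reading; the data shapes and cycle lengths are assumptions, not the authors' code, and the slide leaves open whose cycle defines the margin.

```typescript
// Minimal sketch, assumed data shapes: ship dates are months since an
// epoch; cycle lengths are illustrative. Explorer's long cycle means a
// lead over it must be much larger to count as a win, which is the
// advantage to slow-release browsers noted above.
type Browser = "Chrome" | "Firefox" | "Explorer";
const browsers: Browser[] = ["Chrome", "Firefox", "Explorer"];
const cycle: Record<Browser, number> = { Chrome: 1.5, Firefox: 1.5, Explorer: 12 };

// A browser wins a feature if each competitor shipped it more than one
// of that competitor's cycles later; it loses if it shipped more than
// one of its own cycles behind both competitors.
function winner(shipped: Record<Browser, number>): Browser | null {
  return browsers.find(b =>
    browsers.every(o => o === b || shipped[o] - shipped[b] > cycle[o])
  ) ?? null;
}

function loser(shipped: Record<Browser, number>): Browser | null {
  return browsers.find(b =>
    browsers.every(o => o === b || shipped[b] - shipped[o] > cycle[b])
  ) ?? null;
}

// Hypothetical feature: Chrome ships at month 5, the others much later.
console.log(winner({ Chrome: 5, Firefox: 30, Explorer: 36 })); // "Chrome"
console.log(loser({ Chrome: 5, Firefox: 30, Explorer: 50 }));  // "Explorer"
```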

Wins and Losses

Browser            Wins   Losses
Chrome             6      5
Firefox            5      6
Internet Explorer  -      13

Feature Importance Survey

• Online survey
• 254 participants
– HUJI CS facebook page
– TAU CS facebook page
– Reddit.com/r/SampleSize

• Rank each of 25 features on a 5-point scale
– 1 = least important
– 5 = most important
– Use a relative scale (compare features to each other)

Analysis Questions

• Are features that Chrome won more important than those that Firefox won?

• Are features that Chrome lost less important than those that the other two browsers lost?

• Are features that Chrome won more important than those it lost?

• Need to compare scores for sets of features

Statistical Analysis

Conventional approach:
• Find the average importance grade of the features in each set
• Use a statistical test for significance

This is wrong!
• Scores do not come from a valid interval scale
– The difference between 1 and 2 is not necessarily the same as between 3 and 4
• Averaging is meaningless

Statistical Analysis

• Use methodology developed to compare brands

• Brand A is superior to B if distribution of opinions about A dominates distribution of opinions about B in the stochastic order sense

• In plain English: the distribution is skewed toward higher scores (the CDF is lower)

• In our case, “brands” are sets of features (not Microsoft, Google, Mozilla)

Statistical Analysis

• The problem: neither distribution may dominate the other (the CDFs cross)

• Solution procedure:
1. Identify homogeneous brands with clustering. These are brands that cannot be distinguished.
2. Find widest collapsed scale. Collapsing unites adjacent score levels to achieve dominance, but we want to keep as many levels as possible.
3. Verify that resulting dominance is significant.
• Due to Yakir & Gilula, 1998
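
A minimal sketch of the core dominance check (not the full Yakir & Gilula procedure): compare the CDFs level by level, and collapse adjacent score levels when neither distribution dominates.

```typescript
// Minimal sketch of stochastic dominance between two score
// distributions; not the full Yakir & Gilula (1998) procedure.
function cdf(dist: number[]): number[] {
  const out: number[] = [];
  let acc = 0;
  for (const p of dist) out.push((acc += p));
  return out;
}

// A dominates B if A's CDF lies at or below B's at every score level
// (probability mass is skewed toward the higher scores).
function dominates(a: number[], b: number[]): boolean {
  const ca = cdf(a);
  const cb = cdf(b);
  return ca.every((v, i) => v <= cb[i]);
}

// Collapse two adjacent score levels, e.g. merging 3 and 4 into "3-4"
// as in the results tables below.
function collapse(dist: number[], i: number): number[] {
  return [...dist.slice(0, i), dist[i] + dist[i + 1], ...dist.slice(i + 2)];
}

// Example with the reported distributions for wins (Chrome vs Firefox):
const chromeWins = [0.16, 0.20, 0.24, 0.23, 0.17];
const firefoxWins = [0.27, 0.22, 0.23, 0.20, 0.09];
console.log(dominates(chromeWins, firefoxWins)); // true: Chrome's CDF is lower
```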

Results for Wins

Rank  Browser            Wins   Importance scores distribution
                                1      2      3      4      5
1     Chrome             6      0.16   0.20   0.24   0.23   0.17
2     Firefox            5      0.27   0.22   0.23   0.20   0.09
3     Internet Explorer  0      --     --     --     --     --

Results for Losses

Rank  Browser            Losses   Importance scores distribution
                                  1      2      3-4    5
1     Firefox            6        0.17   0.16   0.45   0.22
      Internet Explorer  13
2     Chrome             5        0.18   0.16   0.44   0.22

Results for Chrome

Rank  Class   Features   Importance scores distribution
                         1-2    3      4      5
1     Losses  5          0.33   0.18   0.27   0.22
2     Wins    6          0.36   0.24   0.23   0.17

END GAME

Summary of Results

Benchmark              Result
SunSpider              Chrome was best through 2010; now Internet Explorer is significantly better
BrowserMark 2.0        Chrome is best, Explorer worst
CanvasMark 2013        Chrome is relatively good but inconsistent, Firefox worst
PeaceKeeper            Chrome is significantly better
Start-up times         Initially Chrome was better, but now Firefox is better; Explorer has deteriorated
HTML5 Compliance       Chrome is better, Explorer worst
CSS3 Test              Chrome is better
Browserscope Security  Chrome is better, Firefox worst

Summary of Results

• Chrome won on 6 features and lost on 5
• Firefox won on 5 features and lost on 6
• Internet Explorer lost on 13 features
• Chrome’s wins were more important than Firefox’s wins
• The losses of all browsers were equally important
• Chrome’s losses were slightly more important than its wins

Implications

• Internet Explorer is proprietary
• Firefox is open source
– More innovative and better than the product from a leading software firm
• Chrome is related to open-source Chromium
– But how “open” is it?

• The main factor is apparently the company/organization and not the development style

• Firefox & Chrome also moved to rapid releases
– Slow releases could have contributed to Explorer’s demise

Threats to Validity

• How to measure market share
– netmarketshare.com claims Explorer dominates

• Focus on technical aspects
– Ignores the marketing campaign and the Google brand

• Didn’t check all smaller browsers
– If one is better than Chrome, then marketing was decisive

• The benchmarks used do not cover all aspects (and it is not clear exactly what they do cover)
– But writing new ones would suffer no lesser threats

Conclusions

• Chrome’s rise is consistent with technical superiority

• But it is not necessarily the result of technical superiority

• The Google brand name and the marketing campaign may have had a significant effect