TopCoder Crowdsourcing


Transcript of TopCoder Crowdsourcing

Page 1: TopCoder Crowdsourcing

10 BURNING QUESTIONS ON CROWDSOURCING:

YOUR STARTING GUIDE TO OPEN INNOVATION AND CROWDSOURCING SUCCESS.


Page 2: TopCoder Crowdsourcing

THIS BOOK IS DEDICATED TO THE TOPCODER COMMUNITY. YOU NEVER CEASE TO AMAZE US.

EVERY GRAPHIC, ICON, COPY FORMATTING, AND PAGE DESIGN YOU ARE ABOUT TO EXPERIENCE IN THIS BOOK, AND THE CONVERSION WIDGETS AND BANNERS SEEN ON TOPCODER.COM, WERE CREATED BY THE TOPCODER COMMUNITY VIA OPEN COMPETITIONS. DURING THE PROCESS OF WRITING, EDITING, AND PREPARING THIS BOOK FOR PUBLIC CONSUMPTION, WE LEVERAGED TWO EXCEPTIONAL COPILOTS: ONE WITH A FOCUS ON DESIGN ELEMENTS AND ONE WITH A FOCUS ON WEBSITE IMPLEMENTATION AND CONVERSION. THIS BOOK IS DEDICATED TO ALL THE AMAZING AND TALENTED INDIVIDUALS WHO HELPED BRING IT TO LIFE.


Page 3: TopCoder Crowdsourcing

INTRODUCTION

So you and your enterprise want to use Crowdsourcing and Open Innovation to accelerate innovation cycles and increase productivity - great, we don't blame you! Open Innovation platforms implementing Crowdsourcing techniques are proving time and again to be an exceptional way to produce amazing assets and solve complex problems. In short, this is an exceptional way to innovate, but like all else in business and in life, you will have more success if you are prepared and understand what you are after.

Perhaps you have read a case study or two, or otherwise heard about the successes and promises of Open Innovation and Crowdsourcing, and it piqued your interest enough that you now find yourself downloading and reading this eBook. Well, we thank you, welcome you, and commend you for taking this next step. This starting guide is about your path, your understanding, and your ultimate success in Open Innovation and Crowdsourcing.

WELCOME, JUMP RIGHT IN!



Page 4: TopCoder Crowdsourcing

WHAT YOU ARE ABOUT TO ENJOY

In total, this starting guidebook will answer 10 burning questions on Open Innovation and Crowdsourcing that every person (and enterprise) venturing into this world of massively parallel production should understand before they get going. Each answer will explain the importance of the question – why it matters to you – and the specific way in which that question is answered for the TopCoder Enterprise Open Innovation (EOI) platform.

The world has shifted. There is a global talent pool with exceptional and specific skill-sets waiting for their opportunity to produce for you. Through Open Innovation and Crowdsourcing, you can innovate and deliver like never before, and we hope this starting guide is your launching pad to amazing success.

THE STARTING GUIDE BREAKDOWN

THIS STARTING GUIDE IS BROKEN DOWN INTO 3 PARTS, WITH 10 CHAPTERS IN TOTAL, ONE CHAPTER DEDICATED TO EACH BURNING QUESTION. THE 3 PARTS ARE:

LOGISTICS - HOW DOES CROWDSOURCING LOGISTICALLY WORK AND WHAT MUST I DO TO GET STARTED?

EXECUTION - HOW DO I PRODUCE IN OPEN INNOVATION AND CROWDSOURCING?

GROWTH – HOW DO I SCALE OPEN INNOVATION AND TRANSFORM YOUR ENTERPRISE?

PLEASE ENJOY THE 10 BURNING QUESTIONS ON CROWDSOURCING AND BEST OF LUCK ON YOUR CONTINUING JOURNEY TO PERSONAL AND ENTERPRISE SUCCESS!

Page 5: TopCoder Crowdsourcing

TABLE OF CONTENTS

PART 1: LOGISTICS - HOW DOES CROWDSOURCING LOGISTICALLY WORK AND WHAT MUST I DO TO GET STARTED? (PAGE 05)

How do I decide how much money to offer “the crowd”?

How long should my contests run for?

How do I decide and pick the winners?

PART 2: EXECUTION - HOW DO I PRODUCE IN OPEN INNOVATION AND CROWDSOURCING? (PAGE 13)

How do I prepare a project for success?

How do I ensure the right talent is working on my project or build?

How does iteration and subject matter expertise work with "the crowd"?

How do I manage my Open Innovation & Crowdsourced competitions?

PART 3: GROWTH – HOW DO I SCALE OPEN INNOVATION AND TRANSFORM YOUR ENTERPRISE? (PAGE 26)

How do I accomplish complex as well as simple tasks?

How do we protect Intellectual Property?

How can we truly scale Open Innovation?

Page 6: TopCoder Crowdsourcing

PART 1 - LOGISTICS: HOW DOES CROWDSOURCING LOGISTICALLY WORK AND WHAT MUST I DO TO GET STARTED?

You host a competition and you get the winning solution, idea, or digital asset… What’s so hard about that!?

Yes, it can be that easy. Before we dive into any and all of the forthcoming Q&A, please realize one over-arching point: the platform and community/crowd you choose to engage makes all the difference.

Seek a platform that is truly focused on your outputs and your innovations; one that doesn’t ask you to get bogged down in the minutia, but instead allows you to focus on your productivity and your value creation. The community of solvers, producers, and contributors you engage through Open Innovation and Crowdsourcing competitions should be both wide and “deep”, meaning you should be able to accomplish a wide variety of work via your community, while accessing hyper-specialized talent with very specific skill-sets that compete for you in each unique competition, contest, or challenge.

TopCoder focuses on digital asset creation through competitions and we canvass an array of specific competitions in that space. Other communities that leverage Crowdsourcing focus on things such as video production, localization, and physical mobile application testing on specific handsets. Understand your needs, do your homework, and then choose wisely. Regardless of which communities you work with in the future, the following content will help you prepare for repeatable success.

Let’s dive into the questions.


Page 7: TopCoder Crowdsourcing


BURNING QUESTION #1: HOW DO I DECIDE HOW MUCH MONEY TO OFFER ‘THE CROWD’?

Why this question matters to you:

Crowdsourcing and Open Innovation platforms are an evolved marketplace and economic factors are in play, meaning you need to correctly price tasks, work, and challenges in order to draw out the best talent from any given community so that they will work on your projects. One of the major benefits of Crowdsourcing is the fact that it is mainly meritocratic, meaning the lion’s share of the prizes goes to the winning contributors and solvers and you pay for the best outputs, not effort. So you have to make sure you’re offering a fair market value for the “lift” you are expecting of the individuals within the community. Remember, they – the community members creating value for you – are taking on risk and very often have no guarantee of payment, so pricing a contest fairly is crucially important to overall success. At the same time, you need to work within your budget. Different Crowdsourcing and Open Innovation communities have an incredible variety of payments being offered, spanning from literal pennies for the completion of a “micro-task” to prizes of over a million dollars for solving really large, complex problems and algorithmic challenges.



Page 8: TopCoder Crowdsourcing

SO HOW DO YOU DECIDE HOW MUCH MONEY TO OFFER ‘THE CROWD’ FOR EACH SPECIFIC COMPETITION OR CHALLENGE?

ANSWER:

You shouldn’t. This is part of the “minutia of Crowdsourcing” we discussed earlier and it should not be a focus of yours. Of course you have a budget, but setting prize monies for specific challenges should be based on the historical success of similar challenges on whichever platform you are using. You should be working on a platform that offers you clear and transparent visibility into the market price for each specific type of task or challenge.

Specifically on the TopCoder EOI Platform:

TopCoder has been running atomized challenges and contests for more than a decade. Our metrics (based on thousands & thousands of successful competitions), user dashboards with transparent access to current market analytics, and Platform Specialists (a client resource that helps our clients get the very most out of the TopCoder Community and Platform) suggest the right level of pricing for each of your specific competitions. We know what pricing level works best and for what type of competition. You of course have final say and must green-light any given project or competition, but whether it be something “small” like an icon design contest or something quite “large” such as a Big Data long-form algorithmic challenge, the TopCoder Platform history will guide you.

On our platform, your focus needs to stay on the results and the value you can add by helping the community create for you, not lost in the weeds of deciding what prize monies will be most effective for your particular contest. Stay focused on your success; you will hear this theme a few more times throughout this guide.

Page 9: TopCoder Crowdsourcing


BURNING QUESTION #2: HOW LONG SHOULD MY CONTESTS RUN FOR?

Why this question matters to you:

Open Innovation and Crowdsourcing isn’t about running one stand-alone challenge and being done with “it”. Instead, it’s about transforming how you approach the very notion of work, how you access global talent, and ultimately, how you produce, innovate, and succeed like never before. Achieving this long-term success means making Open Innovation part of your daily routine, and for that to happen, you have to have an understanding of how long things will take to get done and delivered back to you. You have deadlines, people to answer to, and projects that have to cross the finish-line on time.

From the standpoint of the community or crowd you are working with, contests that are either too short or too long in duration hinder the maximum participation that will drive the best results.

SO IN SHORT, KNOWING HOW LONG YOUR CONTESTS NEED TO RUN FOR IN ORDER TO BE SUCCESSFULLY COMPLETED BECOMES A CRUCIAL PIECE OF YOUR SUCCESS IN CROWDSOURCING AND OPEN INNOVATION.


Page 10: TopCoder Crowdsourcing


HOW LONG SHOULD MY CONTESTS RUN FOR?

ANSWER:

It greatly depends on the type of work you are putting out to a community and crowd, and much like the first question, you should be seeking guidance from the operators of the platform. You shouldn’t be worrying about how long a particular contest is going to run for. Instead, your focus should be on the entirety of the project, and orchestrating multitudes of projects in unison to vastly increase your personal productivity via Open Innovation and Crowdsourcing. In short, it’s not about ‘how long’, it’s about how predictable and repeatable a certain type of contest is, so you can effectively set a contest up for success, “walk away” from it while the competition is on-going and doesn’t require your active involvement, and in the meantime, work on other projects, pieces of projects, and yes, even run more competitions to really force-multiply your efforts.

Specifically on the TopCoder EOI Platform:

The very vast majority of competitions on the TopCoder Platform – with the exception of Bug Races, which are a “first to fix” type of contest – are strategically time-boxed, meaning they have a pre-determined and set duration for the specific type of challenge. For instance, a Wireframe competition will typically last between 2 – 4 days per round of active competing, while a System Architecture or a UI Prototype competition will last around 10 days. Again, the more important factor is the predictability you are granted. When you trust you will receive quality and innovative work by the set deadlines, you can work smarter, more efficiently, and more effectively.

Do TopCoder competitions sometimes need an extension? Of course, it does happen, but you can rest assured managing time extensions is sincerely easy, and most often it wouldn’t be you needing to make the adjustment; your Copilot – which we will cover during the growth & scaling portion of this book – can handle this for you.

Page 11: TopCoder Crowdsourcing


BURNING QUESTION #3: HOW DO I PICK THE WINNERS?

Why this question matters to you:

This is all about your time and understanding your responsibilities. If you and your co-workers indeed must be the ones choosing the winners, it behooves you to set aside time to do this extraordinarily important task. That time should be factored into your work schedule and this responsibility is something to take quite seriously – after all, the winning submissions are your outputs! There are times when choosing a winner should not be your responsibility, for example if you are too busy, or would like technical experts to serve as judges.

From the standpoint of the community or crowd you are working with, judging has to be absolutely transparent, fair to the members, and when appropriate, standardized. To keep a community engaged and producing creative outputs, fairness and consistency in judging is paramount. Selection of winners must happen on the expected schedule so that the individual members can most effectively manage their projects and workload.

UNDERSTAND YOUR RESPONSIBILITIES. IF YOU ARE DECIDING A WINNER, DEDICATE TIME TO YOUR SELECTION PROCESS AND ALWAYS TREAT THE COMMUNITY OR CROWD MEMBERS FAIRLY. STICK TO YOUR PRE-DETERMINED SCHEDULES AND FINISH YOUR CONTESTS ON-TIME.


Page 12: TopCoder Crowdsourcing


HOW DO I PICK THE WINNERS?

ANSWER:

Depending on the type of competition, your involvement in choosing the winners will vary tremendously. Why? Well, if it is a subjective contest, such as a design competition, it is very likely you would prefer to choose the design you and your team feel is the best fit. Why would you hand over that decision to anyone but your team? Perhaps you might see benefit in having an outside team pick their favorite to see how it compares to your choice, but ultimately, the final decision is likely best to remain with you.

Then there are contests where the outputs can be measured against a certain set of pre-determined criteria. For instance, in a data compression competition, you simply might be looking for the solution that performs the greatest while retaining a certain level of identified data integrity. Here, the winner would simply be decided by the performance of the solution.

Specifically on the TopCoder EOI Platform:

TopCoder canvasses 3 distinct “Pillars of Digital Creation”: FEI or Front-End Innovation, Software Development, and Analytics & Algorithms. Let’s quickly visit your responsibility in each of these pillars as it pertains to selecting winners.

Selecting Winners on Each Track

FRONT-END OF INNOVATION – SUBJECTIVE (Client Decides Winner)

SOFTWARE DEVELOPMENT – OBJECTIVE (Rigorous Scorecards Decide Winner)

ALGORITHM ANALYTICS – PERFORMANCE (Best Solution against pre-determined metrics wins)

Page 13: TopCoder Crowdsourcing


Front-End Innovation (FEI) deals with mainly subjective contests where design and user-experience are the focus. These can be idea or conceptualization contests, UI/UX innovations, storyboards, icons, click-able prototypes, and more. For these types of contests you and your team are responsible for selecting your favorite submissions.

Software Development is completely different. Deciding between a “better” piece of Java code could engulf your entire day. Instead, at TopCoder, each piece of software developed via competition is peer reviewed by 3 community members who utilize didactic and language-specific scorecards. These scorecards – and what they seek to grade – can be (and often are) edited to fit a specific client’s needs. The work is graded, the top-rated piece of code wins, any remaining flaws are documented and fixed by the winning competitor, and the final output is your deliverable.

Algorithms & Analytics is most often completely objective. Well before an algorithmic competition or Big Data challenge commences, our team works with you to highly define the parameters of success and how the solutions will be graded against one another. At the end of the specified competition period, there is a clear winner that simply performed the best within the parameters pre-described. There is one exception in this “pillar”. We also host Big Data ideation competitions that ask our global community to provide ideas on how they would utilize a large data set and the assets and applications they would seek to create from the data. This is more of a “Front-End” exercise and therefore the outcomes are subjectively evaluated and judged.

LET'S DEFINE IT

WHO ARE PEER REVIEWERS AT TOPCODER? Reviewers at TopCoder are community members who have been elevated to this position of status within the community. They are offered this position based on consistently strong work and must retain a certain level of competition and success in that domain to remain a peer reviewer. We utilize peer reviewers for software development competitions.

Page 14: TopCoder Crowdsourcing


PART II: EXECUTION – HOW DO I PRODUCE IN OPEN INNOVATION AND CROWDSOURCING?


Yes! You have now decided to utilize Open Innovation and Crowdsourcing to succeed like never before!

You now have a better understanding of how it all works. But understanding why and executing how are quite different. This part will concentrate on your execution and how your consistent approach will lead to repeatable success in Crowdsourcing and Open Innovation.

As you read through this section of the starting guide, think about how good habits are formed and how consistent execution becomes a reality. No matter what you are executing on, consistent success doesn’t happen overnight and it certainly doesn’t come to be without effort and practice. Treat your Open Innovation and Crowdsourcing execution in the same light. Mastery of these tactics and techniques is not a singular event, but rather a process of learning, experimenting, and achieving.

Page 15: TopCoder Crowdsourcing


BURNING QUESTION #1: HOW DO I PREPARE A PROJECT OR A SERIES OF PROJECTS FOR SUCCESS?

Why this question matters to you:

Lather, rinse, repeatable success: Crowdsourcing and Open Innovation shouldn’t be about running one isolated competition, never to revisit the methodology again. If that were the case, you probably would not have decided to download and digest this starting guide. This question matters because as you master the art of Open Innovation you will learn from your own experiences what works best, what can save you time, and ultimately how you can predictably & repeatedly steer Crowdsourced competitions to resounding success. After all, that’s why you’re here – to succeed.

From the standpoint of the community or crowd you are working with, preparation of the inputs heading to the members is crucial. Vagueness, unclear goals, a lack of necessary shared domain knowledge, and other negative factors can sincerely disrupt your contests and impact your results. Be comprehensive, be clear, anticipate questions the community or crowd will likely ask, and prepare those answers in advance.



Page 16: TopCoder Crowdsourcing


HOW DO I PREPARE A PROJECT OR A SERIES OF PROJECTS FOR SUCCESS?

ANSWER:

In the world of digital creation, developing well-written, appropriately detailed specifications is paramount to success. First and foremost, familiarize yourself with how the community or crowd you are about to engage is accustomed to receiving information. Look at other examples of inputs that community has digested in the past and ask specific questions as to why certain inputs seemed to be favored over others.

Secondly, understand the right level of detail you will need to provide and for which type of contest. If you are seeking solutions that will be judged subjectively, you should purposefully keep the specification or contest input more open-ended and allow the natural creativity of your community to arise. Part of Open Innovation and Crowdsourcing success is your willingness to be surprised! Then there will be other competitions where rigorous documentation and knowledge share of a specific domain is simply essential to procuring a highly valuable contribution from the community. Think hard about how much needs to be shared, prepare that documentation in a way the community is accustomed to receiving it, and make yourself or a subject matter expert from your team available to answer questions that will come from the community.

Finally, ask your platform “provider” if they offer help in creating the specification. You’ll read in the TopCoder EOI section (just below) why this might save you a great deal of time and effort, and result in superior outputs as well.

Specifically on the TopCoder EOI Platform:

Everything detailed above is solid advice. We see idea competitions launched routinely where our clients purposefully keep the input to 2 to 3 short paragraphs because they desire the widest “lens” of potential ideas and solutions. Conversely, we host Big Data and algorithmic challenges where the problem set is defined in amazingly specific ways and the input into the community is quite a large set of data and highly-defined specifications. The specification needs to match the desired action. But there is another, and potentially more important, piece for you to understand about the TopCoder Platform and specifications.

MANY OF OUR CLIENTS PREFER NOT TO CREATE THEIR OWN SPECIFICATIONS FOR CONTESTS THEY ARE RUNNING ON TOPCODER. SO WHERE DO THE SPECIFICATIONS COME FROM?

TopCoder has contests where the output is the specification document. This allows our clients to really stick to strategy & execution and to avoid getting bogged down in transforming real-world projects into atomized competition inputs. Some clients prefer to create their own specifications while others utilize these specialized contests to produce the documentation for them. It is your choice.

Page 17: TopCoder Crowdsourcing


BURNING QUESTION #2: HOW DO I ENSURE THE RIGHT TALENT IS WORKING ON MY PROJECT OR BUILD?

Why this question matters to you:

The right talent will most often procure the right results. Open Innovation and Crowdsourcing are focused on the outputs the competition will breed. The more specific the task or project, the more hyperspecialized the talent you will want to attract to the work.

Attracting 700 coffee-shop baristas will likely do you little good if the project is to rebuild a ’60s era “muscle car” engine. If you are not engaged with the right community (or crowd) for the work or innovation you seek to accomplish, your chances of success diminish greatly. It should be noted: there is a huge difference between the need to recreate something that already exists – a ’60s era “muscle car” engine – versus something that has yet to be solved or created. When the object of the contest is more “open” because no solution yet exists, we often see that near-field repurposing of knowledge and skills can play a crucial role in helping to formulate the new solution.

From the standpoint of the community or crowd you are working with, the individuals that comprise the “crowd” have very specific skill sets and personal desires. Put simply, they really enjoy doing certain types of work that they are either already good at, or want to learn and master. But like everyone else, their time and bandwidth (individually) is limited, so they will gravitate towards work they enjoy, work that is rewarding – both financially and personally – and work that fits their personal bandwidth, meaning the size of the “ask” is a crucially important factor.


Page 18: TopCoder Crowdsourcing

HOW DO I ENSURE THE RIGHT TALENT IS WORKING ON MY PROJECT OR BUILD?

ANSWER:

Which community, crowd, network of solvers, or platform you choose to engage will greatly determine your ability to repeatedly access the hyperspecialized talent you need in order to succeed. Sure, if you can come up with an amazing dollar prize amount and can craft a contest so worthy that all the technology and social blogs are sharing the challenge for you, that could also do the trick. But the big difference is that path is nearly impossible to scale and, again, you’re not here to run one competition. Rather, your focus needs to be on the repeatability of the process, so that you can drive extraordinary value and outcomes for your team and potentially transform how your enterprise approaches innovation and productivity.

Specifically on the TopCoder EOI Platform:

It begins and ends with our amazingly skilled and continuously growing community. TopCoder is a community of now over 485,000 designers, developers and data algorithmists. Besides being a large community made up of mainly hyperspecialists, the TopCoder platform itself is equipped with a prediction algorithm that alerts you, the user of the platform, if your scheduled contest is not attracting the right level of talent to your specific contest. How do we know? We measure a litany of performance metrics for all members, not just ranking their skills in specific disciplines, but also tracking their individual reliability ratings and myriad other factors.

On the occasion when the platform is predicting a less than optimized result, your contest will not fully launch and you’ll be asked to revisit your specification or monetary offering. Perhaps the problem wasn’t well enough defined, or perhaps it’s more complex than you think and therefore a small increase to the prize money offered is in order.

On the TopCoder platform you will not have to worry, or spend time attempting, to attract the right talent to your work. The platform handles this for you.

LET'S DEFINE IT

WHAT IS NEAR-FIELD REPURPOSING? This occurs when a person who is highly trained in a skill or trade applies a specific piece of knowledge they have learned practicing in their domain over to another field. The individual is repurposing what they already know to help solve a problem in a field they have little or no professional experience in. Often, repurposing of knowledge brings a fresh approach, or idea, or solution to a challenge that “in-field” experts simply would not have attempted or even thought of trying.

Page 19: TopCoder Crowdsourcing


BURNING QUESTION #3: HOW DOES ITERATION AND SUBJECT MATTER EXPERTISE WORK WITH THE “CROWD”?

Why this question matters to you:

It may be high time to debunk a myth. Crowdsourcing and Open Innovation are not magic. You do not simply post a contest, only to ignore the contest for a set period of time, and receive amazing results in return. It just doesn’t work that way. You need to be aware that Open Innovation and Crowdsourcing do take effort, and specific effort at that. You will be involved in the success of your Crowdsourced projects in at least two very specific ways – providing iterative feedback and managing your internal subject matter experts, ensuring knowledge flows properly from your internal organization out to the community you have decided to engage. Be prepared to engage work differently, but still be prepared to work!

From the standpoint of the community or crowd you are working with, iteration with the client is a crucial step to their individual success. Remember, the community member is here to succeed too! Their efforts and their ability to create outstanding assets often hinge on receiving quality feedback from the client. Regarding access to subject matter experts: if a particular challenge is complex and the necessary domain knowledge is not shared from your team to the community or crowd, you likely will find participation quite light. The converse can also be true. If domain knowledge is properly shared, the community will likely respond in kind, driving participation and the volume of quality submissions higher.


Page 20: TopCoder Crowdsourcing

HOW DOES ITERATION AND SUBJECT MATTER EXPERTISE WORK WITH THE “CROWD”?

ANSWER:

Iteration: Open Innovation can be quite collaborative and iterative if you approach the work properly and you are engaged with a community that fosters such activities. Often while running an Open Innovation competition you are going to be presented outputs that surprise you. It could be a nuanced and clever idea, a really sharp icon design, or a new approach to a solution your team would likely never have come up with. Very often, outputs as they are first presented are not finished – or perhaps better put, they can use some level of refinement and editing. This is where iteration comes into play. If you are able to offer feedback and specific guidance to competitors during a competition, you should. Be clear, be honest, be respectful, and communicate what is succeeding so far and which areas of the design or solution you feel can be improved. Take this role of providing critical feedback very seriously. You will be amazed how well competitors will take good direction and how advanced their new outputs will be because of the quality feedback you provided.

Subject Matter Experts: Depending on the type of challenge or contest you are seeking to run, you may need to consider how you will effectively transfer knowledge to your community or “crowd” so that they can effectively work on your problem or task. The greatest step you can take is to properly facilitate this knowledge share. Prepare your internal team well beforehand and alert the specific “players” who may need to be involved that they will be required to spend time either presenting or answering questions in which they have a certain domain expertise. Ask the community or Open Innovation “provider” beforehand how this is handled and press for a detailed answer. If you can’t effectively share knowledge outwardly with a community or “crowd”, it might prohibit the range of work and innovation you can attempt.


Page 21: TopCoder Crowdsourcing

Specifically on the TopCoder EOI Platform:

Iteration is a key component to success for many TopCoder competitions. As you might imagine, the more subjective contests – those that favor design and user experience – are where we see the highest level of iteration. In our design (Front-End Innovation) contests we routinely have what is called a “Checkpoint”. This is a simple pause, typically near the middle of the competition, where the competitors’ initial submissions are reviewed, feedback is generated by the client and then delivered back to the members, so they can digest the critique and make the adjustments they believe will help them win the competition.

Subject matter experts can also be crucial to a contest’s success. Often at TopCoder, we will have clients who want our community to work in a specified technology environment or perhaps work on a specific algorithmic challenge where domain knowledge can play a pivotal role in shaping their attempts, and thus their outputs. When this is the case, it is imperative that we foster the lines of communication as much as possible between an internal team’s subject matter expert(s) and our community. Through the use of scheduled forum sessions – where subject matter experts participate in a highly active Q&A exercise with interested community members – our clients and competitors engage in collaborative questioning and the necessary knowledge sharing takes place. We augment these collaborative sessions with the needed written documentation so our community can access additional information they feel they might need in order to compete at a high level.

LET'S DEFINE IT

WHAT IS COLLABORATIVE QUESTIONING? When domain expertise is being shared in an open forum session on the TopCoder Platform, each participant has the advantage of hearing answers to questions they themselves may never have thought to ask. Collaboratively, the community asks scores of questions. Individually, they benefit from the answers and knowledge transfer taking place, and that can assist them in their pursuit to create a better solution for you.

Page 22: TopCoder Crowdsourcing


BURNING QUESTION #4: HOW DO I MANAGE MY OPEN INNOVATION & CROWDSOURCED COMPETITIONS?

Why this question matters to you:

You can master managing Open Innovation competitions just as you can master managing an internal team or external outsourcing partner. Yes, the work itself is different, but as we discussed earlier, there is still work to do – Crowdsourcing is not magic! If a process of managing the work breaks down, it doesn’t matter what methodology you are incorporating to get the work accomplished, you will greatly limit your success. However, if you learn to manage Open Innovation contests effectively, allowing you to have assets created in a massively parallel fashion, you will realize success and productivity on levels you never thought possible.

From the standpoint of the community or crowd you are working with, individual members like to participate in contests and challenges where the management of the contest is professionally executed and consistent. Community members will also gravitate towards your specific challenges and contests once your history and reputation intra-community is established.


Page 23: TopCoder Crowdsourcing


HOW DO I MANAGE MY OPEN INNOVATION & CROWDSOURCED COMPETITIONS?

ANSWER:

Some of the below has already been canvassed in this starting guide, but it bears repeating. After preparing your contests for success, by ensuring the right level of detail has been shared and that you have attracted the right level of talent to your work or challenge, your focus should shift to managing the contest(s). So where should you start? First, you will need to learn to let go. Remember, you are no longer doing the “work” and your role has therefore shifted. Instead, focus on making sure your community or crowd has everything they need answered and, when you can, anticipate what they will ask for and provide it at the onset.

Second, when questions arise from your community about a particular project or task – and please note they will arise, and this is a good sign of healthy participation – answer your potential contributors quickly and as thoroughly as possible. Remember, it is highly probable that people within your community or crowd speak English as a second language. Don’t be afraid to over-explain details; this will help you avoid any miscommunication and time wasted because of it. If you have visuals, screenshots or picture examples you can share, do so if it helps to more effectively communicate your answers. Be clear!

Finally, learn to work in a massively parallel fashion. We will discuss scaling Open Innovation later in this starting guide. For now, understand that you should challenge yourself and seek to conduct several contests at one time. Why? This will force you to not revert back to bad habits of “doing” the work and instead fill your time with orchestrating multitudes of solutions. This is where you want to be. You are a conductor of grand value, so emerge as the maestro by challenging yourself to successfully manage multiple contests at one time. When you arrive at this point, your value soars, you accomplish more, and you successfully innovate faster than ever before.


Page 24: TopCoder Crowdsourcing


Specifically on the TopCoder EOI Platform:

There are two significant things to know about the TopCoder platform. First, it is actually a platform, in that it is constructed so that you can launch, manage, and finish Open Innovation & Crowdsourced contests on-demand. Below are a few snapshots from what would be your dashboard view, which we call “Cockpit”. All of your contests, all of your Copilots (more on their role in just a moment), your budgets, timelines, forums and more, neatly laid out for you so you can manage multiple competitions simultaneously. It is a true productivity platform.

Again, Open Innovation for your business or enterprise is not about running one competition, but rather it is focused on force-multiplying your team’s capabilities to achieve much more production in what becomes a continuous innovation cycle. To achieve this level of repeatable success we created and deliver “Cockpit” to every client and pair the platform with the training needed so you can get going fast, ramp up your production over time, and achieve grand scale as you master Open Innovation management.

Page 25: TopCoder Crowdsourcing


TopCoder Cockpit Dashboard

TopCoder Cockpit Project Analytics

Page 26: TopCoder Crowdsourcing


We just introduced the terminology of TopCoder Copilots. Let’s define their role and how they help you execute Crowdsourced competitions on-demand.

Copilots are your key to scaling your individual efforts. They manage the technical aspects of crafting, launching, and managing competitions all the way through successful delivery. They do the technical heavy lifting and manage the ‘process of Crowdsourcing’ so you can concentrate on strategy and scale. As they answer forum questions and keep your contests on track for success, you focus on the orchestration of all of your competitions with a keen focus on delivering timely feedback and the needed information that results in cleanly written specifications.

Envision yourself directing 3 or 4 Copilots at one time, each of them launching 2 or 3 competitions for you in parallel. Now you are orchestrating 6, 8, or even 11 contests at once, each set to deliver back to you a valuable asset. That is what we call force-multiplication.

Now envision several of your most productive team members multiplying their outputs and delivering for your enterprise or business at 3X, 5X, or even 10X their traditional productivity levels. Do the work necessary to achieve this type of scale with your team and your value as a strategic leader who delivers results will soar.

LET'S DEFINE IT

WHO ARE TOPCODER COPILOTS? Copilots are an elite pool of TopCoder Community members who have proven to be exceptional technical managers of the TopCoder Platform. They help you through all phases of competition – from organizing details for proper specifications, to answering community forum questions, to offering their feedback on what they deem are the strongest solutions – and they are incentivized on the successful completion of the contest. They are solely focused on your success for each contest.

MASSIVELY PARALLEL PRODUCTION OF INNOVATIVE ASSETS (diagram): a Copilot directs competitors across contest types such as Concept, UX Idea Gen, Storyboard, Wireframes, Rapid Prototyping, Architecture, Assembly, Testing, Big Data Challenge, and Optimization Algo, each delivering assets in parallel.

Page 27: TopCoder Crowdsourcing


PART III – GROWTH: HOW DO WE SCALE OPEN INNOVATION AND CROWDSOURCING AND CONTINUOUSLY CREATE VALUE?

Is Crowdsourcing a nice marketing tool to get eyes to your work, or is it transformational, allowing you to innovate faster and produce more, while taking on less risk?

It’s really up to you. Today, global communities and production platforms exist, and your willingness to re-think “work” and push the boundaries of what can be accomplished via Open Innovation will determine how much you can get done. This part of your starting guide is all about your growth, your innovation, and your success. Individually, you are likely looking to grow your skills and emerge as an innovation expert within your enterprise or business. That emergence can be your reality and you can achieve amazing growth.

Build on all that you have learned so far and begin to think about real-world projects and scenarios you can envision benefiting from Open Innovation. Think about how you would structure these projects, the subject matter experts you might bring to the community or crowd, and more, in order to achieve success. As you read through this final part of the starting guide, begin honing in on the details and thinking through them in your mind, or better yet, mapping them out visually.

Page 28: TopCoder Crowdsourcing


BURNING QUESTION #1: HOW DO I ACCOMPLISH COMPLEX AS WELL AS SIMPLE TASKS?

Why this question matters to you:

Are you here to transform how you approach productivity and innovation, or are you here to source more affordable ways to accomplish fairly mundane tasks? Ask yourself this honest question. If your answer is the former, then how much you can accomplish via the methodologies matters tremendously.

From the standpoint of the community or crowd you are working with, individual members tend to be attracted to challenging and complex work if it is well organized and properly defined. There are myriad reasons why: greater potential prize money, the desire to work on challenges their daily “life” does not present to them, the want to solve something that has no existing solution, community peer recognition, and more can motivate a community member and get them excited to help you succeed.


Page 29: TopCoder Crowdsourcing


HOW DO I ACCOMPLISH COMPLEX AS WELL AS SIMPLE TASKS?

ANSWER:

Today’s Crowdsourcing and Open Innovation landscape is maturing rapidly. Sure, there are ample solutions and “crowds” that now exist that can help you do a mundane task, such as physically waiting in line at the DMV for you, or localization services that utilize crowds to translate documents. These can save you time and money. They can make you more efficient, but not more innovative.

The other side of Crowdsourcing encompasses projects, contests, and challenges that are truly innovative, where an unknown or optimized solution is what you are seeking, and it doesn’t exist yet! These types of challenges and contests are more properly defined as Open Innovation.

To recap this important distinction: the output of a Crowdsourced task is the accomplishment of the task itself, likely done at some cost savings and potentially much faster. The focus is on efficiency. The output of an Open Innovation contest is a new (or newly optimized) solution or asset that did not previously exist. The focus is on innovation.

In today’s “market”, you have access to communities and crowds that will help you accomplish both.

Crowdsourcing & Open Innovation – What are you trying to accomplish?

Micro-tasks/Crowdsourcing (Bottom Line Efficiencies):
• Cost Savings
• Faster Results
• Free Up Your Internal Resources
• Focus on Higher Value Tasks

Open Innovation (Top Line Growth):
• New Products
• Breakthrough Discoveries
• Extreme Value Outcomes
• Community Building and Engagement

Page 30: TopCoder Crowdsourcing


Specifically on the TopCoder EOI Platform:

At TopCoder we have over 30 specialized competitions ranging from the very front-end of innovation – contests focused on ideas and concepts – all the way through Bug Hunts and Bug Races – contests to identify, and then fix, verified bugs. As a client you can choose to do a “less complex” singular competition – like a design contest – and then bring that asset back to your team to continue the work internally, OR you can build a digital asset all the way through the software development life-cycle – from ideation through testing & maintenance. It is your choice.

This choice is made possible due to TopCoder’s process of atomization, and one of the consequences of atomization is the incredible flexibility you gain throughout the development life-cycle. At TopCoder we have delivered solutions to complex challenges and projects ranging from Big Data Bio-medics contests, mobile applications to be used in remote and desolate environments, energy harvesting for the International Space Station, legacy system modernization and beyond.

LET'S DEFINE IT

WHAT IS ATOMIZATION? If you were to dissect a sleek iPad app and break it apart into the digital pieces that make up the whole working “unit” and the work that had to happen prior to final digital assembly, you would see highly specific accomplishments in areas such as icon design, user interface design, software engineering, application architecture, wireframes, storyboards, prototypes and more. At TopCoder, when we take on any project, simple or complex, we atomize the project into its smallest units of work. Each piece of the whole is hosted as a competition. That process is atomization.

Page 31: TopCoder Crowdsourcing


BURNING QUESTION #2: HOW DO WE PROTECT INTELLECTUAL PROPERTY?

Why this question matters to you:

Much like we discussed earlier, if you are highly limited in what you can attempt and therefore create via Open Innovation, then the potential value to you and your enterprise diminishes. And that is not why you are here. Protecting intellectual property is a legitimate concern most every enterprise will have when assessing Open Innovation. If it is not your concern, it will be someone else’s from your team. Your best strategy is to understand the risks, the process, and the platform you are engaging, and make educated decisions based on your understanding.

From the standpoint of the community or crowd you are working with, the smaller the barriers to entry to participate in any given contest or challenge, the greater the participation rates will likely be. However, individuals that comprise a community typically understand that additional layers of screening or an additional security process is sometimes necessary so that they can work on what is deemed by the client as an IP-sensitive project.


Page 32: TopCoder Crowdsourcing


HOW DO WE PROTECT INTELLECTUAL PROPERTY?

ANSWER:

This question is impossible to answer in the generic fashion we have used throughout this starting guide. TopCoder does not want to begin to profess how other platforms and communities handle this sensitive subject. What we can communicate generically is that everything has some level of risk. Whether it is traditional hiring, outsourcing, or Crowdsourcing, you are dealing with people. With that said, specifically on our platform, we take this matter extraordinarily seriously. In fact, an upcoming eBook from TopCoder solely focuses on this topic and it canvasses 8 questions on intellectual property in Open Innovation that you should be asking, and why they matter to you.

Specifically on the TopCoder EOI Platform:

As we just mentioned, an upcoming eBook will focus on this issue entirely, but here are some main points to understand right now.

Various levels of NDAs – On our platform, you have the ability to require as many levels of NDAs as you feel necessary. We have provided clients with smaller, select pools of participants – sometimes based on country of origin or other factors that matter to the client – and on occasion we have then asked these individuals to sign more restrictive NDAs directly with the client. We offer this flexibility to our clients so that they can reach their desired level of comfort with any given project. However, we guide our clients to be as open as possible as often as possible so as to not restrict participation from any community members who are looking to compete.

The Role of Atomization – In the previous “burning” question we defined atomization, so chalk it up to good timing, because now we’ll share how atomization positively impacts your work from an intellectual property sharing standpoint.


Page 33: TopCoder Crowdsourcing


As simply as this can be stated: when the work is broken down into its smallest pieces of the “build” and launched as separate competitions, this process innately helps to protect the “whole” or entirety of the project from being unveiled. Often, we have clients that simply cannot discuss an innovative project publicly due to the competitive advantage they can potentially gain, so they opt not to share the entire “project” with our community. They may still desire user interface innovation, or an algorithmic solution that speeds up a computational engine, but they simply don’t feel comfortable unveiling the entire project to our global community. When you need to, on-demand, you can highly isolate work and have the isolated work created via TopCoder while your internal team works on pieces of the project that you perhaps deemed too sensitive to be shared externally. Again, this is about your choice and your gained flexibility via Open Innovation on the TopCoder Platform.

As we stated earlier, an upcoming eBook from TopCoder focuses solely on the challenges of IP in Open Innovation. Subscribe to the TopCoder Blog at www.topcoder.com/blog/ (upper right corner of page) and you won’t miss this eBook’s release.

Page 34: TopCoder Crowdsourcing


BURNING QUESTION #3: HOW CAN WE TRULY SCALE OPEN INNOVATION?

Why this question matters to you:

At the risk of entering “broken record” territory we will say the following: Open Innovation is about the transformation of how you approach productivity and continuous value creation. If it were instead about hosting a singular “one off” challenge that you were never likely to repeat (from a process standpoint), that might be better classified as a complicated marketing initiative you were attempting in order to get new eyes around your brand, with the possible consequence of some gained innovative perception. You may even have outstanding and innovative results from that “one off” competition. But that isn’t repeatable, that isn’t scale, and therefore that isn’t transformative.

From the standpoint of the community or crowd you are working with, when you are consistently launching, effectively managing, and executing contests through the delivery of the newly created asset, the individual community members take notice. If they see a robust pipeline of work stemming from you and your company, paired with your professional and consistent approach to Open Innovation management, they will continually register for and compete on your work. Having that type of community “following” can pay big dividends and help you to achieve scale even faster.


Page 35: TopCoder Crowdsourcing


HOW CAN WE TRULY SCALE OPEN INNOVATION?

ANSWER:

There are 4 key factors you should look for in ascertaining whether a specific Crowdsourcing or Open Innovation platform can help you achieve scale. They are:

Repeatability – How easily can you Lather. Rinse. Repeat.? How often can you launch “open” competitions, and is the effort to do so easy or hard? Delve deep into the process of launching, managing, and closing out individual contests. If it seems difficult to “operate” one competition at a time, you need to ask yourself how you plan on achieving the type of scale you truly want to arrive at. The solution, community, or platform you choose to use should allow you to create in a massively parallel fashion.

Predictability – For any given competition, contest, or challenge, you should have a legitimate understanding of the % of success you are likely to achieve. Of course, no prediction is perfect, but fundamentally, understanding how likely it is that you will succeed (and understanding the true cost of time and money spent if your competition fails) plays a formidable role in how often you attempt. Open Innovation and, to a lesser extent, Crowdsourcing are, at their core, methodologies that lower the level of risk to a specific threshold that allows for more creative attempts to be endeavored. Ask the very difficult question of how predictability is measured on any platform you are considering. The answer will likely reveal how mature the platform is, or is not.


Page 36: TopCoder Crowdsourcing

Differentiation – If you can repeat only one type of task, contest or challenge – even if you can routinely predict a favorable outcome – it likely limits how valuable that community or crowd can be for you. Contracting a Crowdsourcing localization service to handle only your document translation may save you money and speed up your translation processes, but as we discussed earlier, this does not make you more innovative, it simply makes you more efficient. Your ability to canvass a wide variety of tasks and challenges is an important aspect of how you will ultimately scale. The more types of specific work you can accomplish and innovation you can attempt, the greater the transformation that can take hold.

Results – Open Innovation and Crowdsourcing models are mainly outcomes based, meaning you are paying for best-in-breed results (the winning solutions) instead of paying for effort. The level of results you can receive is directly correlated to the level of talent within any given community or crowd. This is as simple as doing your homework. Understand what this community or crowd has accomplished, what it likely can accomplish, and for “extra credit” ask how the platform would adjust to support a newer technology or task it has yet to encounter. The past results of a community and how they likely will perform for you going forward tie directly to your potential success. If you are confident in a particular platform, human nature would dictate you will likely scale that platform to maximize the outcomes.

Specifically on the TopCoder EOI Platform:

Let’s quickly run through each of the 4 key factors impacting your ability to scale Open Innovation and Crowdsourcing, starting with repeatability.

Repeatability – With thousands and thousands of specialized contests specified, prepared, launched and successfully delivered, TopCoder has the most experience the world over in repeatable Open Innovation success. Pair this experience with Copilots, an enterprise-ready management tool (Cockpit), and robust analytics & reporting, and you are delivered the world’s most comprehensive productivity platform the moment you become a client. With TopCoder, you can “Lather. Rinse. Repeat.” on-demand, as many times as you would like.

Predictability – Let’s start with a stat. TopCoder has a contest fulfillment rate over 90%. This means better than 9 out of every 10 contests are successful – delivered on time, at the predefined budget, at a quality level all parties felt was strong. Let’s follow that up with a point. For the less than 9% of competitions that fail, you don’t pay. You’ve heard the mantra: fail fast, fail often, fail cheap. We procure that “type” of failure, which helps ensure your success.

Page 37: TopCoder Crowdsourcing


Technically speaking: Almost every type of competition on TopCoder is tied to a predictive engine that takes into account the level of specialized talent registering to compete on your work, their predictability rating (measuring how often an individual submits a solution after deciding to register for the given contest), and other factors. Before a contest even enters the competitive phases, you have a clear understanding of the likelihood of that specific contest’s success.

Differentiation – We discussed earlier TopCoder’s 3 Pillars of Digital Creation, spanning idea and design contests, to software construction, all the way over to Big Data algorithmic challenges. There is a very large variety of work you can accomplish and innovation you can attempt on the TopCoder Platform. The variety of work you, your team, and your enterprise decide to make happen is just that, your choice. You can master it all and produce like never before.

Results – As mentioned earlier, results are directly correlated to the talent you are accessing. The TopCoder community is now over 485,000 members strong and in 2012 alone we hosted over 4,500 separate competitions. With a fulfillment rate above 90%, our clients enjoy an outcomes-based model that is focused on results and their innovative success.

Page 38: TopCoder Crowdsourcing


YOUR NEXT MOVE: BEGINNING YOUR OPEN INNOVATION AND CROWDSOURCING JOURNEY

Why did you download this starting guide and read through what was a good deal of content? Perhaps it was simply to learn about newer methodologies and lexicons, spurred on by your inner curiosity to continuously improve yourself. We applaud that, sincerely. But likely your want was deeper. Perhaps as an enterprise leader (or an aspiring one), you innately feel the change that is happening all around you. This starting guide and your journey beyond reading it were always about your personal transformation and your ability to dictate your own innovative path forward. The reality of the situation is simple. If you don’t master the methodologies that breed Open Innovation and Crowdsourcing success, your competitor (or soon-to-be competitor) will. And though we appreciate you downloading and consuming this starting guide, and though it may be the catalyst for your next action, it is not a substitute for it.

Your next move in Open Innovation and Crowdsourcing is to get involved in some capacity. There are communities to join that focus on an interest you hold dear, co-creation opportunities to compete on where a specific skill of yours will serve you well, and of course, there are competitions to launch and manage to a successful result. Any of these beginnings gets you closer to your mastery of Open Innovation, and the result of your mastery is highly predictable. You will succeed like never before.

Five frogs sit on a log.

Four decide to jump in.

How many frogs are left on the log?

Five. Deciding is not doing.

Welcome, jump in!

Page 39: TopCoder Crowdsourcing

10 Burning Questions on Crowdsourcing: YOUR STARTING GUIDE TO OPEN INNOVATION AND CROWDSOURCING SUCCESS.

Brought to you by
