Content Solution Quick Start (June 2014)

An annotated slide deck from a webinar hosted by Stilo International and conducted on June 24, 2014. The talk introduces tactics for moving a content solution project forward quickly while also attending to essential details.

Transcript of Content Solution Quick Start (June 2014)

  • Content Solution Quick Start Program. Copyright Joe Gollner 2014. @joegollner
  • Commentary: Introduction. This presentation was delivered as a webinar hosted by Stilo International on June 24, 2014. Initially titled a DITA Quick Start Program, this talk is in fact more general than that. The talk does touch on how the Darwin Information Typing Architecture (DITA) encourages and supports quick start programs. The goal of this talk was to introduce some tactics that have proven useful in getting content solution projects off the ground quickly.
  • An Acronym for All Seasons. "A good plan violently executed now is better than a perfect plan executed next week." - George S. Patton. It should still be a good plan.
  • An Acronym for All Seasons: Analyze, Survey, Articulate, Prototype (ASAP)
  • A Left-Right Combination: Left Jab. Analyze: analyze current content & processes; identify & prioritize improvement opportunities. Survey: solicit stakeholder inputs on opportunities & risks; gain insights into the political dynamics at work.
  • A Left-Right Combination: Right Hook. Articulate: document the change steps to be taken; explain the business drivers behind the changes. Prototype: illustrate new capabilities and key benefits; make the improvement plan tangible & compelling.
  • Commentary: The ASAP Acronym. Adopting an acronym will always make things appear somewhat artificial; hopefully it also makes them more memorable. In this case, ASAP reminds us that each wave of activity should include an element of analysis (where we try to understand the needs & goals) and an element of engagement (where we try to get stakeholders involved in the process). Hence Analyze is balanced by Survey (asking for inputs) & Articulate is balanced by Prototype (showing what is possible).
  • Analyze: Adopt a Content Life Cycle Model. Quadrants: Content Acquisition, Content Management, Content Delivery, Content Engagement.
  • Commentary: The Content Lifecycle. Executive management is typically familiar with quadrant models, and this content lifecycle model works with that common structure for setting out the activities governing content lifecycles. Quadrants on the left are internal and those on the right are client-facing. The upper two focus on the content itself, while the lower two focus on the data & actions applied to content. See The Content Lifecycle.
  • Radical Element: Content Engagement. It stands out as the most novel element in this Content Life Cycle Model. It focuses on how content is used & how the user community can become actively engaged in a process of continuous & constructive change.
  • Analyze: Apply an Evaluation Framework. [Quadrant chart: Content Acquisition, Content Management, Content Delivery & Content Engagement, each plotted on an axis from 1 to 10]
  • Commentary: Evaluation Criteria Each quadrant is amenable to measurement and therefore improvement. It is possible to overlay an evaluation framework where each quadrant can be evaluated and assigned a score between 0 (non-existent capability) and 10 (excellent). The trick is to identify evaluation criteria that can be improved over time (increasing their objective nature) and that can be used to describe target capability in a meaningful way.
  • Analyze: Rate Capabilities & Targets. [Quadrant chart: As Is scores of 2.4, 4.2, 2.8 & 4.0 and To Be scores of 8.3, 8.7, 7.8 & 8.8, plotted on axes from 1 to 10] As Is Score: 45. To Be Score: 281.
  • Analyze: Evaluation Considerations. Normally started in an information vacuum; identifying what can be, or should be, measured is a start. On one project:
    Marking Scheme (Score Assigned / Capability Level): 0 No Score / None; 2 Poor / Minimal; 4 Weak / Inadequate; 6 Fair / Adequate; 8 Good / Competitive; 10 Excellent / Industry Leading.
    Optional Weighting Scheme Applied to Evaluation Criteria: 0.5 less important; 1.5 more important. Every criterion weighted as more important must be balanced by one that is rated as less important.
    Calculating a Total Score: Scores are assigned to each criterion for each quadrant. Scores for criteria are averaged & plotted on an axis from 0 to 10 for each quadrant. The area of the polygon that results is the total score.
    Evaluation Criteria (What is being Evaluated):
    Competitiveness: benchmark comparisons against comparable organizations
    Consistency: the consistency of content details across the collection (a measure of reuse)
    Responsiveness: the extent to which new demands can be met quickly & affordably
    Maintainability: maintainability & supportability of the overall solution
    Measurability: completeness & quality of the measurement data provided
    Usability: efficiency & intuitiveness of all user interactions (supporting user success)
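The scoring mechanics described above can be sketched in a few lines of Python. This is an illustrative reconstruction, not code from the deck: the quadrant ordering, the sample weights, and the use of the sum of adjacent score products (twice the geometric area of the polygon on four perpendicular axes) are assumptions, chosen because that convention roughly reproduces the deck's totals of 45 and 281.

```python
def quadrant_score(criteria_scores, weights=None):
    """Weighted average of 0-10 criteria scores for one quadrant.

    Weights follow the deck's optional scheme (0.5 less important,
    1.5 more important); a plain average when weights are omitted.
    """
    if weights is None:
        weights = [1.0] * len(criteria_scores)
    return sum(s * w for s, w in zip(criteria_scores, weights)) / sum(weights)


def total_score(quadrant_scores):
    """Area-based total for quadrant scores plotted on four perpendicular axes.

    The polygon's geometric area is 0.5 * sum(r_i * r_{i+1}); the deck's
    reported totals appear consistent with the adjacent-product sum itself
    (twice the area), so that is what this sketch computes (an assumption).
    """
    n = len(quadrant_scores)
    return sum(quadrant_scores[i] * quadrant_scores[(i + 1) % n]
               for i in range(n))


# Hypothetical ordering of the four quadrant scores around the chart:
as_is = [2.4, 4.2, 4.0, 2.8]   # total ~45 in the deck
to_be = [8.3, 8.7, 8.8, 7.8]   # total ~281 in the deck
```

One design note: because the total is a product of adjacent scores rather than a simple sum, improving a weak quadrant raises the total more than polishing an already strong one, which rewards balanced capability.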
  • Commentary: Different organizations will have wildly different management cultures & wildly different views on what constitutes meaningful measurement. The approach introduced here is open to whatever measurement strategies an organization will accept. Note that different measurement criteria can be used for different quadrants; also, multiple criteria can be aggregated (e.g., averaged) into the measurement for a given quadrant.
  • Analyze: The Key Points. Apply a structure to organize information: about the current state (as is), its limitations & problems; about the future state (to be), its improvement opportunities. This establishes the basis for future refinements: in what is measured, in how it is measured, and in how measurements can be converted into financial terms. Analysis results usually need corroboration, engaging stakeholders with influence and/or insight, which leads to the need to survey.
  • Survey: Putting the Analysis into Context. A formal & structured approach to gathering inputs, designed to collect information and insights in as authoritative a way as possible. Analysis results can be provided with the survey. A way for people to say what they really think. Surveys are designed to support both quantitative and qualitative research: people provide an initial response from choices, then elaborate.
  • Survey: Ask Questions. Online survey tools (e.g., Fluid Surveys); anonymous responses; independent coding; kept brief; can be tailored to different stakeholder groups & research questions; sound methodology. Project example: Research Questionnaire
    1. Based on your understanding, why is your organization interested in adopting a CCMS?
    2. From your perspective, are there advantages in adopting a CCMS?
    3. From your perspective, are there risks and challenges in adopting a CCMS?
    4. How would you describe the culture of your organization?
    5. How would you describe the culture of your particular work group?
    6a. Do you think the CCMS will change the culture in your work group?
    6b1. If Yes to 6a: How do you think the CCMS will change the culture of your work group?
    6b2. If No to 6a: Please elaborate on why you think the CCMS will not change the culture in your work group.
    6c. Do you think other members of your work group will be receptive to the change?
    7. How would you describe the division of roles and responsibilities in your work group?
    8. How would you describe your role and responsibilities within your work group?
    9. Do you think the transition to the CCMS will affect your role and responsibilities within your work group?
    10. During the CCMS transition, what role and responsibilities would you like to take on?
    11. After the CCMS transition, what role and responsibilities would you like to take on?
    12. How would you describe the status of your work group in your organization?
    13. Do you think the CCMS will change the status of your work group in your organization?
    14. How do you think other work groups in your organization view the CCMS initiative?
    15. Likert-like scale (five levels from negative to positive, with the middle value neutral) applied to five perceived attitudes (respected, trusted, understood, valued, appreciated)
    16. Are you comfortable with the prospect of learning and using new CCMS technology?
    17. What are your expectations for a CCMS solution?
    18. Do you have concerns about the transition to a CCMS?
  • Survey: Leverage Responses. Survey responses add a dimension to the analysis results and can highlight issues that call for specific planning measures. In this case, users highlighted transition challenges due to resource overloading, and measures were taken. Positive, Neutral & Negative codes were applied to: Work Group Worth, Transition Challenges, Expected Outcomes, Reactions to Organizational Change, Reactions to Technology and Processes, Anticipated Organizational Change. [Table: coded response tallies by category for two project examples, TechnoCorp & EduOrg]
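The coding step behind that slide, assigning Positive / Neutral / Negative codes to free-text survey answers per theme and then tallying them, can be sketched as follows. The theme names come from the slide; the +1 / 0 / -1 encoding, the `tally` helper, and the sample responses are illustrative assumptions, not the deck's actual method or data.

```python
from collections import Counter

# Themes listed on the slide; codes are assumed to be +1 (positive),
# 0 (neutral), and -1 (negative).
THEMES = [
    "Work Group Worth", "Transition Challenges", "Expected Outcomes",
    "Reactions to Organizational Change",
    "Reactions to Technology and Processes",
    "Anticipated Organizational Change",
]


def tally(coded_responses):
    """Aggregate (theme, code) pairs into per-theme counts and a net score.

    The net score (positives minus negatives) surfaces themes where
    sentiment skews one way, e.g. transition challenges flagged by users.
    """
    counts = {theme: Counter() for theme in THEMES}
    for theme, code in coded_responses:
        counts[theme][code] += 1
    return {
        theme: {"positive": c[1], "neutral": c[0], "negative": c[-1],
                "net": c[1] - c[-1]}
        for theme, c in counts.items()
    }


# Illustrative coded responses for one organization (hypothetical data):
sample = [
    ("Transition Challenges", -1),
    ("Transition Challenges", -1),
    ("Expected Outcomes", 1),
    ("Expected Outcomes", 0),
]
result = tally(sample)
```

A strongly negative net on a single theme, as with the resource-overloading concerns mentioned above, is the kind of signal that would prompt specific planning measures.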
  • Commentary: S