Retrospective: Vision Chart

Session 11: Wrapping Up

The Vision Chart project was completed successfully, though the result is different from what was originally proposed. The most notable change: there is no chart.

Goal: To create a chart of Cryptiquest’s progress toward the ideal state.

Objectives

Objective | Test
1. The chart shows current capabilities regarding the 9 attributes. | *Changed
2. The chart shows the ideal capabilities of the 9 attributes. | *Changed
3. The chart is placed under the About page on the company site. | *Changed

*Changed: There is no chart, but the information is still displayed in a list format, and the 9 attributes of the ideal state were broken out into 32 objectives.

Origin

This project was spurred during the Compass Definition project, specifically with the line:

Perhaps, ultimately, this “current state” to “future state” comparison will show a map of milestones – a long term road map that is not based on brands but company needs.

from Compass Definitions: Session 01: Project Kick-off

Before this passage, the notes refer to trying to assess feasibility. That’s what this project was about: gauging what capabilities are required for Cryptiquest to realize its vision. Cryptiquest’s strategy should reflect the accumulation of these new capabilities on the pathway toward the vision.

Need: To gauge Cryptiquest’s current capabilities in relation to the ideal state.

Proposed Ideal Situation Vs. Actuality

The proposed ideal situation was a chart that shows the various gaps remaining before Cryptiquest can realize its vision. Bonus points if the chart mapped out milestones or steps toward filling these gaps.

In actuality, the outcome produced a list of objectives that are required before Cryptiquest can realize its vision. These objectives are conveniently listed in a way that points out a pathway toward completing the vision – strategy can now focus on unlocking the next objective.

Production Plan vs. Actuality

Step | The Plan | Actuality
1 | Assess and/or modify the “9 attributes” for measurability | Assessed the 9 attributes, which led to 40+ objectives
2 | Define Cryptiquest’s current capability for each of the “9 attributes” | Assessed the objectives to reveal requirements, which led to the 32 objectives and 100+ capabilities
3 | Assess and plan presentation | Assessed and reduced the capabilities into 30 “capability packages”
4 | N/A | Assessed and designed presentation

Session Rundown

Session 1: Kick-off
This session was dedicated to assessing the need, goal, objectives, and plan. While the project seemed to be off to a good start, it would quickly come to light that not enough research had been done before committing to the production plan.

Session 2: Building Objectives
This session revealed the flaw in the production plan, with an unplanned list of objectives as its outcome. This was an incredibly long, multi-day “session.” The decision was made to keep it as one long session rather than multiple smaller sessions, and despite the heft of the copy, it does seem more appropriate to leave it lumped together.

Session 3: Assessing Objectives
This session was spent identifying all the requirements that would be needed to make the 40+ objectives a reality. There were about 36 requirements once the objectives were analyzed.

Session 4: Assessing Requirements
This session was dedicated to assessing the requirements for points of consolidation and further requirements (some elements had requirements of their own). When all was said and done, there were 100+ requirements, which became the capabilities that were needed.

Session 5: Assessing Capabilities
This session reworked the capabilities and requirements from the previous session and then sorted the objectives in order of priority. With objectives sorted in order, the capabilities associated with those objectives were also sorted, forming a loose algorithm of which capabilities should be obtained, when. These capabilities were then grouped by likeness to form “packages” resulting in about 16 items.

Session 6: More Capabilities
The capability packages generated in the previous session would only enable Cryptiquest to unlock the desired objectives, but not necessarily provide the tools needed to meet those objectives 100%. So the objectives were analyzed against the list of capabilities to look for gaps. All in all, there were 30 capabilities identified (nearly double the previous list).

Session 7: Designing the Chart
There was a serious attempt at creating a visual chart but ultimately it was determined that a chart would not make for an efficient presentation and a list was preferred.

Session 8: Developing the Chart
Despite the fact that the “chart” was out in favor of a “list,” the word “Chart” was still used, since that was the name of the project. (This relates to an Action Item generated during the Kick-off session for the Review Forum project: it was noted that project names should reflect the problem, not the solution, to avoid things like a “Vision Chart” project not ending with a chart.) At any rate, the two lists were drafted for review.

Session 9: Chart Review
Like all review sessions, this was a disaster. It was half review-of-steps-taken and half “empty promises”. The misnomer “vision chart” was still being used to describe the strategy. This should have been a sign that something was wrong. The developer was unsure how to mesh the original need with the new product – a step was missing.

Session 10: Implementation of Our Objectives
This session was dedicated to the first of two products, the easy one: the objectives. Overall, it was fairly straightforward, and in the end the “Our Objectives” page was implemented and launched on the company site.

Session 11: Wrapping Up
This session was somewhat of a compromise. On one side, there was a need to resolve the “vision chart is not a vision chart, but I keep calling it a vision chart” problem; on the other side, neither this project nor future projects could move forward without wrapping this up. Ultimately, the vision chart replaced the “Our Strategy” page on the company site, and to keep a hushed tone about it, the action was only briefly touched upon during this session, with focus pinned on meeting the project’s objectives.

Issues

  1. The amount of work assessed in the kick-off session was grossly underestimated.
  2. The developer lost steam during the review phase.
  3. This project’s name was set before the goal was determined, and the products ended up not reflecting the project name.

Issues Addressed

  1. There was not enough research done before committing to the production plan; suggest adding a research phase between the compass statements and the production plan.
  2. During the review phase, the intentions were brought into question, and the developer second-guessed everything and became confused. Ultimately, there was no concrete “this product solves this need,” since the product strayed from the original need.
  3. Look into changing names of projects from reflecting the proposed solution to reflecting the initial problem. This may mitigate the frequency of projects ending with different products than the project name suggests.

Action Items

Add the following to the project management documentation:

  • To address ISSUE 1, add a research phase between objectives and project plan.
  • To address ISSUE 2, there needs to be a need-analysis phase before the review phase where the product is scrutinized to answer the following questions:
    • What problem does this solve?
    • Who is the target audience?
    • Why are they the target audience?
  • To address ISSUE 3, there needs to be a project naming convention that focuses on need rather than solution.
  • The design process for new tools must include:
    1. A “collaboration assessment phase” to consider in what ways the new tool and old tools can collaborate and how they must change in order to do so.
    2. A “creator concept-to-launch assessment phase” to consider in what ways the new tool will guide creators from concept to launch.
    3. A “share and collaborate assessment phase” to consider in what ways the new tool will enable creators to share and/or collaborate on projects.
    4. An “accessibility assessment phase” to consider in what ways the new tool might abandon users based on ability.
    5. A “game format phase” to consider in what ways the new tool fits into all supported formats.
    6. An “engine integration phase” to consider in what ways content created by the new tool will come to life in the engine.
  • The following types of feature requests will have equally high priority:
    1. Tool collaboration
    2. Process guidance
    3. Collaboration and sharing
    4. Accessibility
    5. Game format
    6. Game integration
  • The design process for new media must include a “tool usage” phase to consider which tools should be used for the project.