Session 06: More Capabilities

Hi there! Welcome to this session of the Vision Chart project, where I am creating a chart to measure Cryptiquest’s progress toward meeting its vision. In the last session, I revealed a list of milestones that would eventually activate each of the objectives used to measure that progress. In this session, I will debate whether that’s enough to get us to the Vision Chart or not.

Let’s start with a debate. I currently have a workable stepping path pointing out the capabilities required to unlock company objectives. But unlocking the objectives does not mean they are met. The crux of this debate: do I continue researching what capabilities are needed in order to meet the objectives?

An example of what I’m referring to is one I used in an earlier session of this project: a goal is to unlock and inspire creators from virtually everywhere. If that’s the case, then there’s a need to reach non-English speakers, and while the milestone list I drafted in the last session includes research for obtaining language data, it doesn’t mention anything about obtaining translations. Obtaining translations is a huge endeavor, and it’s necessary for meeting the vision.

Also, a lot of the objectives refer to obtaining __% of a standard, and the percentage was left blank on purpose since I don’t know exactly what should be considered realistic. Should those percentages be addressed now? They seem like qualifiers for more milestones.

On the other hand, by the time those objectives are activated and I need to worry about meeting them, the state of the world may be completely different, since it will take years, perhaps many, many years to get that far. Is it necessary to get into those details now?

The answer to that question is “Maybe?” The deciding factor is this: how do I measure the success of my projects if I don’t have measurable objectives? And how can I have measurable objectives if they have a blank marker to measure against?

The first objective that will be activated is the following: {4} Objective 1.A.4: User expectation regarding ease of use for each tool is met at least __% of the time. So assume I have some tools (setting crafter, character crafter, etc.) and an optimistic 1000 emphatic users; 100 of them provide feedback about their experience, and 10 of those responses are negative. What math should I use to measure the objective? Ten negative responses out of 100 is a much higher percentage than ten negative responses out of 1000 users.
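
To make that concrete, here’s the quick arithmetic for both denominators. The numbers are just the hypothetical example above, not real data:

```python
# Hypothetical numbers from the example above -- not real data.
total_users = 1000         # optimistic user base
feedback_responses = 100   # users who provided feedback
negative_responses = 10    # negative responses among that feedback

# Measured against feedback providers: 10 / 100 = 10% negative (90% "met")
rate_vs_feedback = negative_responses / feedback_responses

# Measured against all users: 10 / 1000 = 1% negative (99% "met")
rate_vs_all_users = negative_responses / total_users

print(f"{rate_vs_feedback:.0%} vs. {rate_vs_all_users:.0%}")  # 10% vs. 1%
```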

That’s one aspect: how to measure the number. The other aspect is how to measure success. Assuming the latter measurement (10/1000) is the better one (I don’t think it is, but let’s assume it is), that result would pass if the objective were set at 99% or lower. Perhaps the proper answer lies between the two, something like: no single feature request or problem is raised by more than 1% of feedback providers, and negative feedback amounts to fewer than 1% of total users. This would be measured per version release (with the percentages reset upon each new release).
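
Just to see what that combined rule might look like in practice, here’s a minimal sketch that checks it for a single release. The function name, data shape, and 1% thresholds are my own placeholders pulled from the example above, not a settled spec:

```python
from collections import Counter

def objective_met(total_users: int, feedback_providers: int,
                  negative_topics: list[str], threshold: float = 0.01) -> bool:
    """Hypothetical check of the combined rule for one release.

    negative_topics holds one label per negative response, naming the
    feature request or problem it raises (an assumed data shape).
    """
    # Rule 1: no single feature request/problem exceeds 1% of feedback providers.
    topic_counts = Counter(negative_topics)
    biggest_topic_share = max(topic_counts.values(), default=0) / max(feedback_providers, 1)

    # Rule 2: negative feedback overall stays below 1% of total users.
    negative_share = len(negative_topics) / max(total_users, 1)

    return biggest_topic_share <= threshold and negative_share < threshold

# The running example: 1000 users, 100 feedback providers, 10 negative responses,
# all about the same (made-up) problem. One issue hits 10% of providers, so it fails.
print(objective_met(1000, 100, ["export bug"] * 10))  # False
```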

Coming up with an answer for the blank percentage is easy (albeit tedious), but only because it wouldn’t be based on any real data, and that’s the problem. Perhaps the answer is that once an objective is activated, there’s a grace period to collect data. The data is still collected and measured, but success isn’t judged against arbitrary mandates.
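
Purely as a sketch of that grace-period idea (the 90-day window, field names, and scoring rule below are placeholders I made up, not a decided policy), the bookkeeping could look something like this:

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class Objective:
    """An activated objective that collects data before it is ever scored."""
    name: str
    activated_on: date
    grace_period: timedelta = timedelta(days=90)   # placeholder length
    target: float | None = None                    # stays blank until real data sets it
    measurements: list[float] = field(default_factory=list)

    def record(self, value: float) -> None:
        # Always collect measurements, even during the grace period.
        self.measurements.append(value)

    def status(self, today: date) -> str:
        in_grace = today < self.activated_on + self.grace_period
        if in_grace or self.target is None:
            return "collecting data (no pass/fail yet)"
        latest = self.measurements[-1] if self.measurements else 0.0
        return "met" if latest >= self.target else "not met"
```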

Okay. So what does this mean for the Vision Chart? Can it be called a “Vision Chart” if it only leads to unlocking the objectives that measure vision progress? I don’t see why I can’t keep working out these objectives, not to come up with numbers but to come up with the “capabilities” required to meet those numbers.

More Capabilities

So, the capabilities discovered so far were required in order to activate and measure the objectives; now I’ll look at what capabilities are required in order to meet those objectives in full. I’ll tackle this objective by objective, and for the sake of my sanity, I’ll only include NEW capability packages.

{1} Objective 1.A.1: User expectation regarding the suite of tools working seamlessly together is met 100% of the time.
{2} Objective 1.A.2: User expectation regarding process guidance (from concept-to-launch) for each tool is met at least 100% of the time.
{3} Objective 1.A.3: User expectation regarding sharing and collaboration content for each tool is met at least 100% of the time.
{4} Objective 1.A.4: User expectation regarding ease of use for each tool is met at least 100% of the time.
Requirements:

  1. Tool Upgrade Team

{5} Objective 1.A.5: User expectation regarding game engine format is met at least 100% of the time.
{6} Objective 1.A.6: User expectation regarding game engine integration is met at least 100% of the time. 
Requirements:

  1. Engine Upgrade Team

{7} Objective 1.B.1: Tools are translated to support at least 100% of users, globally.
Requirements:

  1. Translation Team

{8} Objective 1.B.2: Solutions are discovered to provide tool access to at least 100% of communities, globally.
Requirements:

  1. Under-privileged Outreach

{9} Objective 1.B.3: Solutions are discovered to provide tool access to at least 100% of regions, globally.
Requirements:

  1. Regional Barrier Team

{10} Objective 1.B.4: Solutions are discovered to provide tool access to at least 100% of disenfranchised groups, globally.
Requirements:

  1. Disenfranchisement Researchers

{11} Objective 2.A.1: Media reviews report 100% positive on inclusivity.
{12} Objective 2.A.2 Media reviews report 100% positive on uniqueness.
{13} Objective 2.A.3 Media reviews report 100% positive on consistency.
{14} Objective 2.A.4 Media reviews report 100% positive on entertainment.
{15} Objective 2.A.5 Media reviews report 100% positive on quality.
{16} Objective 2.A.6 Media reviews report 100% positive on concision.
{17} Objective 3.A.1 Tools and media production follow project management standards.
{18} Objective 3.A.2 Tools and media production use Cryptiquest tools (as they are available).
{19} Objective 3.A.3 Tools and media follow draft and version protocols.
{20} Objective 3.A.4 Tools and media follow branding and legal guidelines.
Requirements:

  1. Quality Assurance Team

{21} Objective 4.A.1 More than four brands are profitable.
Requirements:

  1. Sales Team

{22} Objective 4.A.2 Brands are popular among at least 100% of creators, globally.
{23} Objective 4.A.3 The brands span at least three different media types.
Requirements:

  1. No new capabilities needed

{24} Objective 5.A.1 At least 100% of users/employees report that Cryptiquest is an honest company.
Requirements:

  1. Audit Team

{25} Objective 5.A.2 At least 100% of users/employees report that Cryptiquest treats its employees well.
Requirements:

  1. Union

{26} Objective 5.A.3 At least 100% of users/employees consider employees of Cryptiquest to be industry experts. 
Requirements:

  1. Knowledge Development Programs

{27} Objective 5.A.4 At least 100% of users/employees report having a meaningful relationship with Cryptiquest, its brands, or its products. 
Requirements:

  1. No new capabilities needed

{28} Objective 5.A.5 At least 100% of users/employees consider partners of Cryptiquest to be industry experts. 
Requirements:

  1. Partner Reputation Metrics

{29} Objective 5.B.1 Third party reviews of the company are positive 100% of the time.
Requirements:

  1. No new capabilities needed

{30} Objective 6.A.1 Cryptiquest is returned on the first page of results 100% of the time when users search for targeted keywords.
Requirements:

  1. SEO Team

{31} Objective 6.B.1 Cryptiquest is present at 100% of industry events, annually.
Requirements:

  1. Event Team

{32} Objective 6.B.2 At least 100% of users who go to industry events report seeing us at industry events.
Requirements:

  1. No new capabilities needed

Compiling the old capabilities together with the new, I have the following list, which will lead Cryptiquest to its vision:

  1. Project Management Package
  2. Guide Extension
  3. Guide Creation
  4. Tool Extension
  5. Feedback Extension
  6. Tool Creation
  7. Concept-to-Publish Suite
  8. Game Engine
  9. Media Extension
  10. Media Creation
  11. Public Outreach Extension
  12. Event Attendance
  13. Digital Extension
  14. Community Bridge
  15. Global Outreach
  16. Humanistic Reputation
  17. Tool Upgrade Faculty
  18. Engine Upgrade Faculty
  19. Translation Faculty
  20. Under-privileged Outreach Faculty
  21. Regional Barrier Researchers Faculty
  22. Disenfranchisement Researchers Faculty
  23. Quality Assurance Faculty
  24. Sales Faculty
  25. Audit Faculty
  26. Union
  27. Knowledge Development Programs
  28. Partner Reputation Metrics
  29. SEO Faculty
  30. Event Faculty

That is an overwhelming list coming from a guy with a laptop and little else. But I guess I’ll just take this one step at a time. This may need some tidying up but I’m confident that this is a stronger list than what was presented in the last session.

The next step? Producing the Vision Chart! That’s going to finally happen in the next session (I think). Either way, I’ll see you there!

Action Items

  • None

Session 05: Assessing Capabilities
Session 07: Designing The Chart