Session 02: Building Objectives

Session 01: Kick-off
Session 03: Assessing Objectives

Hey, hey! Welcome to this session of the Vision Chart project, where I’m creating a chart that measures Cryptiquest’s progress toward its ideal state. In the last session, the project was researched and planned; this session focuses on the production work outlined there. It may take more than one session to resolve. Let’s get started!

To begin, let’s take a look at the proposed production steps and work from that:

  1. Assess and/or modify the “9 attributes” for measurability.
  2. Define Cryptiquest’s current capability for each of the “9 attributes.”
  3. Assess and plan presentation.

Assessing the 9 Attributes


Here are the nine attributes identified in the ideal state:

Cryptiquest’s Tools

  1. Our suite of tools works seamlessly together to guide creators from concept-to-design-to-production-to-release and includes collaboration and sharing functionality.

  2. Our suite of tools includes a game engine and tools in multiple formats, including 3D and paper, which is used to create games and bring content to life.

  3. Our suite of tools is accessible to creators of any and every identity, regardless of language, ability, class, region, or societal barriers.

Cryptiquest’s Media

  1. Our media highlights how our tools work, showing the draft process using the tools (when applicable).

  2. Spread among three media types, we have more than four brands that are profitable.

  3. All media we have generated upholds our core values.

Cryptiquest’s Image

  1. Our reputation, which is supported through positive third party reviews and industry awards, is that of an honest company known for treating its employees well and having a meaningful relationship with its audience.

  2. We have prominent presence as our target audience can find us easily when looking for us, whether that be via online search tools or at major industry events.

  3. We have employed and partnered with industry experts and “thought leaders” and are seen as a trusted source of information.

Right off the bat, it’s obvious that this is going to take far more work than identified in the kick-off session. There are not nine attributes – there are nine statements with multiple attributes! Ha. But let’s start from the first section and work our way down, identifying each measurable objective.

Sorting the “Attributes” by Facts

Here are the statements, disassembled into their factual elements, listed in outline fashion:

  1. Cryptiquest has created a suite of tools.
    1. The suite of tools works seamlessly together.
    2. The suite of tools guides creators through the creation process (from concept to release).
    3. The suite of tools includes collaboration and sharing functionality.
    4. The suite of tools is accessible to creators of any and every identity.
      1. Regardless of language
      2. Regardless of ability
      3. Regardless of class
      4. Regardless of region
      5. Regardless of societal barriers
    5. The suite of tools includes a game engine.
      1. The game engine is created in multiple formats (including 3D and paper).
      2. The game engine is used to create games and bring content to life.
  2. Cryptiquest has generated media.
    1. The media highlights how the suite of tools works.
    2. The media shows the draft process (when applicable).
    3. The media upholds core values.
  3. Cryptiquest has created brands.
    1. More than four of the brands are profitable.
    2. The brands span at least three different media types.
  4. Cryptiquest has a reputation.
    1. The reputation is supported through positive third party reviews.
    2. The reputation is supported through industry awards.
    3. The reputation is that of an honest company.
    4. The reputation is that of a company that treats its employees well.
    5. The reputation is that of a company that has a meaningful relationship with its audience.
  5. Cryptiquest has presence.
    1. Cryptiquest is found when the target audience makes industry-related searches online.
    2. Cryptiquest is prominent at major industry events.
  6. Cryptiquest has expertise.
    1. Cryptiquest employs and partners with industry experts.
    2. Cryptiquest employs and partners with “thought leaders.”
    3. Cryptiquest is seen as a trusted source of industry-related information.

Analyzing the Factual Statements

(First of all, there are clearly more than nine attributes here. What a gross underestimate of work. I’ll have to take this up with the planning department. Ha.)

Seriously now, there is something that jumps out when the content is formatted in this way. Specifically, there are six prominent segments:

  1. Suite of Tools
  2. Media
  3. Brands
  4. Reputation
  5. Presence
  6. Expertise

Ultimately, the quality of these six segments defines the state of Cryptiquest; instead of “nine attributes” there are “six segments.”

Now I have to go through each factual statement to determine if they are measurable and make them measurable if they are not. Essentially, these will result in the objectives that can measure Cryptiquest’s progress in each segment.


Building the Objectives


1. Suite of Tools

1.1. The suite of tools works seamlessly together.

How will this be measured? How seamless is “seamlessly”?

This is tough to measure without knowing all the tools that would be manufactured, but based on the ideas already started for CQ StoryHammer (there is a genre crafter, setting crafter, character crafter, plot crafter, and other tools for crafting story and RPG elements), I think “seamless” means that the tools would utilize the same building blocks, sharing crafting tools when possible, and “snap” together where appropriate.

Ultimately, regardless of how objectively seamless the tools are designed to be, this will be a subjective call – which means it will depend on user feedback. If, for example, a user signals that they wish there were a way for the equipment creation tool to link to the character crafting tool in order to fashion some sort of magical birth defect (admittedly a weird case, but not outside the realm of fictional possibility), then that would immediately need to be addressed, researched, and added somewhere to the production queue. So how does this become a measurable objective?

  1. The design process for new tools must include a “collaboration assessment phase” to consider in what ways the new tool and old tools can collaborate and how they must change in order to do so.
  2. Feature requests regarding tool collaboration have high priority (save for urgent projects).

While these declarations help identify solutions to the objective, they are not the objective itself. In fact, I’ll create an action item to add them both for project management documentation. So then what is the measurable objective?

1.1. User expectation regarding the suite of tools working seamlessly together is met at least __% of the time.

How this data is collected might be through surveys or help inquiries or feature requests. The percentage might be capability-dependent. How exactly to measure this will have to be something addressed in a future state of Cryptiquest (once tools are being used by creators).
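As a rough sketch of how a “user expectation” objective might eventually be computed, here is the arithmetic in miniature. The data shape (a list of feedback records with a met/not-met flag) and the sample values are purely hypothetical – as noted above, the real source might be surveys, help inquiries, or feature requests.

```python
# Hypothetical sketch: computing a "user expectation met" percentage
# from feedback records. The record structure and sample data are
# invented for illustration -- no real data source exists yet.

def expectation_met_rate(feedback):
    """Return the percentage of feedback items where expectations were met."""
    if not feedback:
        return None  # no data yet -- the tools have no users
    met = sum(1 for item in feedback if item["met_expectation"])
    return 100 * met / len(feedback)

sample = [
    {"tool": "setting crafter", "met_expectation": True},
    {"tool": "character crafter", "met_expectation": True},
    {"tool": "plot crafter", "met_expectation": False},
    {"tool": "genre crafter", "met_expectation": True},
]
print(expectation_met_rate(sample))  # 75.0
```

The same calculation would apply to each of the “user expectation” objectives below; only the feedback source would differ.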

1.2 The suite of tools guides creators through the creation process (from concept to release).

Learning from all the research done in the previous factoid, let’s conclude that there should be action items to try to incorporate this in design phases of projects, that importance is placed on implementing a fix if users signify that they are missing something, and that the objective is going to be a measurement of user feedback.

  1. The design process for new tools must include a “creator concept-to-launch assessment phase” to consider in what ways the new tool will guide creators from concept to launch.
  2. Feature requests regarding process guidance have high priority (save for urgent projects).

1.2. User expectation regarding process guidance (from concept-to-launch) for each tool is met at least __% of the time.

Again, this will have to be addressed once Cryptiquest has tools and users.

1.3 The suite of tools includes collaboration and sharing functionality.

I’m tempted to make this a binary true/false statement (either the functionality is there or it isn’t). But perhaps this is better measured on whether collaboration and sharing are considered during each phase.

  1. The design process for new tools must include a “share and collaborate assessment phase” to consider in what ways the new tool will enable creators to share and/or collaborate on projects.
  2. Feature requests regarding collaboration and sharing functionality have high priority (save for urgent projects).

1.3 User expectation regarding sharing and collaboration functionality for each tool is met at least __% of the time.

1.4 The suite of tools is accessible to creators of any and every identity.

1.4.1 …Regardless of language

The pattern that had been used up to this point for converting factoids into objectives will not work here.

How do you objectively measure this? Do you have a checklist of every single living language and mark the ones that get translations as you go? Or perhaps it’s not about ticking off languages like stamping some lingual version of a passport.

Maybe it’s more about coverage? For instance, according to the EF English Proficiency Index (EF EPI), Sweden scores “Very High Proficiency” in English comprehension, which would reduce (not eliminate) the priority for translating the tools into Swedish. Obviously, this suggestion isn’t supported by expertise, but basing this on “coverage” with a goal of 100% should yield the same result.

1.4.1 Tools are translated to support at least __% of users, globally.
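To make the “coverage” idea concrete, here is a minimal sketch: given an assumed distribution of users (or prospective users) by primary language, what share would be served by the current set of translations? The language shares below are invented numbers, not real data.

```python
# Hypothetical sketch of the "coverage" style of objective: the share
# of users whose primary language has a translation. All figures are
# invented for illustration.

def language_coverage(user_share_by_language, supported_languages):
    """Percent of users whose primary language has a translation."""
    covered = sum(share for lang, share in user_share_by_language.items()
                  if lang in supported_languages)
    total = sum(user_share_by_language.values())
    return 100 * covered / total

user_share = {"English": 45, "Spanish": 20, "Mandarin": 25, "Swedish": 10}
supported = {"English", "Spanish"}
print(language_coverage(user_share, supported))  # 65.0
```

The same shape would work for the region and class objectives below, swapping languages for communities or regions.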

1.4.2 …Regardless of ability

The idea of “coverage” is one thing when it comes to language but when it comes to ability, it doesn’t translate well (no pun intended). People might find their own way to translate the documents but a person with color blindness might not even know there was something missing if the tool was designed without considering this factor.

But how does this get measured? My hunch is that this can follow the pattern used in previous objectives to try to mitigate failures, expedite failure fixes, and base success on feedback.

  1. The design process for new tools must include an “accessibility assessment phase” to consider in what ways the new tool might abandon users based on ability.
  2. Feature requests regarding accessibility have high priority (save for urgent projects).

1.4.2 User expectation regarding ease of use for each tool is met at least __% of the time.

1.4.3 …Regardless of class

The barrier to which this factoid is referring is financial – but it’s more than that. I am no expert but from what I understand, folks in impoverished areas may have limited-to-no access to information (no money for internet or devices or electricity, for example). It’s not just a matter of making tools free to meet this objective, it’s also a matter of actively seeking ways to bridge tools to communities isolated from access.

So how is this measured? Every community is different so one solution isn’t going to meet all demands. I can’t imagine this being viable without working with a non-profit that shares similar goals but has expertise in finding and working with impoverished communities.

The “user expectation” style of objective will not work here since a missing audience can’t provide feedback. The “coverage” style of objective might work here if there is a focus on how to reach communities based on income? Again, I’m not an expert and I don’t need to be one right now. But the idea is that for Cryptiquest’s vision to be realized, measures will need to be put in place to ensure that people aren’t cut off from tools due to the limitations of class.

1.4.3 Solutions are discovered to provide tool access to at least __% of communities, globally.

1.4.4 …Regardless of region

This is similar to the previous factoid though on more of a macro-scale. Some regions, nations, and other areas will have barriers due to cultural or legal reasons. The idea is to get everyone access to the tools by crossing cultural barriers while not breaking laws or sacrificing Cryptiquest values.

1.4.4 Solutions are discovered to provide tool access to at least __% of regions, globally.

1.4.5 …Regardless of societal barriers

This seems like a big catch-all for everything that hadn’t already been covered. And that’s not untrue but it’s more than that. While I could list out what I think this might cover (gender, race, ethnicity, etc.), that list wouldn’t be comprehensive since these barriers are different depending on the community which is being addressed. The citizens of Shanghai, China might face different societal barriers than those in Concórdia, Brazil (I know nothing about these cities except that they are antipodal).

I’m not exactly sure how to handle this one. I mean, my instinct is to say that I should use the “coverage style” of objective but I don’t know what data source exists that shows disenfranchised groups within each community by societal barriers. Perhaps, in an ideal situation, Cryptiquest would fund or work with an organization to study this very thing. I’ll write the objective for now and like all the others, come up with the tool later. Ha.

1.4.5 Solutions are discovered to provide tool access to at least __% of disenfranchised groups, globally.

1.5 The suite of tools includes a game engine.

1.5.1 The game engine is created in multiple formats (including 3d and paper).

The intended purpose of making multiple formats of the game engine (and tools) was to provide options for users and open accessibility. This was not well thought out, however. Perhaps, the objective shouldn’t be about number of formats but ensuring the engine is versatile enough to meet accessibility and outreach demands. This may be solved with the “user expectation” style of objective.

  1. The design process for new tools must include a “game format phase” to consider in what ways the new tool fits into all supported formats.
  2. Feature requests regarding game format have high priority (save for urgent projects).

1.5.1 User expectation regarding game engine format is met at least __% of the time.

1.5.2 The game engine is used to create games and bring content to life.

Okay, so what is this trying to say? It sounds more like a description of what the engine is rather than a measurement. However, this might be converted into a “user expectation” style objective after all.

  1. The design process for new tools must include an “engine integration phase” to consider in what ways content created by the new tool will come to life in the engine.
  2. Feature requests regarding game integration have high priority (save for urgent projects).

1.5.2 User expectation regarding game engine integration is met at least __% of the time.

2. Media

2.1 The media highlights how the suite of tools works.

The idea here is that if popular media was generated using the tools then the media would serve as proof of concept as well as advertisement. In addition, the tools could then point users toward the media to potentially increase the audience. How is this measured? Should I force Cryptiquest media creators to use CQ tools 100% of the time for every process? I think they should follow project management standards but then they can choose which tools to use for the project during project planning.

  1. The design process for new media must include a “tool usage” phase to consider which tools should be used for the project.

2.1.1 Media follows Cryptiquest project management standards.

2.1.2 Media is generated using Cryptiquest tools (as they are available).

2.2 The media shows the draft process (when applicable).

The idea here is that drafts are revisions of a project and they can be educational to creators.

2.2 Media drafts are saved as versions.

2.3 The media upholds core values.

This one is going to expand to tackle each value (described in Cryptiquest’s value statements):

  1. Inclusive: Content must strive for representation while avoiding tokenism, stereotyping, and cultural appropriation.
  2. Original: Content must be unique and consistent to the world it creates.
  3. Quality: Content must be entertaining, professional, and understandable.

All of these are subjective and will rely on feedback. But how? If one thousand people download an episode of a podcast and are bored by it, would they be moved enough to report on that? Maybe it should be based on reviews. Also, “professionalism” is a bizarre way to measure media, isn’t it? I suppose I used that word because “quality” was already used. I’ll just call it quality. Here are the resulting objectives:

2.3.1 Media reviews report __% positive on inclusivity.

2.3.2 Media reviews report __% positive on uniqueness.

2.3.3 Media reviews report __% positive on consistency.

2.3.4 Media reviews report __% positive on entertainment.

2.3.5 Media reviews report __% positive on quality.

2.3.6 Media reviews report __% positive on concision.

3 Brands

3.1 More than four of the brands are profitable.

This one is already measurable: have fewer than five profitable brands? Vision not complete yet. However, I wonder how helpful this is. I suppose this builds in a mechanism for earning income and also taps into building brands that are successful, which, in turn, will help showcase the tools.

I’m not sure how to improve this objective. Perhaps it would be more helpful to the company if this was written relative to terms of coverage. Something like there are a number of brands that cover __% of population…?

3.1 Brands are profitable and popular for at least __% of target audiences, globally.

That one can definitely use some work. What am I trying to say? Brands are considered popular if they are profitable? But if we have five brands that are popular but only among a small group of people, it defeats the purpose. Maybe there should be two measures here: popularity vs. profitability.

3.1.1 More than four brands are profitable.

3.1.2 Brands are popular among at least __% of creators, globally.

3.2 The brands span at least three different media types.

This one also is measurable already. I don’t see any reason to tinker with it.

3.2 The brands span at least three different media types.

4 Reputation

4.1 The reputation is supported through positive third party reviews.

This one is kind of weird. How many reviews? Should it be a percentage? Will identifying and collecting total reviews be possible? What if a majority of reviews are positive but the negative ones garner more attention? Shouldn’t it be more about what people think about us? What sources should be considered “third party?”

There are other objectives that measure what people think about us; this one specifically deals with “third party reviews,” which will inevitably happen once the company grows. The goal is to measure success by how positive those reviews are. This serves to increase credibility, which should answer the “what should be considered third party” question: reputable sources.

4.1 Third party reviews of the company are positive __% of the time.

4.2 The reputation is supported through industry awards.

How can you plan for this sort of thing? Isn’t this kind of vain? I think this is one of those “act as if you’re up for an Oscar even if it’s for a deodorant commercial” type of thing. The thing is, if Cryptiquest achieved all of these objectives but this one, would that equate to a vision unfulfilled? No? I didn’t think so. This one is a skip.

4.2 Skip

4.3 The reputation is that of an honest company.

This one is definitely necessary. This one has to be feedback based.

4.3 __% of users consider Cryptiquest to be an honest company.

4.4 The reputation is that of a company that treats its employees well.

This one is similar except for one glaring omission.

4.4.1 __% of users think Cryptiquest treats its employees well.

4.4.2 __% of employees think Cryptiquest treats them well.

4.5 The reputation is that of a company that has a meaningful relationship with its audience.

Again, similar objective:

4.5 __% of users report having a meaningful relationship with Cryptiquest, its brands, or its products.

5 Presence

5.1 Cryptiquest is found when the target audience makes industry-related searches online.

My current thinking behind this is that there is a list of keywords that Cryptiquest desires to be discovered through and that success would be measured on what percentage of those terms returns Cryptiquest.

5.1 Cryptiquest is returned on the first page of results __% of the time when users search for targeted keywords.

I will eventually need to make a list of keywords, but that will be addressed with all the other requirements for the objectives in a future session.

5.2 Cryptiquest is prominent at major industry events.

This one is important though the definition of “prominent” might be difficult to pin down. This will also require a list of industry events.

5.2.1 Cryptiquest will be present at __% of industry events, annually.

5.2.2 __% of users will report seeing us at events.

I’m not exactly sure when or where these users will be surveyed for this information but that’s for me to worry about in the future.

6 Expertise

6.1 Cryptiquest employs and partners with industry experts.

This is another weird one. How do you objectively judge whether a person or organization is an industry expert or not? Would this be another “__% of users think…” type of objective? That feels like a cop-out. Maybe this should be an internal measurement, as well.

6.1.1 __% of users consider employees of Cryptiquest to be industry experts.

6.1.2 __% of users consider partners of Cryptiquest to be industry experts.

6.1.3 __% of employees consider employees of Cryptiquest to be industry experts.

6.1.4 __% of employees consider partners of Cryptiquest to be industry experts.

6.2 Cryptiquest employs and partners with “thought leaders.”

Same deal as above?

6.2.1 __% of users consider employees of Cryptiquest to be thought leaders.

6.2.2 __% of users consider partners of Cryptiquest to be thought leaders.

6.2.3 __% of employees consider employees of Cryptiquest to be thought leaders.

6.2.4 __% of employees consider partners of Cryptiquest to be thought leaders.

6.3 Cryptiquest is seen as a trusted source of industry-related information.

Okay. So this is the last one. In order to be an industry role model, you have to be seen as a trusted source of industry information. Not in a “here’s all the industry news” sense but in a “we know what we are talking about” sense. How does this get measured, and who should be “seeing” Cryptiquest in this way? If it’s users, I suppose it could be measured through surveys, but I don’t know how to measure other industry participants’ perceptions. Is it fair to say that if users view Cryptiquest this way, the industry would do so as well? I think so.

6.3 __% of users report Cryptiquest as a trusted source of industry-related information.


Wrap Up


Okay. So, 13 hours of work for the first pass at building objectives for measuring Cryptiquest’s success. The next steps will involve assessing this list of objectives as a whole, refining them, and determining the tools and requirements for them.

I’ll deal with these action items before I do that though. Once that’s done, I’ll see you in the next session.


Action Items


  1. Add “underestimate of work” to the issue log in Retrospective.
  2. Add the following to the project management documentation:
    • The design process for new tools must include:
      1. A “collaboration assessment phase” to consider in what ways the new tool and old tools can collaborate and how they must change in order to do so.
      2. A “creator concept-to-launch assessment phase” to consider in what ways the new tool will guide creators from concept to launch.
      3. A “share and collaborate assessment phase” to consider in what ways the new tool will enable creators to share and/or collaborate on projects.
      4. An “accessibility assessment phase” to consider in what ways the new tool might abandon users based on ability.
      5. A “game format phase” to consider in what ways the new tool fits into all supported formats.
      6. An “engine integration phase” to consider in what ways content created by the new tool will come to life in the engine.
    • The following types of feature requests will have equally high priority:
      1. Tool collaboration
      2. Process guidance
      3. Collaboration and sharing
      4. Accessibility
      5. Game format
      6. Game integration
    • The design process for new media must include a “tool usage” phase to consider which tools should be used for the project.