
Testing yourself - Milestones in KM projects



When we begin a Knowledge Management project, we set goals and objectives. However, once the work routine kicks in, we occasionally forget why we embarked on the journey in the first place; at some point, the original reason may even become irrelevant. How can we make sure that we are on the way to attaining our goals, or that our goal is still the correct one? By performing essence, profitability and usability reviews as project milestones.


Essence, profitability and usability reviews are a sort of intra-organizational examination meant to test the nature and degree of use of current or future Knowledge Management tools (organizational portal, knowledge directory, insight database, professional desktop, etc.). These reviews are the "pulse meter" of KM tools: through them, the organization receives information on a tool's effectiveness, its user-friendliness and users' consumption habits, and can then make changes that improve the tool and adapt it to the organization's changing needs. For example, a review of an organizational portal can yield information on the number of visits to the site and which pages are viewed most. This information may lead to an "organizational campaign" that encourages use of the site, to updating the site, to deleting or adding pages, and so on.


Is it even worthwhile to perform essence, profitability and usability reviews? I suppose the immediate answer would be "obviously!" Despite the importance of these reviews, I tend to answer less decisively: it really depends. These reviews are essentially feedback from users regarding the Knowledge Management tools, and feedback, by its nature, can rouse "tacit resistance". It is tacit because when we perform essence, profitability and usability reviews we claim to want the feedback, yet when the moment of truth arrives it is difficult to accept criticism, especially when we have invested much effort in our work and are now asked to change our habits. The reviews will prove effective only if those performing them (as well as those who requested them) are open to listening, receiving the feedback and learning from it. If it is all a show, if it is only about doing what "looks right and must be done sometimes", I would pass.


So, before you continue reading, take a moment and think: are you and your organizational partners really open to receiving feedback, learning from it, and (even) making changes when required? If your answer is "yes", keep on reading; if your answer is "no", put this article away; you might need it in the future.


When will we perform essence, profitability and usability reviews?

Essence, profitability and usability reviews can assist us at each stage of a KM project or of ongoing routine. Each stage calls for a different review goal, and possibly a different review method, according to its setting:

  • Initiation stage: before we begin upgrading or setting up Knowledge Management tools, we want to map and define initial needs (an essence and profitability review). This type of review is sometimes referred to as a "scouting review", meant to lay the groundwork for the next phases. At this stage, the review leads to a KM solution tailored to the organization's needs and setting.

  • The functional analysis stage: after selecting a KM solution and starting the process, we must review further in order to collect in-depth information on the organization's needs and user profile and assemble the characterization (mapping target audiences, user preferences, system operation habits, etc.). After assembling the initial characterization, we should perform a usability review to validate it and ensure that it matches the business goals defined for the project as well as the organizational needs.

  • The implementation stage: this stage usually calls for a pilot (a type of essence, profitability and usability review) to ensure that the KM tool we implemented is indeed used and answers the needs it was meant to answer. We should also detect its strong and weak points and make the required changes.


How should we perform essence, profitability and usability reviews?

Essence, profitability and usability reviews are a type of intra-organizational examination meant to evaluate the degree and nature of use of KM tools. When performing a review, we are therefore better off using the conventional research methods:

  • Quantitative: a quantitative review involves collecting quantitative data and using statistical and probability methods to map, and even forecast, processes, trends and patterns.


The methods most commonly used for Knowledge Management tools are usage reports produced by the system (e.g. number of users, number of visits to a site/page, time spent on each page, most viewed pages, etc.) and multiple-choice surveys/questionnaires.
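To make these usage reports concrete, here is a minimal sketch in Python of how such metrics could be aggregated. The log format (records with a user, page and seconds spent) and the field names are assumptions made for the example, not any specific portal's export format.

```python
from collections import defaultdict

# Hypothetical page-view records, as a portal's usage log might export them.
# The field names and values here are assumptions for illustration only.
page_views = [
    {"user": "dana", "page": "/insights", "seconds": 42},
    {"user": "omer", "page": "/insights", "seconds": 15},
    {"user": "dana", "page": "/directory", "seconds": 8},
    {"user": "noa", "page": "/insights", "seconds": 63},
]

visits = defaultdict(int)      # number of visits per page
users = defaultdict(set)       # distinct users per page
time_spent = defaultdict(int)  # total seconds spent per page

for view in page_views:
    visits[view["page"]] += 1
    users[view["page"]].add(view["user"])
    time_spent[view["page"]] += view["seconds"]

# Report the pages from most to least viewed.
for page in sorted(visits, key=visits.get, reverse=True):
    avg_time = time_spent[page] / visits[page]
    print(f"{page}: {visits[page]} visits, "
          f"{len(users[page])} distinct users, {avg_time:.0f}s average on page")
```

A report like this answers the "how much" questions (visits, users, time on page); it says nothing yet about why users behave this way.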

  • Qualitative: a qualitative review does not rely on statistical analysis and numerical data; rather, it focuses on identifying central themes that typify human behavior (in our case, consumption habits and users' preferences regarding Knowledge Management tools). The most commonly used methods are focus groups, open-question surveys/questionnaires and personal interviews.


Using the quantitative method is easier than the qualitative method, both in process and in data analysis. However, the information that the quantitative method can produce from the data is limited: quantitative methods tell us "what" or "how much", yet provide no information about "why" or "how". Those questions require qualitative methods, which add "color" to the dry data. We must remember, though, that a qualitative method requires greater resources and its conclusions are more subjective. The two methods are therefore complementary; in many cases it is recommended to start with quantitative methods and then continue with qualitative methods to study the quantitative findings in depth. For example, we can obtain statistical data on the site pages viewed most and the pages hardly viewed (quantitative method), then hold interviews or focus groups to understand why users prefer the viewed pages over those they neglect (qualitative method). Combining the methods naturally depends on the availability of suitable resources.
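As a small illustration of that hand-off from quantitative to qualitative, the sketch below shortlists the most and least viewed pages as candidates for follow-up interviews or focus groups. The page names and visit counts are invented for the example.

```python
# Hypothetical monthly visit counts per portal page (the quantitative data).
visits = {
    "/insights": 480,
    "/directory": 310,
    "/templates": 45,
    "/archive": 12,
}

# Rank pages by visits, then shortlist both ends for qualitative follow-up:
# interviews or focus groups asking why the popular pages work and why the
# neglected ones are ignored.
ranked = sorted(visits.items(), key=lambda item: item[1], reverse=True)

most_viewed = [page for page, count in ranked[:2]]
least_viewed = [page for page, count in ranked[-2:]]

print("Ask users what works on:", most_viewed)
print("Ask users why they skip:", least_viewed)
```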


What method should we choose?

The selected method must suit the need that the review is meant to answer. If we wish to receive quantitative data, such as the number of visits or the most viewed pages, we should use statistical reports. However, if we want to "get inside the user's head", we should choose qualitative methods.


The selected method must suit the organization and the people performing the review. An organization that attributes great importance to quantitative data will probably not appreciate a qualitative review.


Furthermore, the review must be performed by professionals who know how to design the review and analyze the data well. Performing a quantitative review requires skills such as writing multiple-choice questions and an understanding of statistics; performing a qualitative review requires skills such as facilitating focus groups and conducting in-depth interviews.

The selected method must also fit within the resources allocated for the review, in both time and cost.


What do we do with all this data?

After producing the data, we must analyze it (each method has its own analysis techniques) and understand its meaning. The most important products of this process are operative recommendations, which must not be left on paper but rather implemented in the organization.


In conclusion, an essence, profitability and usability review, at any stage and using any method, will provide us with priceless information, if only we remain open to receiving feedback and new ideas.

I am not saying it is simple; sometimes we are so busy at work, so accustomed to a certain concept or method, that we cannot possibly think "outside the box". My experience has taught me that the most interesting and important feedback comes from the most unexpected sources…


 

