Date Published April 12, 2018 - Last Updated December 13, 2018
You have accepted the challenge to improve your knowledge management initiative. But before you charge in with your ideas and lay out your plan, assess the current state. Talk with the various stakeholders. Learn from them what they think is working well, what is not, and what ideas they have that will help you with your mission. Read the strategic plans and process documentation. These can help define the current objectives and expected results. And evaluate the current set of metrics and measurements. Discover the gap between how the organization is performing today and the desired level of performance, or your vision for the organization. Then ask yourself, “Am I looking at the right metrics?” It is possible that the organization’s metrics are ticket focused and were never updated with a knowledge perspective. What metrics and measurements do you need?
The answer to this question is “It depends.” It depends on the maturity of your knowledge management adoption. Are you focusing on creating the knowledge sharing culture and getting people to capture, improve, and reuse knowledge? Are you focusing on assisted service only? Or has the organization implemented self-service and you need to improve self-service success? It depends on the extent to which knowledge is being integrated into other service management processes beyond incident management and request fulfillment. How has knowledge impacted service level management, problem management, change management, and release and deployment management? And it depends on the business objectives. What was the business justification for investing in knowledge management?
There is no single metric that can represent the health of your knowledge management initiative. You will need to monitor and trend multiple metrics. Some metrics should have targeted goals. These are the lagging indicators or results. The leading indicators, or activities, are the metrics that can be used to predict changes to future results and allow you to make timely adjustments. All must be monitored and trended for expected and unexpected changes. It is the collection of the right metrics that can represent the health of your knowledge management initiative.
Business Impact
The reason you have been asked to improve the knowledge management initiative is to improve the business. Your primary objective is not to improve the quality of knowledge. Before you begin to focus on improving the knowledge processes and practices, you need to know why. Why is knowledge important to the business? When you first answer that question, ask why again and again and again until the answer ties to the business objectives. For example, when asked why you want to improve knowledge management, the answer given might be to improve knowledge sharing. You need to ask why again.
- Why? To improve the quality of knowledge
- Why? To improve the first contact resolution rate
- Why? To reduce the mean time to resolution
- Why? To improve customer satisfaction and productivity
- Why? To improve customer loyalty and reduce the cost per incident
- Why? To increase sales and improve profits
This technique is known as the Five Whys, though it is not limited to five and may not require you to ask why five times. The key is to keep asking until you reach the root cause or purpose of your objective.
Other stories might include:
- Improve employee performance, to improve employee satisfaction, to reduce employee turnover, to reduce new hire costs, to improve profits.
- Reduce the average handle time, to improve analyst capacity, to lower cost per incident, to improve profits.
- Increase self-service, to reduce call volume, to reduce abandon rate, to improve customer satisfaction.
Once you know your stories, then you have begun to identify several metrics that are important for you to measure and report on to your stakeholders.
Assisted Service
Most knowledge management initiatives begin by focusing on improving assisted service. This is where knowledge is captured, improved, and reused within service management processes like incident management. Support professionals use existing knowledge to resolve incidents that are known to the organization, where known implies that the organization has previously resolved the incident and captured the issue and resolution as a knowledge asset. If the incident is unknown, then the support professionals must work to develop a resolution. Once they resolve the incident for the customer, they need to capture the new knowledge as a knowledge article for future reuse.
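To make that loop concrete, here is a minimal sketch of the decision a support professional makes on each ticket. The dictionary-based knowledge base and the develop_resolution callback are hypothetical stand-ins for real service management tooling, not any product's API.

```python
# Minimal model of the loop: reuse existing knowledge for a known incident,
# or develop a resolution and capture it as an article for an unknown one.
knowledge_base = {}  # hypothetical store mapping an issue to its resolution

def resolve_incident(issue, develop_resolution):
    if issue in knowledge_base:
        # Known incident: the organization resolved this before, so reuse it.
        return knowledge_base[issue], "reused"
    # Unknown incident: develop a resolution, then capture it for future reuse.
    resolution = develop_resolution(issue)
    knowledge_base[issue] = resolution
    return resolution, "created"

fix = lambda issue: "Update the VPN client"
print(resolve_incident("VPN drops hourly", fix)[1])  # created
print(resolve_incident("VPN drops hourly", fix)[1])  # reused
```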
The most common metrics organizations implement for monitoring the knowledge process are the following (a sketch of the calculations appears after the list):
- Link rate (or participation rate). The percent of tickets closed with knowledge either reused or created. This implies that the proper process was followed.
- Create rate. The percent of tickets closed where a new knowledge article was created.
- Reuse rate. The percent of tickets closed where existing knowledge was used.
- Modify rate. The percent of tickets closed using existing knowledge where the support professional either updated the article or added comments to improve it.
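As a rough illustration, here is how these rates might be computed from closed-ticket data. The ticket records and the values of the knowledge field are hypothetical; real field names depend on your ITSM tool, and the denominator for the modify rate is a judgment call.

```python
# Hypothetical closed-ticket records; the "knowledge" field notes whether an
# article was created, reused as-is, reused and modified, or not linked at all.
tickets = [
    {"id": 1, "knowledge": "reused"},
    {"id": 2, "knowledge": "created"},
    {"id": 3, "knowledge": "modified"},  # reused an article and improved it
    {"id": 4, "knowledge": None},        # closed without knowledge
]

total = len(tickets)
linked   = sum(t["knowledge"] is not None for t in tickets)
created  = sum(t["knowledge"] == "created" for t in tickets)
reused   = sum(t["knowledge"] in ("reused", "modified") for t in tickets)
modified = sum(t["knowledge"] == "modified" for t in tickets)

print(f"Link rate:   {linked / total:.0%}")    # 75%
print(f"Create rate: {created / total:.0%}")   # 25%
print(f"Reuse rate:  {reused / total:.0%}")    # 50%
# Modify rate here uses all closed tickets as the denominator; some
# organizations divide by reused tickets instead.
print(f"Modify rate: {modified / total:.0%}")  # 25%
```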
These activity metrics are a start, but they are not good enough on their own. Focusing on the quantity of knowledge articles can fill the knowledge base with garbage, and linking for the sake of linking produces bad data. Activity metrics need to be complemented with quality metrics from the quality assurance processes:
- Knowledge quality (or article quality index). The result of a knowledge monitoring process in which trained coaches review random samples of contributed knowledge articles against a set of knowledge quality criteria. A score is calculated for each person for a defined period, and the coaches provide the knowledge authors with feedback to improve their skills.
- Link quality (or link accuracy). The percent of tickets where the appropriate knowledge article was reused. This metric comes from a ticket monitoring process in which coaches review a random sample of closed tickets against a set of ticket quality criteria. When knowledge is integrated into the process, the ticket criteria must be updated to check that the proper article was linked to the ticket.
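A knowledge quality score might be computed along these lines. The criteria names and the sample review data below are hypothetical examples; substitute your organization's own knowledge quality standard.

```python
# Hypothetical quality criteria; substitute your organization's own standard.
criteria = ["issue_clear", "resolution_complete", "correct_template", "findable_title"]

def article_quality_index(reviews):
    """Average percent of criteria met across the sampled articles."""
    scores = [sum(review[c] for c in criteria) / len(criteria) for review in reviews]
    return sum(scores) / len(scores)

# Coach reviews of a random sample (1 = criterion met, 0 = not met).
sample = [
    {"issue_clear": 1, "resolution_complete": 1, "correct_template": 1, "findable_title": 0},
    {"issue_clear": 1, "resolution_complete": 0, "correct_template": 1, "findable_title": 1},
]
print(f"Article quality index: {article_quality_index(sample):.0%}")  # 75%
```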
If you’re implementing the Knowledge-Centered Service (KCS) methodology, then you can also evaluate the competency profile. These metrics define the percent of staff who have earned the different competency levels, or roles: Candidate, Contributor, and Publisher.
The combination of activity metrics and quality metrics can be used to monitor individual performance and team performance. When trended and monitored along with the business impact metrics, a correlation can then be identified to show the value of adopting a knowledge sharing culture.
Self-Service
When your organization begins offering customers direct access to knowledge, new metrics are needed. Some of these will be self-service metrics, while others relate to assisted service. The following assisted service metrics are important to monitor; a sketch of their calculation appears after the list.
- Time to publish. The average time from when a knowledge article is created until it is available via self-service. The objective is to reduce this time and allow customers access to new knowledge as quickly as possible.
- Percent published. The percent of knowledge articles available via self-service. The KCS v6 Practices Guide promotes that 90% of new knowledge should be made available to self-service immediately.
- Known ratio. The percent of tickets resolved using existing knowledge. This is the theoretical maximum percentage of ticket volume that could potentially be resolved via self-service. A shift-left strategy will strive to move more knowledge articles to self-service.
- Level zero solvable. The percent of tickets resolved on first contact using knowledge articles that are published to self-service. This is the percent of ticket volume that could be moved to self-service if customer adoption increased. Tickets that were not resolved on first contact may imply a findability issue: even though the article is published to self-service, whether customers can find it is in question.
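Assuming article and ticket records expose creation, publication, and resolution data, these four metrics might be computed as follows. All field names and figures below are illustrative, not a real tool's schema.

```python
from datetime import datetime

# Hypothetical article records: when each was created and published
# (None means the article is internal only, not in self-service).
articles = [
    {"created": datetime(2018, 4, 1), "published": datetime(2018, 4, 2)},
    {"created": datetime(2018, 4, 3), "published": datetime(2018, 4, 3)},
    {"created": datetime(2018, 4, 5), "published": None},
]

published = [a for a in articles if a["published"] is not None]
days = [(a["published"] - a["created"]).days for a in published]
print(f"Time to publish:   {sum(days) / len(days):.1f} days")      # 0.5 days
print(f"Percent published: {len(published) / len(articles):.0%}")  # 67%

# Hypothetical ticket records: whether existing knowledge resolved the
# ticket, whether that article is in self-service, and first-contact status.
tickets = [
    {"used_knowledge": True,  "article_in_self_service": True,  "first_contact": True},
    {"used_knowledge": True,  "article_in_self_service": False, "first_contact": True},
    {"used_knowledge": False, "article_in_self_service": False, "first_contact": False},
    {"used_knowledge": True,  "article_in_self_service": True,  "first_contact": False},
]

known = sum(t["used_knowledge"] for t in tickets)
level_zero = sum(t["used_knowledge"] and t["article_in_self_service"]
                 and t["first_contact"] for t in tickets)
print(f"Known ratio:         {known / len(tickets):.0%}")       # 75%
print(f"Level zero solvable: {level_zero / len(tickets):.0%}")  # 25%
```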
Measuring self-service success is not as easy as measuring assisted service. Assisted service focuses on metrics related to tickets, which represent the sum of all work if your organization logs every request for assistance as a ticket. Tickets are managed, closed, and easily counted. Success and failure are known and can be analyzed.
In self-service, customers may or may not log in to access the knowledge base. They are most likely accessing it through a browser, and closing a browser is not an indication of success or failure. Research has shown that only a very small percentage of customers will complete a survey after reading a knowledge article. If they find knowledge that is helpful, they simply close the browser or navigate away from the page. Given this environment, what metrics can help evaluate the success of self-service?
- Unique sessions. The number of visits to your knowledge base. A trend of this metric can indicate the change in use of your self-service portal, positive or negative.
- Web tickets (from self-service). The number of tickets created by customers while using the self-service portal. This can indicate that customers cannot find helpful knowledge articles. The ratio of web tickets to unique sessions indicates a minimum self-service failure rate. If 40% of sessions result in a web ticket, then at least 40% of self-service attempts did not end with the customer finding helpful knowledge. The results of the other 60% are unknown: you do not know whether those customers found what they were looking for or simply stopped looking.
- Self-service success rate. Since you cannot determine this directly, some organizations ask during a periodic customer survey: “If you have used the self-service knowledge base, what percent of your visits resulted in finding helpful knowledge?” The answers are averaged to calculate a self-service success rate. Customers are guessing, so the accuracy is questionable, but having this metric is still very valuable.
- Ticket deflection (or self-service tickets). The number of tickets not created through assisted service due to self-service success. Total tickets deflected equals unique sessions times the self-service success rate, as in the sketch below. In addition to the tickets managed by the support organization, you should take credit for this deflection, because knowledge was delivered to satisfy a customer need.
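Here is a worked example of the web ticket ratio, the survey-based success rate, and the deflection formula above, using entirely hypothetical numbers.

```python
# All values hypothetical: sessions and web tickets from web analytics,
# the success rate from a periodic customer survey.
unique_sessions = 10_000
web_tickets = 4_000
self_service_success_rate = 0.35

minimum_failure_rate = web_tickets / unique_sessions
tickets_deflected = unique_sessions * self_service_success_rate

print(f"Minimum self-service failure rate: {minimum_failure_rate:.0%}")  # 40%
print(f"Estimated tickets deflected: {tickets_deflected:,.0f}")          # 3,500
```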
Another indication of self-service success is its impact on assisted service. If the primary changes in the environment relate to self-service availability and promotion, then the correlation likely reflects causation. Expect the following shifts; a simple trending sketch follows the list.
- Ticket volume should decrease as self-service unique sessions increase.
- First contact resolution rate should decrease as the easy-to-resolve tickets using knowledge in assisted service are shifted to customer self-service success.
- Average handle time should increase as the easy- and quick-to-resolve tickets using knowledge shift to self-service, leaving a high percentage of challenging tickets for assisted service.
- Known ratio will decrease as known issues shift to self-service.
- Level zero solvable will decrease as known issues shift to self-service.
- Cost per ticket will increase as the ticket volume decreases and the complexity of tickets resolved increases.
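One simple way to trend these relationships is to correlate monthly self-service sessions against an assisted-service metric. The monthly figures below are hypothetical; a strong negative correlation supports, but does not prove, the causal story.

```python
from statistics import correlation  # available in Python 3.10+

# Hypothetical monthly figures as self-service adoption grows.
monthly_sessions = [2_000, 3_500, 5_000, 6_500, 8_000, 10_000]
monthly_tickets  = [4_800, 4_600, 4_300, 4_100, 3_900, 3_600]

r = correlation(monthly_sessions, monthly_tickets)
print(f"Sessions vs. ticket volume: r = {r:.2f}")  # close to -1.0
```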
Implementing a self-service knowledge base can have tremendous value for customers and the organization. Unfortunately, there are no metrics that directly and accurately measure this value.
Product Improvement
The value of knowledge is not limited to assisted service and self-service. As the knowledge sharing culture matures and the link rate increases, the data about how knowledge articles are reused becomes more reliable. By evaluating that reuse, problems can be detected, leading to product and service improvements. The number of problems detected and reported to problem management is a new metric related to proactive problem management.
The list of metrics shared here is not meant to be exhaustive. It is meant to serve as a starting point. You need to measure and trend the right metrics to determine how to improve your knowledge management initiative. These metrics can indicate where further investigation and improvement are needed, as well as where there is evidence of success that can be celebrated and expanded.
Rick Joslin has more than 30 years of information technology experience. He has led software development teams and technical support organizations and has provided consulting services to several organizations. Rick has more than 20 years of experience in knowledge management and is recognized internationally as an expert in KCS. Rick holds a BS in computer science and multiple certifications from HDI, the KCS Academy, and AXELOS. He served as HDI’s Executive Director of Certification and Training for 10 years and is currently a 2018 Featured Contributor for HDI. Connect with Rick on LinkedIn.