Once a month I get together for breakfast with the Knowledge Directors from several organizations in the Toronto area. We have wonderful, far-ranging discussions about knowledge management, social networking and business innovation. One of the topics that came up last month was the various risks of having knowledge (theft, destruction, misuse, violation of customer confidentiality, violation of intellectual property laws, etc.). That got me thinking about the opposite risk, the risk that “you don’t know what you don’t know”: the risk, and cost, of not knowing.
What are the consequences of operating a business with incomplete and imperfect information?
If it were possible to precisely quantify this ‘cost of not knowing’ (it isn’t), there would be some break-even point (see chart above) at which the cost of not knowing equals the cost (and risk) of having knowledge, and that would determine exactly how much knowledge your company should acquire, make available, and deploy. Although it’s impossible to be this precise, many organizations would benefit from being a little more disciplined in assessing the costs, and risks, of having vs. not having knowledge, so they at least get it approximately right.
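To make the break-even idea concrete, here is a purely illustrative sketch. The cost curves below are invented for demonstration (the article is clear that real figures can’t be precisely quantified): the cost of not knowing is assumed to fall as knowledge investment grows, the cost of acquiring and managing knowledge is assumed to rise, and the break-even point is wherever the two curves cross.

```python
# Toy model of the break-even point between the cost of not knowing
# and the cost of having knowledge. The curves are hypothetical,
# chosen only to illustrate the shape of the trade-off.

def cost_of_not_knowing(k):
    """Assumed: risk/cost of ignorance falls as knowledge investment k grows."""
    return 100.0 - 8.0 * k

def cost_of_having_knowledge(k):
    """Assumed: acquisition, management and security costs rise with k."""
    return 20.0 + 2.0 * k

def break_even(lo=0.0, hi=20.0, steps=1000):
    """Scan the investment range and return the k where the curves are closest."""
    best_k, best_gap = lo, float("inf")
    for i in range(steps + 1):
        k = lo + (hi - lo) * i / steps
        gap = abs(cost_of_not_knowing(k) - cost_of_having_knowledge(k))
        if gap < best_gap:
            best_k, best_gap = k, gap
    return best_k

print(break_even())  # with these made-up curves, the crossover is at k = 8.0
```

The point isn’t the arithmetic; it’s that an organization in a high-stakes, knowledge-intensive industry has a steeper cost-of-not-knowing curve, which pushes the break-even point to the right and justifies a larger knowledge investment.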
Once the right quantum of knowledge has been decided on, and processes are in place to acquire and deploy it, knowledge managers need to monitor its quality, value, timeliness and use. This task of content management goes on at two levels: at the organizational level, for centrally managed content, and at the individual ‘desktop’ level, for personally managed content. Content management entails the following measurements, assessments and interventions:
Under the guidance of Colin McFarlane, Ernst & Young recently pioneered a Content Rationalization program to solve the problems of stale, obsolete, and hard-to-find information. By examining each database and Intranet website, and talking to users, his team categorized all content into the four quadrants of this 2×2 chart:
E&Y has also developed two documents, which every employee must sign, that help ensure the security, effective use and confidentiality of the firm’s content. The Appropriate Use Policy document outlines what is considered proper use of the firm’s knowledge and technology tools, with a focus on security and integrity. The Knowledge Sharing Agreement categorizes all firm knowledge into five types, ranging from strictly confidential (no sharing allowed) to open-use (unlimited sharing inside and outside the firm), with specific examples of each. It also carefully explains the trade-off between protecting client-confidential information and the obligation to share knowledge as broadly as possible.
But back to the issue of not knowing. In the breakfasts of our Toronto Knowledge Directors group, we’ve concluded that the break-even point in the top chart above, the point at which the cost and risk of not knowing drops below the cost and risk of acquiring and managing a lot of content, falls at different points in different organizations, and that this break-even point is a function of both the organization’s industry and its knowledge culture.
As an example, half of our members had decided not to deploy instant messaging technologies in their organizations, because the perceived risk of misuse, hacking or leaks of sensitive information was too high. But for the other half of our members, the critical need for constant consultation within the organization, the need to get objective second opinions on critical judgements made in every assignment, resulted in the decision to deploy IM, because the risks of misuse, hacking and leaks were deemed lower than the risks of insufficient consultation. In these companies, the risk of not knowing was recognized as being high, moving the break-even point to the right and justifying both the cost of deploying IM and the cost of ensuring its security.
As with everything else in KM, there’s no one right answer, no ‘best practice’ that applies to everyone.
Our group would be interested in knowing how other organizations assess the costs and risks of not knowing, how they rationalize content, and how they assess and address the content problems in the table above. If you’re aware of how your organization handles these issues, we’d love to have a conversation with you.
And I wonder whether governments have formal processes for assessing the costs and risks of not knowing. I’m sure some weapons inspectors would be curious about that, too.