What Voting Needs: KM and Better Design

Voting on Tuesday was an unforgettable experience. We were welcomed at the door of the neighborhood elementary school by a cheerful poll worker who wished us a good morning, ushered us into the building and then carefully directed us to the school cafeteria where the voting booths were located.

That was the last time we received clear and easy-to-follow instructions.

Now don’t get me wrong. Everyone was polite and kind. A few were even downright jolly. But some poll workers — and many voters — clearly were a little confused.

What was so confusing?

  • There were several bits of paper to process and lists to check before I was handed a ballot.
  • The lines for the voting booths and the scanning stations snaked around the room haphazardly and generally seemed disorganized.
  • The path for the voter was neither clear nor direct. There was lots of bobbing and weaving as we tried to stay out of each other’s way in the process of stumbling from one step to the next.
  • The ballot itself was long and involved. I’m a native English speaker and have been reasonably well educated, but I had to pay attention in order to complete the ballot properly. What happened to voters with a more tenuous grasp of English?
  • The process was paper-intensive, yet it resulted in a digital output. So why so much paper?
  • There were lots of rules, but they seemed extraneous to the core job of completing a ballot and scanning it. Nonetheless, the poll workers were diligent in enforcing rules they probably would be hard pressed to explain (much less justify).

Now that I’ve had a few days to think about it, I really can’t blame the poll workers. After all, it wasn’t as if they were doing something they had done a hundred times before or even within the last year. To be honest, if you asked me to do something once every four years, I’m not sure I’d get it right every time. When you have a process like this that is infrequent, but must be carried out reliably in a consistent fashion, you have a process that is in desperate need of a well-documented practice guide. In fact, the knowledge management professional in me was dying to offer to stand there, observe how they worked, find a little positive deviance, and then write up a practice guide that they could use later to prepare for a better voting experience in the 2016 election.

Later I discovered that I wasn’t the only one who viewed the voting experience through the particular lens of their own profession. If you look at commentary in the user experience community, you’ll find no shortage of criticism of the poor design that resulted in a suboptimal voting user experience in several places, including New York and Chicago. To be fair, voting presents a significant usability challenge. As Whitney Quesenbery observes:

Voting may be one of the most difficult usability challenges because it is a task completed by virtually anyone, it is done infrequently, it is never exactly the same because the actual ballot differs for each election, and privacy requirements make it difficult for voters to seek help in using the voting system.

Voting on Tuesday was unforgettable. The experience of standing peacefully next to our neighbors to exercise our rights as citizens is something we should never take for granted. That said, I’m in the innovation and improvement business and can’t help seeing opportunities to make the experience better for voters and poll workers alike. In my humble opinion a little more attention to design and knowledge management could have vastly improved the voting user experience.

So let me end with a question for you: Are the processes within your organization well-designed and supported by helpful practice guides or do they resemble the voting user experience?

[Photo Credit: League of Women Voters of California]


We All Need Training

Do you know how to wash your hands? Now, before you complain about bloggers who ask dumb questions, let me rephrase that question slightly: Do you know how to wash your hands properly? Chances are you don’t.

This issue arose when I found myself getting frustrated by restaurants that piously posted signs in restrooms instructing employees to wash their hands carefully, yet those same restaurants refused to provide hot water for hand washing.  How on earth could that be hygienic? This set me down the path of learning more about hand washing. Although I’m a scrupulous hand washer, I soon discovered that I had a lot to learn about the mechanics of hand washing. Among the things I learned are the following:

  • While hot water is nice, it’s not necessary. If you were serious about using water temperature to blitz the bacteria on your hands, the water would have to be scalding hot.
  • The key to effective hand washing is friction — it’s the rubbing of one soapy hand against the other that dislodges the oil that holds the dirt and bacteria on your skin.
  • According to the Centers for Disease Control and Prevention, you must scrub your hands for at least 20 seconds to clean them properly. How long is 20 seconds? The time it takes to sing “Happy Birthday to You” twice.

So clearly, even after a lifetime of diligent hand washing, I need to go back to hand washing school. What about you?

Now even if you do better than I do on the hand washing test, how are your hand drying skills?  (I can hear you asking yourself, is she crazy? How hard can it be to dry your own hands?) Bear with me a moment.  Even if you know the basics of how to make wet hands dry, do you know the best method for every context?  For example, what’s the best way to dry your hands if you’re trying to keep your hands germ free? What’s better: a cloth towel, a paper towel or one of those jet air dryers?  (Hint: it may not be the jet air dryer.)

What if you only have paper towels to dry with? Doesn’t that damage the environment? Is there a way to dry your hands and protect the environment? It took an entertaining TED talk by Joe Smith to show me how to dry my hands without ever needing more than a single paper towel.

This foray through hand washing and drying is intended to illustrate a larger point. If we still have much to learn about tasks we’ve performed nearly every day of our lives, why do we believe we don’t need ongoing training for the tasks we perform at work? Technology changes, contexts vary, best practices improve.  Are you confident that you have learned and incorporated the latest training into your work?  If not, why not?

The next time you wash and dry your hands, consider what other areas of your life could benefit from a refresher course.  We all need training.

[Photo Credit: Patrick J. Lynch]


Fixing the Weak Link

He thought they were going to have a quiet dinner, but she arrived apoplectic.  Some of the folks who worked with her had dropped the ball on a project that was routine and should have been foolproof.

She thought they were going to have a quiet dinner, but he arrived apoplectic.  Someone had asked for his help on a project, but repeatedly failed to send him the necessary documents — even when those documents were mentioned in the transmittal note.

This couple is headed for high blood pressure problems or, at the very least, indigestion.  I suspect they are not the only ones.

Why do simple things get messed up? Look for the weak link.  In the first instance, the weak link was between two parts of the organization that were handling a job together.  Because it was a collective effort, nobody felt responsible: everyone was (theoretically) responsible.  In the second case, the weak link was the person who transmitted the documents carelessly.

Fixing the weak link is tough because by the time you confront it, you’re often in a towering rage.  So, the first step is to sleep on it.  If that’s not possible, at least count to 10 before commencing. Then, take a look at the procedure surrounding the weak link.  In the first case, each part of the organization had a checklist for handling their part of the process.  However, someone failed to follow the checklist. And the organization had not created a checklist to cover the handoff.  This handoff checklist could have acted as a secondary check, another chance to catch an error before it developed into a real problem. In the second case, the problem arose in…the handoff between the person sending the documents and the recipient.  They clearly did not have a checklist that the first person could follow to ensure that all relevant materials were sent to the recipient when promised.

Peter Bregman believes that problems arise in the handoff phase because of poor communication:

Most of us think we communicate well. Which, ironically, is why we often leave out important information (we believe others already know it). Or fail to be specific about something (we think others already understand it). Or resist clarifying (we don’t want to insult other people).

To address this problem, Bregman recommends that we develop and use a handoff checklist along the following lines:

Handoff Checklist

  • What do you understand the priorities to be?
  • What concerns or ideas do you have that have not already been mentioned?
  • What are your key next steps, and by when do you plan to accomplish them?
  • What do you need from me in order to be successful?
  • Are there any key contingencies we should plan for now?
  • When will we next check-in on progress/issues?
  • Who else needs to know our plans, and how will we communicate them?

Time it takes to go through the checklist? One to five minutes. Time (and trust) saved by going through the checklist? Immeasurable.
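For teams that track their handoffs in a shared tool or script, the checklist above lends itself to a very simple structure. Here is a minimal sketch in Python — the questions are Bregman’s, but the data structure and the `run_checklist` function are purely my own illustration, not anything he prescribes:

```python
# Bregman's handoff checklist captured as a reusable list of questions.
HANDOFF_CHECKLIST = [
    "What do you understand the priorities to be?",
    "What concerns or ideas do you have that have not already been mentioned?",
    "What are your key next steps, and by when do you plan to accomplish them?",
    "What do you need from me in order to be successful?",
    "Are there any key contingencies we should plan for now?",
    "When will we next check-in on progress/issues?",
    "Who else needs to know our plans, and how will we communicate them?",
]

def run_checklist(questions, answers):
    """Pair each question with its answer and flag anything left unanswered."""
    record = {}
    for question in questions:
        answer = answers.get(question, "").strip()
        record[question] = answer or "UNANSWERED"
    open_items = [q for q, a in record.items() if a == "UNANSWERED"]
    return record, open_items
```

The point of the sketch is the discipline it encodes: a handoff is complete only when `open_items` is empty, and anything left unanswered goes back to the sender rather than being silently dropped.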

If you’re tempted to ignore the need for a handoff checklist or a checklist of any sort, take a few minutes to read this collection of sad (and in some cases, scary) stories of what happens when people fail to create or follow checklists.  If you want to learn more about checklists, read my prior post, The Value of Checklists.

At the end of the day, it’s hard to escape the conclusion that taking the time to develop and follow a checklist that addresses the weak link can save lives, save time and possibly save you from high blood pressure and indigestion.


When Knowledge Management Saves Lives

If there are days when you doubt the value of knowledge management, take a closer look at Project ECHO:  it saves lives by sharing specialist knowledge from teaching hospitals with a wide network of primary care physicians in far-flung areas.  As a result, the patients in those areas get the benefit of cutting edge medical treatment without having to travel hundreds of miles to academic centers.

Founded by Dr. Sanjeev Arora and his colleagues at the University of New Mexico, Extension for Community Healthcare Outcomes (Project ECHO) has become a shining model for innovative medical practices and for KM.  Here’s the back story:

In 2003, nearly 30,000 New Mexicans were infected with Hepatitis C, yet only 5 percent were able to access treatment, which is available almost exclusively through specialists at the University of New Mexico (UNM) in Albuquerque. The plight of these underserved patients inspired Sanjeev Arora, one of the top Hep C specialists in the country, to develop a plan to deliver state of the art treatment to these communities through Project ECHO (Extension for Community Healthcare Outcomes).

Project ECHO creates a one-to-many “knowledge network” of specialists and … rural providers, who meet by videoconference to co-manage specific patients and share two-way teachings in which the ECHO staff works with remote clinics to coordinate and educate. Sanjeev calls this aspect of ECHO the “workforce multiplier.”  Through the “knowledge networks” of the clinics, specialists co-manage patients and teach rural medical professionals to be mini-specialists, to whom patients from that area are increasingly referred. This eventually saturates the state with the ability to treat Hep C and also helps deconstruct stereotypes and prejudices that often have existed between specialists and providers.

By pushing the ability to treat chronic, complex diseases down the work chain, ECHO is not only bringing specialized treatment to thousands of patients who would have otherwise gone untreated, but it is also keeping remote providers where they are most needed. Retention rates for rural medical professionals in New Mexico are notoriously low, and Sanjeev’s work is changing this by empowering isolated providers with stimulating, practical, cost-effective continuing education.

The key components of Project ECHO are:

  • Use technology to leverage and share scarce specialist knowledge through knowledge networks
  • Create best practice protocols for treating complex diseases and then share the protocols with primary care clinicians
  • Have specialists in academic centers mentor physicians in rural areas using the same case-based learning these doctors experienced in medical school
    • Through videoconferences, groups of rural physicians hold “virtual rounds” in which they present cases and collaborate with academic and rural colleagues to identify the best course of treatment
    • These sessions build communities of practice and facilitate knowledge sharing, thereby spreading expertise across the state
  • Use the internet to track outcomes in order to have the metrics necessary to establish the program’s ROI
  • Combine knowledge sharing, mentoring and technology to act as a “force multiplier” for the delivery of high-quality services

While you may not have the responsibility for saving lives in your daily work, Project ECHO is a wonderful reminder that smart KM together with good technology can have a transformative effect.  Remember that on the days when you find yourself struggling with KM skeptics.


Here are some brief videos that will tell you more about the impressive work of Project ECHO:

Project ECHO: Spreading Access to Quality Healthcare:

Project ECHO:

TEDMED Q&A with Dr. Sanjeev Arora, Project ECHO Director:


The Value of Checklists

Let’s start with the premise that you’re fantastic. In fact, you’re well-trained, experienced and routinely exhibit good judgment. So, do you need a checklist? Ask a pilot or a surgeon.  Surgeon Atul Gawande did exactly that and learned some interesting — and sobering — things.  In a recent interview, he discussed his latest book, The Checklist Manifesto: How to Get Things Right, which recounts his exploration of the value of checklists.  Time and again, he found that checklists were an effective antidote to ignorance, uncertainty and complexity.  He and his team developed a two-minute checklist that covered some basics for surgery (e.g., do we have enough blood and antibiotics?), as well as some basics for good teamwork (e.g., does everyone in the Operating Room know the name of each person in the room?).  They then tested these lists in eight different hospitals.  The results were stunning.  For example, when they took the time to make introductions, they had a 35% decline in deaths and complications related to surgery.

Creating checklists for routine procedures makes sense.  They allow you to act quickly and confidently.  Creating checklists for complex situations is even more important, since these are precisely the times when you are most beset by uncertainty and may not even know what you don’t know.  In these cases, it’s helpful to have a checklist that can pin down facts and eliminate areas of concern.

After the trial period in eight hospitals, 80% of the surgeons involved said they would continue to use the checklist.  Interestingly, 20% remained resistant.  They believed that the checklists were a waste of time and didn’t add value. However, when asked whether they would want their surgeon to use the checklist if they were having an operation, 94% of those resisters said they would.

So why are professionals resistant to checklists?  Atul Gawande thinks that this is because experts have a hard time admitting their own fallibility.  There are also experts (be they lawyers or knowledge managers) who approach their work as “artistes.”  Therefore, they believe their creative output cannot be reduced to a dry checklist.  Finally, there are the thousands of us who race through our days just struggling to get things done.  In the press of business, it is hard to take the time to stop and reflect on what works and what doesn’t.  It’s harder still to take the time to document it.  Tragically, when an error or accident happens, we are forced to stop and think about what went wrong.  Under those circumstances, the analysis is charged, value-laden and painful for all concerned.

Is there a two-minute checklist you could develop this week that might help strengthen your work flow or work product?  If so, can you afford not to make the investment of time required to create that checklist?

[Photo Credit:  Adam Sacco]


Dating a Beautiful Model

Some folks aspire to be as beautiful as a model, while other folks aspire to have a model on their arm. Lawyers, by contrast, aspire to have a collection of models. Model DOCUMENTS, that is.  And, as long as lawyers want model documents, law firm knowledge management personnel are going to try to find ways to provide them.  But, should they?

As mentioned in my earlier post, KM’s Worst Enemy, model documents represent a massive investment for a firm because it is very hard to throw a model document together overnight. If you’re going to do it correctly, you’ll have to spend time and effort to create a model that meets both your practice quality and risk management needs. Ideally, model document drafting will incorporate the experience of several lawyers in the relevant practice.  This means that law firm KM personnel must recruit and retain lawyers to help with the drafting.

At a recent meeting of law firm knowledge managers, I asked how many of them had successfully recruited under-utilized lawyers in their firm to update their collection of models.  The responses were consistent and discouraging.  Even when lawyers have a lighter billable workload, they tend to be disinclined to assist with drafting or updating model documents.  The solution for some firms has been to recruit practice support lawyers who work on a nonbillable basis to generate these materials.  However, this approach has its own challenges and, to do it correctly, you may well need a team of support lawyers who have expertise in a wide range of practice areas.  The solution for other firms has been to obtain model documents from traditional legal publishers or subscribe to the resources offered by practice support companies such as the Practical Law Company.**  PLC takes care of the drafting and updating, which is a huge improvement over what many firms can do for themselves.  Each subscribing firm then trains its lawyers to use these materials in a manner that is appropriate for that firm’s practice and clients.  And, of course, that firm has to pay a subscription fee for the service.

Before thinking about generating models internally or obtaining them externally, it would be worth examining further how many models you really need.  Many firms assume that a model constitutes a statement of best practices and, therefore, the more models you have the better.  A recent interchange on best practices with Tom Young of Knoco sharpened my understanding of the concept.  For the purposes of this discussion, I’d draw your attention to his concepts of standardization and innovation.  In applying this to the law firm context, I wonder whether we would be wiser to concentrate on creating (or obtaining) model documents only for those instances where it is imperative that we ensure standardization.  In all other cases, would it be a better use of firm resources to produce a practice guide, checklist or issues list rather than a full-blown model document?  Until you consider these questions in the context of your firm’s practice, you may find yourself frustrated or disappointed as you try to find new and creative ways to coax your colleagues into creating models.

Dating a beautiful model may be your dream, but in a law firm it comes at a price.



**Disclosure:  I’m a member of the Practical Law Company’s Advisory Board.

[Photo Credit:  UltimateGraphics http://www.flickr.com/photos/29956195@N08/ / CC BY-NC 2.0 ]


Resting on Your Laurels Ruins Best Practices

Yesterday’s post, Just Tell Me What Works, discussed some of the weaknesses arising from a blind obsession with best practices. The chief weakness is the false belief that someone else’s solution will work perfectly for you.  But what if you avoid that weakness and actually do the hard work of thinking for yourself in order to create a definitive statement of a best practice in your context?  Are you done? As Joe Firestone reminded me today, unfortunately not.

Once you’ve successfully created a best practices (or next practices) document, it’s tempting to breathe a big sigh of relief, celebrate your accomplishment and then rest on your laurels.  With the passage of enough time, however, you end up with a moth-eaten collection of practices that are interesting primarily for historical purposes.  Your much vaunted “best practices” are now woefully out of date and may even be dangerous from a risk management perspective.

So what’s the solution?  Under the old model, you would ask the chief author of each best practices document to assume responsibility for updating the document as necessary.  Unfortunately, busy schedules (and uninterested authors) can make this difficult.  Yet, we’ve persisted in pursuing this model because it allowed the author to maintain control over a resource that was considered too important to have distributed authorship.  So, you focus on perfect control and get imperfect content.

A related problem with this approach has been identified by Joe Firestone and Steven Cavaleri as a gap between the claims of a best practices document and its track record.  Have those best practices been tested?  Have they passed the test?  If so, is that reflected in the record?  If not, what improvements are necessary?  To do this effectively, you need different folks interacting with the best practices document over time and reporting their results.  This can be done through the imaginative use of Enterprise 2.0 tools (e.g., collaborative tags and annotations), but it does require a willingness to relinquish a measure of control.

For best practice documents and all the other “solutions” promoted by the firm, Firestone and Cavaleri advocate building a living and breathing knowledge base that provides current information and promotes innovation:

For flexibility and variety, the real knowledge bases we have in mind, ought to be distributed, rather than centralized, and Enterprise 2.0 and 3.0 technology including tagging, annotating, and mashups, and new semantic web applications, should be applied to create both a new and richer layer of meaning and integration across stove pipes. To be effective in creating high quality knowledge bases that will be most useful in enhancing thinking up new ideas, social computing technology must be applied both collaboratively, and in a way that includes all ideas, no matter how new and untested they are. The rule should be to let the knowledge base reflect the track record of performance of ideas comprising solutions, or the absence of such a track record, and leave it up to people to factor that into their own creative thinking.

At the end of the day, identifying best practices is the first rather than the final step.  You then have to test them regularly for currency.  If you give into the temptation to rest on your laurels, you’ll quickly turn those best practices documents into quaint historical artifacts.  Now, please explain to me how that helps your firm manage risk.

[Photo Credit:  Elizabeth Thomsen]


Just Tell Me What Works!

Sometimes we just want to be told what to do. To be honest, we all have days when that seems far preferable to thinking for ourselves. Unfortunately, it’s exactly this temptation that has led us to make a fetish of “best practices” in knowledge management.  However, we would do ourselves a great favor if we were more candid about the real value of best practices.

In his October Newsletter, David Gurteen includes a great piece entitled On Best Practice and Thinking for Yourself! In it he explains why slavishly following so-called “best practice” may not always be the right approach.  In fact, best practice may sometimes be illusory.  Best practices are, in theory, a wonderful thing.  After all, who wouldn’t want to know how the best and the brightest do something?  The problem is that the solution those exceptional folks have found works precisely because it is their solution.  It succeeds because it was created for their context and was carried out by them.  Unless you are operating under exactly the same circumstances (and with the same type of people), there is no guarantee that it will work equally well when you try to make it your solution.

The sources David Gurteen cites point to the true value of “best practices.”  That value doesn’t lie in having a foolproof recipe.  Rather, those “best practices” are most useful as examples of what can be done (rather than what must be done) to address a specific situation.  You could then take those examples and adapt them to the particularities of your situation.  Better yet, you should take those examples and use them as a launching point to spur some truly creative thinking on your part and devise a solution that is uniquely suited to your circumstances.  That creative thinking should lead you to Next Practices rather than Best Practices.  And, in so doing, help you to discover practices that will work more powerfully in your context.  Now, be honest — isn’t that the best practice for you?

[Photo Credit:  Joan Thewlis]


The Challenges of Fragmented Knowledge

In Dave Snowden’s view, “everything is fragmented.” And, he thinks this is a good thing. But it has some challenging implications for knowledge management generally and law firm knowledge management specifically.

Dave sets out his concept of fragmented knowledge in the May 2008 KM World Magazine in which he points to “the shift during the life span of knowledge management from the `chunked’ material of case studies and best-practice documents to the unstructured, fragmented and finely granular material that pervades the blogosphere.” He posits that the effort to structure, summarize, and corporatize information has in fact rooted the knowledge so deeply in specific circumstances that it limits the user’s ability to apply that material to other contexts as things change.

So what are the advantages of the fragmented approach to knowledge? First, Dave suggests that most people would rather seek the advice of several trusted colleagues than hunt through the company KM system for an applicable best practices document. In other words, by embracing fragmented knowledge we are working with rather than against natural tendencies. Second, he reports that his work in homeland security has demonstrated that “raw field intelligence has more utility over longer periods of time than intelligence reports written at a specific time and place.” In fact, unfiltered narrative accounts tend to pick up more “weak signals (those things that after the event you wished you had paid attention to) than analytical structured thinking.”

If Dave is right that people naturally tend to seek fragmented knowledge, what does that mean for traditional knowledge management? First, we have been focusing on the wrong things. We’ve been trying to heighten control over knowledge and remove ambiguity in a world in which the exchange of knowledge is increasingly uncontrolled and ambiguous. Further, we’ve been engaged in a fool’s errand: trying to anticipate all needs and then reflecting the applicable guidance in our KM content (which is a nearly impossible goal), rather than creating in our users “an attitude and capability of anticipatory awareness.”

In the world of fragmented knowledge, the individual must gather knowledge fragments at the point of need from a variety of informal sources (e.g., colleagues, blogs, wikis, etc.) and then blend that information on the fly to reach conclusions and take action. In the context of a law firm, this means that we have to rely on the ability of each lawyer to gather and appropriately analyze information from a wide variety of known and unknown sources, and then make the right decision for the client. They have to reinvent the wheel each time. From a risk management perspective, this is a little terrifying. From an efficiency perspective, it doesn’t make a lot of sense either. The beauty of best practices has been that they are a reflection of the collective wisdom of the firm and they point lawyers to action that is more likely than not to avoid harm for the client and the firm. Delegating this to individuals of varying levels of experience and judgment radically changes the risk exposure for the client and the firm.

If Dave is right that the world is increasingly one of fragmented knowledge, law firm knowledge managers are going to have to rethink the way they achieve their goals of improved client service and risk management.

Best Practice vs Next Practice

Mark Gould’s comment on my previous post (Not Quite) Best Practices pointed me to Derek Wenmoth’s blog post on Best Practice vs Next Practice. Derek makes the interesting observation that while best practice is a snapshot of what we know has worked well in the past, next practice is an attempt to take that prior experience and improve upon it rather than merely replicate it. This notion of next practice fits nicely with the Appreciative Inquiry approach to change. Here’s the money quote from Derek:

Best Practice asks “What is working?”, while Next Practice asks “What could work – more powerfully?”

Best practice has often functioned as a type of insurance policy: if you’ve followed best practices, who can criticize? However, the focus on next practice moves us out of the insurance policy nature of best practice into imagination and innovation. Very dangerous. And yet, so necessary.

Mark says that he might blog on this concept of next practice. I’m looking forward to reading his observations. In the meantime, thank you Mark and Derek for giving us a more nuanced way of thinking about best practices.