Thursday, 30 December 2010

Model 4: data, information, knowledge, wisdom

The model from Infovis below captures what I always struggle to explain to others: why the results of one project cannot simply be pushed onto other, adopting, groups.
The producers, the pilot projects, create the basic data and some information (patterns) about it. The consumers (the adopters) need to retest this in their own environment. Any pilot project that can be written up in a way that helps the consumer bridge the gap from information to knowledge will likely spread more successfully than others. This can include things like: you can adapt this in the following way; we did the following and it didn't work, but it may work in xyz circumstances; and so on.

Tuesday, 28 December 2010

Model 3: data, information, knowledge, wisdom

One of the QI refrains is "increase the capability and capacity of employees". While this is a great concept, easy to declare and impossible not to support, for me it lacks any concrete applicability. What exactly is meant by it? Another of our data-information-knowledge-wisdom models may help pin that down. Next time you hear somebody say the capacity/capability thing, whip this model out and ask them to explain their intentions and expectations along the data-to-wisdom curve.

The challenge here is to produce learning experiences that enable someone to move up the curve. In my experience, much of healthcare improvement work is focused on developing data-based skills: how to measure change. Some people get to the information stage, where they learn to look for patterns, say by using SPC charts. Can they port this knowledge to other projects in a predictable way? Can they make intelligent choices? To what extent do the participants on a QI project become "wise"?
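The pattern-spotting on an SPC chart boils down to simple arithmetic. As a rough illustration (not from the post, and simplified to the standard XmR individuals chart), the natural process limits and out-of-limit signals could be computed like this:

```python
# Hypothetical sketch of XmR (individuals) control chart limits,
# the kind of SPC arithmetic used to separate signal from noise.

def xmr_limits(values):
    """Return (mean, lower, upper) natural process limits for an XmR chart."""
    mean = sum(values) / len(values)
    # Average moving range between consecutive points
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    # 2.66 is the standard XmR constant (3 / d2, where d2 = 1.128 for n = 2)
    spread = 2.66 * avg_mr
    return mean, mean - spread, mean + spread

def signals(values):
    """Return indices of points falling outside the natural process limits."""
    mean, lower, upper = xmr_limits(values)
    return [i for i, v in enumerate(values) if v < lower or v > upper]

# Example: six stable months then a jump; only the jump is flagged.
print(signals([10, 12, 11, 13, 12, 11, 30]))  # -> [6]
```

Someone at the information stage can run this mechanically; the knowledge and wisdom stages are knowing when the rule applies, and what to do about the flagged point.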

The above curve comes from Designing Knowledge Eco-Systems for Communities of Practice. The web resources are excellent, especially if you are developing CoPs as part of your QI strategy.

Wednesday, 22 December 2010

Model 2: data, information, knowledge, wisdom

There is an excellent post about wisdom on one of my favourite sites - Big Dog and Little Dog's Performance Juxtaposition (yes, really!).  It's a place I recommend you spend some time checking out.

I like the way this model gets me thinking about how we learn; there are connections here to the Honey & Mumford learning styles. The fact that there is a continuum for context is also thought-provoking. This model has left me wondering whether, in many of our quality improvement projects, we focus too much on the bottom left-hand corner and assume the progression to wisdom will be automatic. What would happen if we thought more about, and designed more of, the journey to wisdom into our improvement interventions (and, by extension, into our evaluations of projects)?

Monday, 20 December 2010

Model 1: data, information, knowledge, wisdom

The old adage goes along the lines that knowledge is knowing a tomato is a fruit, and wisdom is knowing not to add a tomato to a fruit salad... There are a number of models and frameworks that investigate the data-information-knowledge-wisdom continuum, and in this series of posts I cover a few of them.

For the theorist, a good place to start is the online paper A Primer: Enterprise Wisdom Management and the Flow of Understanding.

I like the way environment and context come into play as important factors: they underline that knowledge and wisdom are contextual.

Friday, 17 December 2010

New Paper: What is the experience of national quality campaigns?

I liked the conclusion of this paper - "..may depend on.." - as it summarised my experience of national quality campaigns: the results depend on a multitude of factors, and I would add that they also depend on the perspective(s) of the stakeholders involved.

Health Serv Res. 2010 Dec;45(6 Pt 1):1651-69.

What is the experience of national quality campaigns? Views from the field.

OBJECTIVE: To identify key characteristics of a national quality campaign that participants viewed as effective, to understand mechanisms by which the campaign influenced hospital practices, and to elucidate contextual factors that modified the perceived influence of the campaign on hospital improvements.

CONCLUSIONS: The impact of national quality campaigns may depend on both campaign design features and on the internal environment of participating hospitals.

Wednesday, 15 December 2010

New Paper: Short and long term effects of a QI collaborative on diabetes mgt

I'm always on the lookout for evidence of the sustainability of QI projects. A new paper in Implementation Science has some thoughts on this, though I am not convinced one year counts as "long term".

Implement Sci. 2010 Nov 28;5(1):94. [Epub ahead of print]

Short- and long-term effects of a quality improvement collaborative on diabetes management.

INTRODUCTION: This study examined the short- and long-term effects of a quality improvement collaborative on patient outcomes, professional performance, and structural aspects of chronic care management of type 2 diabetes in an integrated care setting.
CONCLUSIONS: At a time of heightened national attention toward diabetes care, our results demonstrate a modest benefit of participation in a multi-institutional quality improvement collaborative focusing on integrated, patient-centered care. The effects persisted for at least 12 months after the intervention was completed. 

Monday, 13 December 2010

New Paper: How to use an article about quality improvement (JAMA Nov 2010)

One of the difficulties in spread and adoption is, on the one hand, avoiding the temptation to take the results from one project, do a back-of-the-envelope calculation, and announce that if the results were spread there would be x billion in savings; and on the other hand, if you're a project lead, working out how to read a piece of evidence and assess its relevance for your own work. There is a new paper out which touches on this subject.

JAMA. 2010 Nov 24;304(20):2279-87.

How to use an article about quality improvement.


Quality improvement (QI) attempts to change clinician behavior and, through those changes, lead to improved patient outcomes. The methodological quality of studies evaluating the effectiveness of QI interventions is frequently low. Clinicians and others evaluating QI studies should be aware of the risk of bias, should consider whether the investigators measured appropriate outcomes, should be concerned if there has been no replication of the findings, and should consider the likelihood of success of the QI intervention in their practice setting and the costs and possibility of unintended effects of its implementation. This article complements and enhances existing Users' Guides that address the effects of interventions--Therapy, Harm, Clinical Decision Support Systems, and Summarizing the Evidence guides--with an emphasis on issues specific to QI studies. Given the potential for widespread implementation of QI interventions, there is a need for robust study methods in QI research.

Friday, 3 December 2010

Testing the transferability of ideas and practises

We are not short of good ideas on how to improve healthcare. Nor are we short of good examples and case studies. What we are short of is evidence that good examples are transferable from one place to another.

I really like the way in which NHS Improvement has reviewed its own work with the aim of testing how transferable that work is. The organisation works in an unpretentious way, in that its work is organised by healthcare pathways rather than by improvement techniques. In their Cancer stream they have reviewed how good examples are transferred from one team or organisation to another and made notes about the issues and difficulties. These reports are easy to read and full of lessons about the spread and adoption process.

Wednesday, 1 December 2010

Web 2.0 not role modelled is inauthentic

Anyone or any organisation that flies the flag of change management and improvement always has the difficult task of practising what they preach, namely being a role model for what they espouse. It is difficult to get right all the time, and I certainly don't.

One positive example I have experienced is the Institute for Healthcare Improvement who, a while back, took on the challenge of improving their invoicing and payment system on the basis they couldn't teach others to do it unless they (a) were a good role model and (b) learnt from their own experience.

The worst of management consultancy is when concepts and theories from books are copied onto PowerPoint slides and then used to train others. Where the trainers have no experience of the content, their audience will soon figure out the dissonance and leave the session - either physically or mentally.

In England we have a rash of NHS improvement-related organisations trying to get onto the social media bandwagon. I am all for it, as I believe it is an essential tool for communicating and engaging with others. However, when the organisations involved have no official, monitored Facebook page, do not use blogs (few if any of their executive teams or senior staff use them), have never used a wiki, do not use RSS feeds as part of their own learning, and have never used a discussion forum in-house, then the exhortations and training come across as inauthentic.

There are one or two NHS improvement groups, like NHS Improvement, who are using Web 2.0 techniques to their advantage, and I like the way they are starting with themselves and learning how to use the tools in-house before going outside. I am sure they will be excellent role models in the future.