May 3, 2012

Lessons Learned Database Hopper

I like to think of a lessons learned database as a hopper that fills up, empties, fills up, empties, and so on. This is counter to a view that I come across very often, which is that a lessons learned database just fills up and fills up and fills up, and you use a search engine to find things in it.

Why do I prefer the first description?
For me, a lessons learned database is part of a lessons learned process which consists of steps such as:

·         We plan to do something

·         We do something and gain experience

·         We identify learning

·         We decide what to do about that learning and assign an action to someone to carry out that activity

·         The action is carried out and the experience is now fully embedded in our processes, training material, standards etc.

·         We can archive the description of the experience as it is now embedded in our processes, training material, standards etc.

Why do I prefer this approach? My experience of line management, design, operations, maintenance and even knowledge management consultancy is that people are very, very busy, and unless you make it easy for them to do something they will default either to the way they have always done it (or how they did it in their last company, if they are new to yours) or will ask whoever is next to them at that moment. If they have to go to another place (a screen, a physical library etc.) to make use of the experience of the company, they may say they are too busy and not bother. If, however, that experience is already built into the documents and processes they are already using, then by default they will use the latest experience of the company. Hence the number of lessons in the lessons learned database goes up and down as things are learned and then embedded in the processes, training material and standards of the organisation.

Why might a company not take the hopper approach to lessons learned? Well, one possible answer is that it is relatively easy to purchase a lessons learned database and mandate that people submit their lessons to it, or that people review it for lessons that might apply to their activity. Nice, clean, simple instructions. The problem is that the database will grow and grow and grow, until you might not even be able to find that one piece of experience that would make all the difference to what you are about to work on. The search engine returns so many ‘lessons’ on turbines or hydrodynamics or funding applications or finite element analysis or EU importation regulations or customer segmentation that they are of no use to you.

With a hopper approach, you would only have ‘lessons’ that haven’t yet been embedded in the process, training material or standards of the organisation.
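
To make that lifecycle concrete, here is a minimal sketch in Python of how the hopper idea could be modelled; the class names, states and fields are illustrative assumptions rather than a description of any particular lessons learned system.

from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List

class Status(Enum):
    IDENTIFIED = auto()   # learning captured from experience
    ACTIONED = auto()     # someone is embedding it in processes, training, standards
    EMBEDDED = auto()     # now part of the normal working documents

@dataclass
class Lesson:
    title: str
    owner: str = ""
    status: Status = Status.IDENTIFIED

@dataclass
class Hopper:
    live: List[Lesson] = field(default_factory=list)      # the hopper itself
    archive: List[Lesson] = field(default_factory=list)   # the record of embedded lessons

    def capture(self, lesson: Lesson) -> None:
        # The hopper fills up as new learning is identified.
        self.live.append(lesson)

    def assign_action(self, lesson: Lesson, owner: str) -> None:
        # Someone is made responsible for embedding the lesson.
        lesson.owner = owner
        lesson.status = Status.ACTIONED

    def embed_and_archive(self, lesson: Lesson) -> None:
        # Once embedded in processes, training material or standards,
        # the lesson leaves the live hopper - the hopper empties again.
        lesson.status = Status.EMBEDDED
        self.live.remove(lesson)
        self.archive.append(lesson)

    def open_lessons(self) -> List[Lesson]:
        # Only lessons not yet embedded remain in the live hopper.
        return list(self.live)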

May 1, 2012

Innovating Through Benchmarking

I have read several blogs and postings which seemed to indicate a concern that managing knowledge in some way inhibits innovation. I thought it might be useful to share my thoughts on why I think that is not the case. For me it is about being clear on what you're trying to achieve. I tend to think about it as three distinct but perhaps overlapping areas.

Reaching the benchmark

The first one is what I tend to think about as reaching the benchmark. This is where you have multiple teams or operations which are producing different levels of productivity, efficiency or quality. It doesn't really matter what the metric is; the key point is that there is a difference across the various teams or locations. It could be patient recovery time, number of faults per hour, downtime, time to respond to incoming calls in a call centre, whatever. The key is that there is a difference, and that difference equates to loss. Loss, in my vocabulary, is something I don't like and don't want; let's get rid of it. What I am trying to do in this situation is understand what the top performing teams or operations are doing that allows them to perform so much better than the bottom performing teams or operations. This can be very subtle and difficult to understand, but because of the value it can represent to your organisation it is well worth pursuing. It may take considerable time and effort to fully understand what it is that is making the difference. If you can quantify the value, in whatever terminology you use to measure value, you can justify the investment in understanding what is making the difference.
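
As a rough illustration of quantifying that difference, here is a small sketch in Python; the teams, numbers and value per unit are invented purely for the example, and all it does is treat the top performer as the benchmark and put a price on the gap to it.

# Illustrative sketch only: treat the best-performing team as the benchmark
# and put a value on the gap every other team leaves against it.

team_output = {        # e.g. units per shift, calls answered per hour, etc.
    "Team A": 118,
    "Team B": 96,
    "Team C": 102,
    "Team D": 87,
}

benchmark = max(team_output.values())      # the top performer sets the benchmark
gaps = {team: benchmark - value for team, value in team_output.items()}

value_per_unit = 250.0                     # hypothetical value of one unit of the metric
potential_value = sum(gaps.values()) * value_per_unit

for team, gap in sorted(gaps.items(), key=lambda item: item[1], reverse=True):
    print(f"{team} is {gap} below the benchmark")
print(f"Closing every gap would be worth roughly {potential_value:,.0f} per period")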

Once you have identified and packaged the know-how that makes the difference, the next step is to transfer and embed that know-how in the other teams and operations. By doing so, you will bring all of the teams and operations up to the benchmark. The value in doing this can be immense. If you are an oil company, it can be hundreds of millions of dollars. If you are a hospital, it can be lives saved. If you are a famine relief agency, it can be putting food in the mouths of the starving.
Waste and loss are bad and must be eliminated.

Once you have all of the teams and operations up to the existing benchmark, you can move on to the next stage, which is extending the benchmark.

Extending the benchmark

Learning should not be seen as a one-time activity. While the initial objective is to bring all of the teams and operations up to the benchmark, you should also have a continuous learning loop that captures experience that will extend the benchmark. One way of doing this is to start the cycle by identifying the benchmark that you wish your teams and operations to achieve. Collect good practice from the teams and operations who are achieving this benchmark, distil it into the first edition of the best practice and send it to each of the teams and operations who are currently below the benchmark. Invite them to apply the first edition of the best practice to bring themselves up to the current benchmark, unless they can offer a suggestion that is even better than the advice contained in that first edition. Use subject matter experts to evaluate any advice that is received and, if appropriate, update the best practice and reissue it as a second edition. Continue round the cycle, updating the best practice with the experience of the various teams and operations, always looking to push the benchmark beyond its current position.
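
To make that cycle a little more concrete, here is a minimal sketch in Python, assuming a hypothetical BestPractice record and a subject matter expert review step; the names and the example suggestion are illustrative, not a description of any particular tool.

from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class BestPractice:
    topic: str
    edition: int = 1
    advice: List[str] = field(default_factory=list)

    def reissue(self, improvement: str) -> None:
        # An accepted improvement extends the benchmark and bumps the edition.
        self.advice.append(improvement)
        self.edition += 1

def review_cycle(practice: BestPractice,
                 suggestions: List[str],
                 sme_approves: Callable[[str], bool]) -> BestPractice:
    # One pass round the loop: evaluate each suggestion and reissue if accepted.
    for suggestion in suggestions:
        if sme_approves(suggestion):
            practice.reissue(suggestion)
    return practice

# Illustrative use: one team suggests an improvement and the SME accepts it.
bp = BestPractice("responding to incoming calls", advice=["answer within 20 seconds"])
bp = review_cycle(bp, ["show caller history on screen"], sme_approves=lambda s: True)
print(f"Edition {bp.edition}: {bp.advice}")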

Extending the benchmark will almost certainly be an incremental activity, but it will demonstrate to your teams and operations your relentless obsession with driving out waste and loss. Moving the benchmark forward in this incremental fashion is extremely important in operations where changes to the process or activity must be thought through before they are made. In these circumstances it is normal to find a management of change (MOC) process which is designed to prevent ad hoc, poorly thought-out changes being made to a process, changes which might result in safety breaches or violations.

Breakthrough thinking

The next area is what I like to think of as breakthrough thinking. This is where you have decided to attempt to do something different. The rationale behind how you got to this step will differ depending on industry, location, business environment and a whole range of other factors. What is important for me is that this is a deliberate decision to do something different. It is a decision that is taken within a controlled environment and is progressed in a controlled manner. Would you want the operator of a nuclear reprocessing facility to decide one day just to try something different because it might improve the productivity of the nuclear process? I don't think so.

Innovation or breakthrough thinking should be done in a controlled way, and the ideas and concepts generated should be exposed to peers and subject matter experts before implementation. The innovation can come through suggestion schemes from outlying operators, research laboratories, engineering teams, market analysis or universities. It is a process that should run in parallel with extending and implementing standardisation via benchmarking. Uncontrolled innovation would not be welcomed in most of the environments that I have worked in during my career. Innovation, however, is something that has been encouraged throughout my career and something that is prized, but only when it is done in a controlled fashion.

As an engineer I have been taught to develop prototypes, test them by piloting, modify and update the prototype, finalise the design that will be introduced and then roll it out. Different groups were charged with different activities: one group might be charged with improving the overall efficiency and productivity of the processes and plants, while another would be charged with coming up with a variation or extension to existing products and services. Another group would be charged with coming up with market-changing processes and products. Each had a clear remit within which to work. What was clear in all of these instances was that knowledge is of value when it is applied.

New measurement unit

When you are developing a new technology or a new process it can sometimes be difficult to measure performance. My advice in these situations is to avoid, if possible, inventing a new measurement system, as you will get into all sorts of discussions about whether the measurement is valid or not.

I was recently reading about the early days of laser technology, and according to the article the strength of lasers was measured in Gillettes. One Gillette was the thickness of a razor blade produced by a very well-known manufacturer of razor blades, and the power of a laser was indicated by the number of razor blades it could cut through. Hence when someone referred to the power of a given laser as being 5 Gillettes, this meant that the laser had the power to penetrate five razor blades at one time. I have to admit to not having come across the Gillette measurement before now.

The razor blade manufacturer's drive to standardise the thickness of its razor blades had an unintended consequence: it gave laser manufacturers a ready-made yardstick as they innovated and created.


Knoco Ltd