Introduction
This post is primarily about the concept of an “information production line” in organisations, and the risk we run when we let our view of quality management of information become one of better inspection of defects out of a process. However, I’d be lying if I said it wasn’t also a chance for me to trumpet a good-news story about innovation and general cleverness in a young Irish software company in the Information Quality space.
The Information Production Line
Modern businesses rely on the flow of information along a production line. In this production line, data and information are taken, acted upon, combined with other elements, shared, and applied to produce value for the organisation.
Whether it is a sales lead being captured, an order being taken, a product specification being produced, or a staff member being hired, information is captured, created, consumed and processed at each stage in the production line, from entry to delivered objective.
Everyone and their dog agrees that the best practice, and the optimum strategy for ensuring quality at minimum cost, is to apply your quality metrics and remedial actions as close to the point of first creation as possible, the ideal being zero defects entering your process flow in the first place. Vendors often talk about the “information quality firewall”. Emphasis is placed on the importance of good governance over the information asset to ensure and assure quality, and increasingly on building information quality processes into ETL operations and data migration strategies.
All of which sounds great, and is a significant step forward from where we were five years ago. However, are we simply reaching the point where we are paying to have people run around the edges of our production lines, sweeping up the crud that falls off the line or sifting through incoming parts bins to separate the “good” information parts from the “bad” ones?
But is that really managing quality, or is it just being really good, and very fast, at wielding a big dustpan and brush around our information processes to keep the factory clean without actually tackling the real root causes of poor quality? Given that information is created through the operation of processes that are often many steps removed from the final ERP or CRM system (spreadsheet-based order forms or product specifications, for example), is it good enough to rely on inspection, effectively at the end of the line, to fix our quality problems?
A timely lesson from Manufacturing
A few years ago I had the honour of being taught Deming’s Theory of Knowledge by Joyce Orsini, who worked with Deming and is the Director of the Deming Scholars MBA at Fordham University (that’s where I first took part in the Red Beads game, a tool I’ve since used myself in lectures in DCU).
Over the course of the two-day tutorial, Dr. Orsini shared the story of a visit Deming made to a large car manufacturer in the US, where he was given a tour of the plant. As he stood with the plant manager on a gantry overlooking the line, Deming noticed a robot arm repeatedly knocking a dent into the boot (trunk) lid of every car, in almost exactly the same place. Every car that rolled off the line had that dent. When Deming raised this point with the plant manager, who was in mid-flight singing his own praises for the quality of the finished product, the reply came:
“It’s OK, we’ve got a team of panel beaters who sort that out for us before the car leaves the plant.”
This was institutionalised scrap and rework masquerading as, or mistaken for, management of quality. If we stand on the same gantry looking over our information production lines today, in many companies we can substitute the “Information Quality team” and their tools for the panel beaters, struggling to beat out the dents and dings in our information products before they are consumed by the ERP systems or, more importantly, by the knowledge workers and downstream processes using those ERP systems.
Lovely. All the right things are being said, but we might actually be lulling ourselves into a false sense of quality.
Another way?
But is there another way? One of the key lessons from manufacturing quality is that the output of one part of the process is the input to the next. Rather than inspecting defects out of 1,000 widgets, it is far better not to accept any defective widget-bits (the raw materials for widgets) and to work with the widget-bit suppliers to ensure that they provide widget-bits of the required quality. The quality required is determined by the needs of the process, and different processes using the same widget-bits might have different thresholds for quality.
This requires investment in tools to ensure the correct calibration of the ‘machinery’ of your process, and to ensure timely measurement of the quality of your process inputs so that they won’t bugger up your process further down the line. This is possible with many of the current data quality tools, but it is not always easy to embed the necessary checks right at the point of entry. Often the point of inspection is two or three steps down the process chain, and reworking rejects becomes a more costly effort.
By understanding your process, the objective of the process (what the finished result should be) and the various things that can prevent you achieving that objective (let’s call them risks), you can begin to determine the checks you need to get the information right first time and, through prevention, remove the need for rework.
In Risk Management parlance, you move from a detective control to a preventative control.
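To make the distinction concrete, here is a minimal Python sketch. The field names and rules are invented purely for illustration; they aren’t taken from any particular tool or system.

```python
# Minimal sketch of detective vs. preventative controls.
# The rules and field names below are invented for illustration.

ORDER_RULES = {
    "customer_id": lambda v: bool(str(v).strip()),             # must be present
    "quantity":    lambda v: str(v).isdigit() and int(v) > 0,  # positive integer
    "country":     lambda v: v in {"IE", "GB", "DE", "FR"},    # reference data
}

def validate(record):
    """Return the names of the fields that break a rule."""
    return [field for field, rule in ORDER_RULES.items()
            if not rule(record.get(field, ""))]

def capture_order(record, store):
    """Preventative control: a defective record never enters the process."""
    failures = validate(record)
    if failures:
        raise ValueError(f"Fix before submission: {failures}")
    store.append(record)

def audit_orders(store):
    """Detective control: sweep the stored data afterwards and list the rework."""
    return [r for r in store if validate(r)]
```

The point is not the code but where it runs: `capture_order` belongs in the order form itself, while `audit_orders` is the dustpan and brush.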
The Challenge
The challenge is to embed those checks and measures consistently across your organisation, in your Excel workbooks, InfoPath forms, SAP screens or call-centre applications, so that the same rules, the same checks and the same hyperdata, metadata, business rules and reference data are applied consistently and can be managed as yet another structured and controllable asset in the organisation. And all this needs to be done in an environment that helps to break down barriers and (as Andrew Brooks puts it) helps “make the invisible visible”.
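As a rough sketch of what that could look like in practice (the structures and names here are my own invention, not any vendor’s API), the idea is that the rules and reference data live in one centrally governed definition, and every capture channel (spreadsheet import, web form, call-centre screen) asks the same question of that same definition:

```python
import json
import re

# Hypothetical centrally governed rule definition: one managed artefact
# that every capture channel consults, instead of each application
# hard-coding (and slowly diverging from) its own copy of the checks.
GOVERNED_RULES = json.loads("""
{
  "product_code": {"pattern": "^[A-Z]{3}-[0-9]{4}$"},
  "currency":     {"reference_set": ["EUR", "GBP", "USD"]}
}
""")

def check(field, value):
    """Apply the centrally governed rule for a field, whichever channel calls it."""
    rule = GOVERNED_RULES.get(field, {})
    if "pattern" in rule and not re.match(rule["pattern"], value):
        return False
    if "reference_set" in rule and value not in rule["reference_set"]:
        return False
    return True

# The Excel import, the web form and the call-centre screen all embed the
# same call, so a change to the governed definition takes effect everywhere.
assert check("currency", "EUR")
assert not check("product_code", "abc-12")
```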
One offering that looks like it might be well on the road to enabling this “Holy Grail” is the new offering from Irish software firm Clavis Technology. (I am sure there are others; feel free to share via the comments, but I’m Irish so I’ll focus on the local team for now.)
I don’t want this to come across as a sales pitch for them (I try to keep the blog tool- and vendor-neutral), but I was very impressed with their thinking about a year ago when they invited me in to have a chat about their ideas. Earlier this week they invited me in again to show me the current version of the tool, which is being delivered as a SaaS offering.
The product is very definitely shaping up to deliver what they set out to achieve: a process-based view of information quality, focussed on getting your data quality checks and controls in as early in your process as possible, with centralised governance and control coupled to a flexible ‘plug-in’ model. The entire application and framework seems to have been designed around the need to enable subject matter experts to collaborate more effectively, and to let centrally governed rules, checks and reference data be embedded easily throughout the organisation.
Given the state of the Irish economy, the vision and foresight of the Clavis Technology team is something we should be celebrating, as it shows the level of innovation that indigenous Irish companies are capable of. (And it’s not the first time for these guys: they founded Similarity Systems, which was subsequently bought by Informatica a few years ago.)
Conclusion
Just as in manufacturing, where some organisations lulled themselves into believing that quality meant having men with hammers knock the dents and dings out before the cars rolled out of the factory, in Information Quality we risk lulling ourselves into a comfort zone we have no right to be in. By putting very efficient filters between our filthy rivers (source processes) and our data lakes (databases), we may think we have solved the problem of poor quality data. But as Tom Redman teaches, the correct approach for sustainability is to track the source of the pollution and eliminate it, so that the river is clean and the lake is clean.
Having remediation teams to review and fix errors is not value-adding. Designing effective processes, and applying effective measurement and control to those processes to prevent defects, is a sustainable and effective approach to ensuring and assuring quality. Ensuring that those efforts are co-ordinated across your organisation is a key management challenge.
Put yourself on the gantry beside Deming. Look down at your information production line. Now ask yourself: have you got a quality management function and culture, or just a team of panel beaters?
Comments
Hi Daragh,
Nice post. The production line analogy is good, although it suggests that raw data (components) is taken in at point A and assembled to produce a single information product at point B; the reality is, of course, that raw data can be used many times and contribute to many different information products.
Like a manufacturing component, data can be copied/reproduced and, in doing so, errors may be introduced. As with the robot introducing dents, faults can occur during the process, not just at the point of entry.
The concept of a Data Quality Firewall is concerned with the quality of data components at the point of entry. My preference is for a Data Quality Gateway, which ensures conformity with business rules at every interface, cognisant of the fact that different information products may have different requirements and tolerances of the data.
Datanomic has been providing the ability to implement data quality as a service since 2005, with business rules stored in a central repository that ensures conformity of data across batch and real-time processes. Our approach encourages people to improve inherited data where necessary (the panel beaters) but also protect the information asset from new dents by implementing a data quality gateway that prevents errors at the point of entry and all along the information production line.
Steve
Steve,
Thanks for the comment. You are, of course, correct in saying that the information production line analogy can often miss the point that the same item of data can be used in different information products without being lessened or divided. Tom Redman writes about this unique property of information/data as an asset in his book Data Driven.
Datanomic’s tool is another impressive one that I definitely should have had in my noggin when writing the post. Perhaps I was just carried away by the Irish wins in the rugby and boxing(?) Datactics (in Northern Ireland) is another example. I think it is important that, as professionals in this industry, we don’t get hung up on the US-based players and remember that there is excellent innovation happening in Europe.
Sometimes the British Isles seem like a US coastal area.

Can anyone name other innovative vendors around the rest of the world?

Btw: I love Irish music, not least U2 – but I prefer handball to rugby.
Henrik,
Thanks for the comment. It is an often forgotten fact that, until relatively recently, the Irish economy didn’t rely so heavily on construction, and we have a tradition of innovation, particularly when it comes to information management.
For example, the true father of the Data Warehouse, from his IBM days, is Barry Devlin, an Irishman and an all-round nice chap.
As for Irish music, I personally am not too fond of U2; I prefer Rory Gallagher. And as for sport, if you like handball you’d love hurling. Apparently it is the fastest-moving ball game in the world.