Blog

  • Bank of Ireland Overcharging – another follow up

    Scanning the electronic pages of the Irish Independent this morning I read that

    1. They claim to have had the scoop on this story (no, it was Tuppenceworth.ie and IQTrainwrecks.com)
    2. They have “experts” (unnamed ones) who tell them that the actual number of impacted customers over the weekend could be up to 200,000.
    3. “Some other banks admitted there have been cases where Laser payments have mistakenly gone through on the double. But they said they have not had any serious problems.” (BOI had that angle on the issue back in June).
    4. The bank cannot guarantee that it won’t happen again.

    I’ll leave points one to three for another time and focus at this point (as my bus to Dublin leaves soon) on the matter of the Bank of Ireland not being able to guarantee that it won’t happen again.

    The Nature of Risk

    Fair play to BOI for admitting that they can’t guarantee that this problem won’t happen again. It has happened before (in May), it has happened now, it is only prudent to say that it may happen again.

    But are they unable to guarantee that it won’t happen again because they understand the causes of this problem, have properly mapped the process and information flows, understand the Happy Path and Crappy Path scenarios and the factors that trigger them, and have established robust detective and preventative controls on the information processes to prevent or flag errors? In other words, do they have a foolproof process and are simply hedging their bets against the occurrence of idiots? In that case, they have a managed process which will (hopefully) have effective governance structures around it to embed a quality culture that promotes early warning and prompt action to address the incidence of idiots which inevitably plagues foolproof processes.

    Or are they unable to guarantee it won’t happen again because they lack some or all of the above?

    Again, I fear, I am forced to fall back on tortured analogies to explain this.

    A few years ago I had an accident in my car. I am unable to guarantee that for the rest of my driving life I won’t have another accident. Hence I have taken out insurance. However, I have also taken a bit of time to understand how the first accident occurred and modified my driving to improve my ability to control the risk of accident. Hence I am able to get insurance.

    Had I not modified my driving, the probability of the same type of accident occurring would have been high, and as a result the cost of my insurance would be higher (no no-claims bonus, for example).

    However, because I understand the “Happy Path” I want to travel on when driving, and also understand the Crappy Path that I can wind up on if I don’t take the appropriate care and apply the appropriate controls (preventative and detective) on how I drive, I haven’t had an accident since I reversed into the neighbour’s car many moons ago.

    I can’t guarantee it won’t happen again, but that is because I understand the nature of the risk and the extent to which I can control it, not because I am blissfully unaware of what is going on when I’m driving.

    Information Quality and Trust

    What does this idea of Information Quality and Trust mean? Well, the Indo put it very well this morning:

    Revelations about the Laser card glitch, disclosed in yesterday’s Irish Independent, have shaken confidence in banks’ payments systems at a time when people are nervous about all financial transactions.

    As I have said elsewhere, information is the key asset that fuels business and trade and is a key source of competitive advantage. In a cashless society it is not money that moves between bank accounts when you buy something, it is bits of information. Even when you take money from an ATM all you are really doing is turning the electronic fact into the physical thing it describes – €50 in your control to spend as you will.

    When the quality of information is called into question there is an understandable destruction of trust. “The facts don’t stack up”…. “the numbers don’t add up”… these are common exasperated comments one can often hear, usually accompanied by a reduction in trust in what you are being told or a reluctance to make a decision on that information.

    Somewhat ironically, it is the destruction of trust in the information around sub-prime mortgage lending and the bundled loan products that banks started trading to help spread their risk in mortgage lending that has contributed to the current economic situation.

    In the specific case of Bank of Ireland and the Laser card problems, the trust vacuum is compounded by

    • The bank’s failure to acknowledge the extent or timescale of the issue
    • The bank’s apparent lack of understanding of how the process works or where it is broken. [Correction & Update: Yesterday’s Irish Daily Mail says that the Bank does know what caused the problem and is working on a solution. The apparent cause is very similar to the hypotheses I set out in the post previous to this one.]

    This second one isn’t helped unfortunately by the fact that these issues can sometimes be complex and the word count available to a journalist is not often amenable to a detailed treatise on the finer points of batch processing transactions and error handling in complex financial services products.

    That’s why it is even more important for the bank to be communicating effectively here in a way that is customer focussed not directed towards protecting the bank.

    To restore trust, Bank of Ireland (and the other banks involved in Laser) needs to

    1. Demonstrate that they know how the process works… give a friendly graphic showing the high level process flow to the media (or your friendly neighbourhood voice of reason blogger). Heck, I’d even draw it for them if they would talk to me.
    2. From that simple diagram work out the Happy Path and Crappy Path scenarios. This may require a more detailed drill-down than they might want to publish, but it is necessary (they don’t need to publish the detail, though).
    3. Once the Happy and Crappy paths are understood, identify what controls are currently in place to keep things on the Happy Path. Test those controls. Where controls are lacking or absent, invest ASAP in robust controls.

    The key thing now is that the banking system needs to be able to demonstrate that it has a handle on this to restore Trust. The way to do this is to ensure that the information meets the expectations of the customer.

    I am a BOI customer. I expect to only pay for my lunch once. Make it so.

  • Bank of Ireland Double Charging – a clarifying post

    Having spent the day trading IMs and talking to journalists about the Bank of Ireland Laser Card double charging kerfuffle, I thought it would be appropriate to write a calmer piece which spells out a bit more clearly my take on this issue, the particular axe I am grinding, and what this all means. I hope I can explain this in terms that can be clearly understood.

    What is LASER?

    For the benefit of people reading this who aren’t living and working in Ireland I’ll very quickly explain what LASER card is.

    LASER is a debit card system which operates in Ireland. It is in operation in over 70,000 businesses in Ireland. It is operated by Laser Card Services Ltd. Laser Card Services is owned by seven of Ireland’s financial services companies (details here) and three of these offer merchant services to Retailers (AIB, Bank of Ireland, and Ulster Bank). In addition to straightforward payment services, LASER allows card holders to get “cashback” from retailers using their card.

    There are currently over 3 million Laser cardholders nationwide, who generated more than €11.5 billion in retail sales in 2008. On average, over 300 Laser card transactions are recorded per minute in Ireland.

    How it works (or at least the best stab I can get at it)

    As Jennifer Aniston used to say in that advert… “now for the science bit”. Children and persons of a sensitive disposition should look away now.

    One problem I’ve encountered here is actually finding any description of the actual process that takes your payment request (when you put your card in the reader and enter your PIN), transfers the money from you to the retailer, and then records that transaction on your bank statement. Of course, there are valid security reasons for that.

    So, I’ve had to resort to making some educated guesses based on my experience in information management and some of the comments in the statement I received from Bank of Ireland back in June. If I have any of this wrong, I trust that someone more expert than me will provide the necessary corrections.

    1. The card holder presents their card to the retailer and puts it in the card reader. The card reader pulls the necessary account identifier information for the card holder for transmission to the LASER processing system (we’ll call this “Laser Central” to avoid future confusion).
    2. The retailer’s POS (point of sale) system passes the total amount of the transaction, including any Cashback amount and details of the date, time, and retailer, to the Laser card terminal.  Alternatively, the Retailer manually inputs the amount on the Laser POS terminal.
    3. This information, along with the amount of the transaction, is then transmitted to the Laser payment processing systems.
    4. ‘Laser Central’ then notifies the cardholder’s bank which places a “hold” on an amount of funds in the customer’s account. This is similar in concept to the “pre-authorisation” that is put on your credit card when you stay in a hotel.
    5. At a later stage, ‘Laser Central’ transmits a reconciliation of transactions which were actually completed to the Laser payment processing system. This reconciliation draws down against the “hold” that has been put on funds in the card holder’s account, which results in the transaction appearing on the card holder’s bank statement.

    Point 5 explains why it can sometimes take a few days for transactions to hit your account when you pay with your Laser card.
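
    To make steps 4 and 5 a bit more concrete, here is a minimal toy model of the hold and draw-down lifecycle (in Python, and entirely my own guesswork rather than anything Laser or the banks have published; every name in it is invented for illustration):

        # Toy model of the "hold then draw down" lifecycle from steps 4 and 5.
        # Purely illustrative -- the real Laser/bank systems are not public.

        class Account:
            def __init__(self, balance):
                self.balance = balance
                self.holds = {}  # hold reference -> amount reserved

            def place_hold(self, ref, amount):
                """Step 4: the cardholder's bank reserves funds."""
                self.holds[ref] = amount

            def settle(self, ref):
                """Step 5: a reconciliation draws down against the hold."""
                amount = self.holds.pop(ref, None)
                if amount is None:
                    # No open hold to match: refuse rather than debit again.
                    raise LookupError("no open hold for " + ref)
                self.balance -= amount
                return amount

        acct = Account(balance=500.00)
        acct.place_hold("txn-123", 50.00)
        acct.settle("txn-123")        # balance is now 450.00
        try:
            acct.settle("txn-123")    # a second draw-down against the same hold
        except LookupError:
            print("second draw-down refused: hold already cleared")

    The point of the sketch is simply that once a hold is cleared by the first draw-down, a second draw-down against the same reference has nothing to match and can be refused, which is exactly the kind of control I come back to below.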

    The Problem

    The problem that has been reported by Bank of Ireland today and which was picked up on by Simon over at Tuppenceworth.ie in May is that customers are being charged twice for transactions. In effect, the “hold” is being called on the double.

    Back in May, Bank of Ireland explained this as being (variously):

    • A problem caused by a software upgrade
    • A problem caused by retailers not knowing how to use their terminals properly
    • A combination of these two

    The Software Upgrade theory would impact on steps 3, 4, and 5 of the “strawman” Laser process I have outlined above. The Retailer error theory would impact on steps 1 and 2 of that process, with potentially a knock-on effect on step 5 if transactions are not voided correctly when the Retailer makes an error.

    But ultimately, the problem is that people are having twice as much money deducted from their accounts, regardless of how it happens in the course of this process. And as one of the banks that owns and operates Laser Card Services, Bank of Ireland has the ability to influence the governance and control of each step in the process.

    The Risk of Poor Information Quality

    Poor quality information is one of the key problems facing businesses today. A study by The Data Warehousing Institute back in 2002 put the costs to the US economy at over US$600 billion. Estimated error rates in databases across all industries and from countries around the world range between 10% and 35%. Certainly, at the dozens of conferences I’ve attended over the years, no-one has ever batted an eyelid when figures like this have been raised. On a few occasions delegates have wondered who the lucky guy was who only had 35% of his data of poor quality.

    The emerging Information Quality Management profession world wide is represented by the International Association for Information & Data Quality (IAIDQ).

    Information Quality is measured on a number of different attributes (some writers call these Dimensions). The most common attributes include:

    • Completeness (is all the information you need to have in a record there?)
    • Consistency (do the facts stack up against business rules you might apply? For example, do you have “males” with female honorifics? Do you have multiple transactions being registered against one account within seconds of each other or with the same time stamp?)
    • Conformity (again, a check against business rules – does the data conform to what you would expect? Letters in a field you expect to contain just numbers are a bad thing)
    • Level of duplication (simply put… how many of these things do you have two or more of? And is that a problem?)
    • Accuracy (how well does your data reflect the real-world entity or transaction that it is supposed to represent?)

    In models developed by researchers at MIT there are many more dimensions, including “believability”.
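
    To make a few of these attributes concrete, here is a rough sketch (my own, in Python; the record, field names, and rules are all invented for the example) of how they translate into executable checks against a single record:

        # Illustrative checks for three of the attributes listed above,
        # run against one toy customer record.

        import re

        record = {"honorific": "Mrs", "gender": "male",
                  "surname": "O'Brien", "account_no": "12A45678"}

        required = ["honorific", "gender", "surname", "account_no"]

        # Completeness: is every required fact actually present?
        missing = [f for f in required if not record.get(f)]

        # Conformity: does each field look the way the rules say it should?
        # (letters in a field expected to hold only digits is a bad thing)
        bad_account = not re.fullmatch(r"\d{8}", record["account_no"])

        # Consistency: do the facts stack up against each other?
        # e.g. a "male" customer with a female honorific.
        inconsistent = (record["gender"] == "male"
                        and record["honorific"] in {"Mrs", "Ms", "Miss"})

        print(missing, bad_account, inconsistent)   # [] True True

    Accuracy, notably, is the one attribute you cannot check this way: verifying that data matches the real-world entity usually means going back to the real world.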

    In Risk Management there are three basic types of control:

    • Reactive (shit, something has gone wrong… fix it fast)
    • Detective (we’re looking out for things that could go wrong so we can fix them before they become a problem that has a significant impact)
    • Preventative (we are checking for things at the point of entry and we are not letting crud through).

    Within any information process there is the risk that the process won’t work the way the designers thought/hoped/planned/prayed (delete as appropriate) it would.  In an ideal world, information would go in one end (for example the fact that you had paid €50 for a pair of shoes in Clarks on O’Connell Street in Dublin on a given day) and would come out the other end either transformed into a new piece of knowledge through the addition of other facts and contexts (Clarks for example might have you on a Loyalty card scheme that tracks the type of shoes you buy) or simply wind up having the desired outcome… €50 taken from your account and €50 given to Clarks for the lovely pair of loafers you are loafing around in. This is what I term the “Happy Path Scenario”.

    However lurking in the wings like Edwardian stage villains is the risk that something may occur which results in a detour off that “Happy Path” on to what I have come to call the “Crappy Path”. The precise nature of this risk can depend on a number of factors. For example, in the Clarks example, they may have priced the shoes incorrectly in their store database resulting in the wrong amount being deducted from your account (if you didn’t spot it at the time). Or, where information is manually rekeyed by retailers, you may find yourself walking out of a shop with those shoes for a fraction of what they should have cost if the store clerk missed a zero when keying in the amount (€50.00 versus €5.00).

    Software upgrades or bugs in the software that moves the bits of facts around the various systems and processes can also conspire to tempt the process from the Happy Path. For example if, in the Laser card process, it was to be found that there was a bug that was simply sending the request for draw down of funds against a “hold” to a bank twice before the process to clear the “hold” was called, then that would explain the double dipping of accounts.

    However, software bugs usually (but not always) occur in response to a particular set of real-world operational circumstances. Software testing is supposed to exercise the software under conditions as close to the real world as possible. At the very least the types of “Happy Path” and “Crappy Path” scenarios that have been identified need to be tested for (but this requires a clear process-focused view of how the software should work). Where the test environment doesn’t match the conditions (e.g. types of data) or other attributes (process scenarios) of the “real world” you wind up with a situation akin to what happened to Honda when they entered Formula 1 and spent buckets of cash on a new wind tunnel that didn’t come close to matching actual track conditions.

    This would be loosely akin to giving a child a biscuit and then promising them a second if they tidied their room, but failing to actually check if the room was tidied before giving the biscuit. You are down two bikkies and the kid’s room still looks like a tip.

    In this case, there is inconsistency of information. The fact of two “draw downs” against the same “hold” is inconsistent. This is a scenario that software checks on the bank’s side could potentially pick up and flag for review before processing. I am assuming of course that there is some form of reference for the “hold” that is placed on the customer’s account so that the batch processing knows to clear it when appropriate.

    In the case of my horrid analogy, you just need to check within your own thought processes whether the possession of two biscuits is consistent with an untidy room. If not, then the second biscuit should be held back. This is a detective control. Checking the room and then chasing the kid around the house to get the biscuit back is a reactive control.
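
    For what it’s worth, here is a sketch of the kind of detective control I have in mind, assuming (as I am speculating above) that each draw-down in a settlement batch carries the reference of the hold it is meant to clear. Everything here is my own illustration, not how Laser actually works:

        # Detective control sketch: before posting a batch of draw-downs,
        # park any hold reference that appears more than once for review.

        from collections import Counter

        def split_batch(drawdowns):
            """Return (safe_to_post, held_for_review)."""
            seen = Counter(d["hold_ref"] for d in drawdowns)
            safe = [d for d in drawdowns if seen[d["hold_ref"]] == 1]
            review = [d for d in drawdowns if seen[d["hold_ref"]] > 1]
            return safe, review

        batch = [{"hold_ref": "H1", "amount": 50.0},
                 {"hold_ref": "H2", "amount": 12.5},
                 {"hold_ref": "H1", "amount": 50.0}]   # same hold, twice

        safe, review = split_batch(batch)
        print(len(safe), len(review))   # 1 2 -- only H2 posts automatically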

    Another potential risk that might arise is that the retailer may have failed to put a transaction through correctly and then failed to clear it correctly before putting through a second transaction for the same amount. This should, I believe, result in two “holds” for the exact same amount being placed on the customer’s account within seconds of each other. One of these holds would be correct and valid and the process should correctly deduct money and clear that hold. However it may be (and please bear in mind that at this point I am speculating based on experience, not on an in-depth insight into how Laser processing works) that the second hold is kept active and, in the absence of a correct clearance, it is processed through.

    This is a little more tricky to test for with reactive or detective controls. It is possible that I liked my shoes so much that I bought a second pair within 20 seconds of the first pair. Not probable, but possible. And with information quality and risk management you are ultimately dealing with probability. Because, as Sherlock Holmes says, when you have eliminated the impossible what remains, no matter how improbable, is the truth.

    Where the retailer is creating “shadow transactions” the ideal control is to have the retailer properly trained to ensure consistent and correct processes are followed at all times. However, if we assume that a person validly submitting more than one transaction in the same shop for the same amount within a few moments of each other does not conform with what we’d expect to happen, then one can construct a business rule that can be checked by software tools to pick out those types of transaction and prevent them going through to the stage of the process that takes money from the cardholder’s account.
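
    As a sketch of what such a business rule might look like (again my own illustration; the 30-second window and the field names are invented parameters, and in practice the window and its exceptions would need careful tuning):

        # Preventative control sketch: treat a second hold for the same
        # retailer and amount within a short window as a probable error
        # and park it rather than letting it through to draw-down.

        WINDOW_SECONDS = 30   # invented threshold for illustration

        def suspicious(txn, recent):
            return any(t["retailer"] == txn["retailer"]
                       and t["amount"] == txn["amount"]
                       and abs(txn["timestamp"] - t["timestamp"]) <= WINDOW_SECONDS
                       for t in recent)

        recent = [{"retailer": "Clarks", "amount": 50.0, "timestamp": 1000}]
        new = {"retailer": "Clarks", "amount": 50.0, "timestamp": 1012}

        print(suspicious(new, recent))   # True: park for review rather than
                                         # placing a second hold automatically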

    Quite how these errors are then handled is another issue however. Some of them (very few, I would suggest) would be valid transactions. And this again is where there is a balance between possibility and probability. It is possible that the transaction is valid, but it is more probable that it is an error. The larger the amount of the transaction, the more likely that it would be an error (although I’ve lost track of how many times I’ve bought a Faberge egg on my Laser card only to crave another nanoseconds later).

    Another key area of control of these kinds of risk is, surprisingly, the humble call centre. Far too often organisations look on call centres as being mechanisms to push messages to customers. When a problem might exist, often the best way to assess the level of risk is to monitor what is coming into your call centres. Admittedly it is a reactive control once the problem has hit, but it can be used as a detective control if you monitor for “breaking news”, just as the Twitter community can often swarm around a particular hashtag.

    The Bank of Ireland situation

    The Bank of Ireland situation is one that suggests to me a failure of information governance and information risk management at some level at least.

    1. It seems that Call Centre staff were aware in May of a problem with double dipping of transactions. This wasn’t communicated to customers or the media at the time.
    2. There was some confusion in May about what the cause was. It was attributed variously to a software upgrade or retailers not doing their bit properly.
    3. Whatever the issue was in May, it was broken in the media in September as an issue that was only affecting recent transactions.

    To me, this suggests that there was a problem with the software in May and a decision was taken to roll back that software change.

    • Where was the “detective” control of Software Testing in May?
    • If the software was tested, what “Crappy Path” scenarios were missed from the test pack or test environment that exposed BOI customers (and potentially customers of the other six banks who are part of Laser) to this double dipping?
    • If BOI were confident that it was Retailers not following processes, why did they not design effective preventative controls or automated detective controls to find these types of error and automatically correct them before they became front page news?

    Unfortunately, if the Bank’s timeline and version of events are taken at face value, the September version of the software didn’t actually fix the bug or implement any form of effective control to prevent customers being overcharged.

    • What is the scenario that exists that eluded Bank of Ireland staff for 4 months?
    • If they have identified all the scenarios… was the software adequately tested, and is their test environment a close enough model of reality that they get “Ferrari” performance on the track rather than “Honda” performance?

    However, BOI’s response to this issue would seem to suggest an additional level of contributory cause which is probably more far reaching than a failure to test software or properly understand how the Laser systems are used and abused in “the wild” and ensure adequate controls are in place to manage and mitigate risks.

    A very significant piece of information about this entire situation is inconsistent for me. Bank of Ireland has stated that this problem arose over the past weekend and was identified by staff immediately. That sounds like a very robust control framework. However it is inconsistent with the fact that the issue was raised with the Bank in May by at least one customer, who wrote about it in a very popular and prominent Irish blog. At that time I also wrote to the Bank about this issue asking a series of very specific questions (incidentally, they were based on the type of questions I used to ask in my previous job when an issue was brought to our attention in a Compliance context).

    I was asked today if Simon’s case was possibly a once off. My response was to the effect that these are automated processes. If it happens once, one must assume that it has happened more than once.

    In statistical theory there is a formula called Poisson’s Rule. Simply put, if you select a record at random from a random sample of your data and you find an error in it, then there is a 95% probability that there will be other errors. Prudence would suggest that a larger sample be taken and further study be done before dismissing that error as a “once off”, particularly in automated structured processes. I believe that Simon’s case was simply that random selection falling in my lap and into the lap of the bank.
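
    A back-of-the-envelope illustration of why “once off” is the least likely explanation (both numbers below are invented, and this assumes errors occur independently, which is generous to the bank):

        # If an automated process corrupts even a tiny fraction of records,
        # the chance that one randomly surfaced error is the ONLY error is
        # vanishingly small. Both figures are invented for illustration.

        error_rate = 0.001      # one transaction in a thousand affected
        records = 1000000       # transactions processed in the period

        p_only_one = (1 - error_rate) ** (records - 1)
        print(p_only_one)       # effectively zero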

    Ultimately, I can only feel now that Simon and I were fobbed off with a bland line. Perhaps it was a holding position while the Bank figured out what was going on and did further analysis and sampling of their data to get a handle on the size of the problem. However, if that was the case I would have expected the news reports today to have talked about an “intermittent issue which has been occurring since May of this year”, not a convenient and fortuitous “recent days”.

    Unfortunately this has the hallmark of a culture which calls on staff to protect the Bank and to deny the existence of a problem until the evidence is categorically staring them in the face. It is precisely this kind of culture which blinkers organisations to the true impact of information quality risks. It is precisely this kind of culture which was apparent from the positions taken by Irish banks (BOI included) in the run up to the Government Bank Guarantee Scheme and which continues to hover in the air as we move to the NAMA bailout.

    This kind of culture is anathema to transparent and reliable management of quality and risk.

    Conclusion

    We will probably never know exactly what the real root cause of the Bank of Ireland double dipping fiasco is. The Bank admitted today in the Irish Times that they were not sure what the cause was.

    Given that they don’t know what the cause was, and given that the Bank and its own customers differ on the record as to when this issue first raised its head, it is clear that there are still further questions to ask and have answered about Bank of Ireland’s response to this issue. In my view it has been a clear demonstration of “mushroom management” of risk and information quality.

    Ultimately, I can only hope that other banks involved in Laser learn from BOI’s handling of this issue which, to my mind, has been poor. What is needed is:

    • A clear and transparent definition of the process by which a Laser transaction goes from your fingers on the PIN pad to your bank account. This should not be technical but should be simple, business process based, ideally using only lines and boxes to explain the process in lay-person’s terms.
    • This can then form the basis in Banks and audit functions for defining the “Happy Path” and “Crappy Path” scenarios as well as explaining to all actors involved what the impact of their contribution is to the end result (a customer who can pay their mortgage after having done their shopping for example)
    • Increased transparency and responsiveness on the part of the banks to reports of customer overcharging. Other industries (and I think of telecommunications here) have significant statutory penalties where it is shown that there is systemic overcharging of customers. In Telco the fine is up to €5000 per incident and a corporate criminal conviction (and a resulting loss in government tendering opportunities). I would suggest that similar levels of penalties should be levied at the banks so that there is more than an “inconvenience cost” of refunds but an “opportunity cost” of screwing up.
    • A change in culture is needed, towards ensuring that it is the customer who is protected from risk rather than the bank. I am perfectly certain that individual managers and staff in the banks in question do their best to protect the customer from risk, but a fundamental change in culture is required to turn those people from heroes in the dark hours into simply good role models of “how we do things here”.

    There is a lot to be learned by all from this incident.

  • Bank of Ireland Double Charging

    I read with interest a story on the Irish Times website this morning about Bank of Ireland double charging customers for Laser transactions in “recent days”. What interested me is that this was not something that happened in “recent days”. Far from it.

    Back in May 2009, Simon over on Tuppenceworth.ie reported this problem to Bank of Ireland and blogged about his customer service experience. On foot of what Simon had written, I emailed Bank of Ireland to try and get details on the issue before I wrote it up over at IQTrainwrecks.com.

    The response I received from Bank of Ireland on the 4th of June was:

    When BoI receives an authorisation request from a retailer, a ‘hold’ is placed on those funds until the actual transaction is presented for payment. The transaction is posted to the customer’s account on receipt from the retailer.

    Relative to the number of transactions processed there are a very small number of instances where a transaction may appear twice. For example these may occur if the retailer inputted the wrong amount and then re-input the correct amount or the transaction is sent in error twice for authorisation. These types of transactions are not errors or a system issue created by the Bank. The Bank receives an authorisation request and subsequently places a hold on those funds. These types of transactions are not unique to Bank of Ireland.

    Bank of Ireland responds to all customer queries raised in connection with above.

    (I have the name and contact details of the Bank of Ireland Press Office person who sent me that response).

    So. Basically the response in June was “those kind of things happen. They happen in all banks. If customers complain to us we sort it out on a case by case basis”.

    These are the questions I submitted to BoI in June. The quote above was the response I received to these detailed questions. (more…)

  • Golden Databases – another quick return

    I just received an email from an information quality tool vendor. It was sent to an email address I had provided to them in my capacity as a Director of the IAIDQ as part of registering for events they had run.

    The opening line of the email reads:

    I’m writing to you as a follow-up your recent telephone conversation with an [name of company deleted] representative.

    Two small problems

    1. I haven’t had a telephone conversation with any representative from this company regarding any survey or anything else recently. (I did meet one of their Dublin based team for lunch about 6 weeks ago – does that count?)
    2. The personal data I provided to them was not provided for the purpose of being emailed about surveys. (But at least they have an opt out).

    I’m going to take a look at the survey but I bet you the €250 raffle prize for participants that my responses will be statistically irrelevant.

    For a start, the survey is about the importance of the investment in data in IT planning. I’ve never worked in the IT organisation of any business. I have been on the business side interacting with IT as a provider of services to me.

    Also, as a Director of the IAIDQ and as someone trying to set up an SME business, I am basically hands-on in all aspects of the business and implementation of systems (I was working at 2am this morning doing a facelift on my company website and working on a web project for the IAIDQ). So, my responses will be misleading from a statistical point of view.

    It looks like the company in question:

    • Had an email address for me.
    • Knew that my former employer was a customer (customer #1 for this particular company’s DQ offering)
    • Forgot that I’d told their Dublin team that I’d left.
    • Had failed to update their information about me.
    • Have recorded a contact with me but have recorded it incorrectly.

    Should I respond to the survey?  Will my responses be meaningful or just pointless crap that reduces the quality of the study?

    I’m going to ask a friend of mine to write a guest post here on survey design for market research and the importance of Information Quality.

  • Golden Databases – a slight return

    Last week I shared a cautionary note about companies relying on their under-touched and under-loved Customer databases to help drive their business as we hit the bottom of the recessionary curve. The elevator pitch synopsis… Caveat emptor – the data may not be what you think it is and you risk irritating your customers if they find errors about them in your data.

    Which brings me to Vodafone Ireland and the data they hold about me. I initially thought that the poor quality information they have about me existed only in the database being used to drive their “Mission Red” campaign. For those of you who aren’t aware, “Mission Red” is Vodafone Ireland’s high-profile customer intimacy drive where they are asking customers to vote for their preference of add-on packages. Unfortunately, what I want isn’t listed under their options.

    What I want is for Vodafone Ireland to undo the unrequested gender reassignment they’ve subjected me to. (more…)

  • Golden Databases – Caveat Emptor

    I was very interested to read a great post by fellow Irish blogger Damien Mulley in which he wrote:

    …Most companies have massive databases of customer details that are sitting there, gathering dust. Why not work on those databases and poll your customers …

    The context of Damien’s comment was a larger piece about using key assets in your organisation to drive up business or drive down costs. Damien rightly points out that information (on customers) is a valuable asset that most companies simply don’t have working for them.

    He references the Obama campaign, which many hail for using all the bells, whistles and tweets of Web2.0 but which was ultimately driven by good management and application of the campaign’s information assets.

    He’s perfectly correct. I wrote about exactly that topic here (November 2008) and have touched on it in other posts and articles. Unfortunately, Damien’s “Golden Database” needs to come with a big Caveat Emptor. (more…)

  • Is Info Quality Management a Recession Proof Profession?

    Over the past few weeks I’ve been pondering whether or not Information Quality Management is a recession proof profession. Those of you who know me will probably guess that my recent departure from “big company” employment was one of the seeds to this line of thought. Another was the interesting findings contained in the IAIDQ’s recent report on Salary and Job Satisfaction in the Information/Data Quality profession (you can find a copy of the report here).

    First off, the salary survey made for interesting reading because it pegged the average salary (in US dollars) for an Information Quality professional at just over $95,000 (approx. €72k). In Europe, the average was $85,000 (approx. €65k). Cripes, I was a bit less well paid in the old job than I had thought. At those salary levels, the information quality professionals were, overall, satisfied with their lot.

    78% of IDQ professionals say they feel either secure or very secure about their current position, indicating remarkable confidence despite the current difficult economic times

    So… is this one of those mythical recession proof professions?

    Data Data Everywhere….

    We’re fond of saying it but it is true. We live in an increasingly “informationalized” world. Strip away most business models now and you will find that the real value is generated by the smooth flow of information around an organisation. Buying a laptop from Dell? That’s an information flow that needs to pass without a glitch to a factory in Poland (alas no longer Ireland) and also out to suppliers in China and elsewhere to ensure that the bits all arrive together so you receive delivery of a laptop to your door. And let’s not forget about the flow of information about your finance arrangement to fund the purchase. Try to get a phone line connected and you are relying on the quality of information that the call centre agent has about what services are available in your area. Buy a coffee on your debit card… The list goes on.

    Information and data are increasingly being recognised as critical assets to the organisation. Whether it is in Tom Redman’s “Data Driven”, Tom Fisher’s “The Data Asset”, or on blogs or webinars, we see an increasing presentation of data as an asset in terms that C-level executives should get. But this isn’t enough (for reasons we’ll come to in a minute).

    But on a more personal level I’ve been busier since I left my old job than I have been at almost any time in my career. I am finding more people connecting with me through the IAIDQ (and other forums) and I am sensing a strong feeling of positive attitude which is far removed from “magical thinking” but is instead grounded on a very clear understanding of how poor quality information contributed to the mess we are in and an equally clear vision of how effective management of the quality of information can help get us out of this situation and, more importantly, help us to better manage the risk of it happening again.

    The Problem…

    The problem we face now as a profession, and this was highlighted very clearly by the IAIDQ’s study, is that of clearly communicating to our employers, customers, and wider audiences the value of good quality information. 84% of the IAIDQ’s respondents said that this was their biggest challenge. If we faced that challenge in a downturn that caused us to look at the relative importance of the assets in our organisations, how can we hope to overcome it now that green shoots are breaking out all over? (As I write this, Germany is now out of recession, France is on the way, and the Eurozone is heading positive.)

    But Gartner recently (well, in 2006) shared this prediction with us:

    Through 2011, 75 percent of organizations will experience significantly reduced revenue growth potential and increased costs due to the failure to introduce data quality assurance and coordinate it with their data integration and metadata management strategies (0.7 probability).

    In physics, “friction” is the name given to the opposing force that slows the movement of a body. The problem with friction is that it requires you to expend greater effort to achieve the same result. The laws of Conservation of Energy tell us that that extra energy is lost in the form of heat. (Think about the last time you watched your local scout troop light a fire by rubbing two sticks together. Didn’t they look out of breath when they’d finished?)

    More recent Gartner research, published on the 11th of August 2009 [2009 Gartner FEI Technology Study Reveals Finance Managers’ Perspectives on Data Quality, www.gartner.com] finds that:

    Three-quarters of the respondents consider data quality problems a constraint on, or a barrier to achieving, business success. Even so, only 41% of their organizations have a formal improvement program — the rest are doing nothing formally to improve matters.

    So. There we have it. Confirmation that poor quality information is adding friction to businesses. And only 41% have a formal programme in place to reduce that friction (and even then, a programme does not equate to successful outcomes).

    While there are some signs of the global economy recovering, it is clear that poor quality information will add friction to the mix, potentially slowing down the pace of recovery. And the last thing you need when trying to push uphill is friction working against you. Organisations who have to carry the non-value-adding costs of poor quality information will be unable to reap the “first mover” advantages or seize the “low cost operator” niches in the post melt-down market place. Organisations which have invested in reducing the friction will benefit.

    In my opinion, there is an opportunity right now for information quality professionals to develop some clear messages about the importance of information quality and its value to your organisation and the wider economy.

    • To compete in a “lean” way, organisations are investing in Business Intelligence. Without regard to the underlying quality of the information being pulled together, this can rapidly descend into “Business UNtelligence”. Issues such as missing or incomplete data, or even the existence of “non-standard” characters like apostrophes in surnames or email addresses, can cause problems in your BI reporting (see the sketch after this list).
    • To ensure compliance with current and as yet emerging regulations, organisations will need to pay closer attention to the information flows within their walls and between them and their partners. Closer validation of data, increased focus on internal integrity of facts (e.g. does the salary figure on the loan application align with other credit information available to a lender) will likely become more important. These are all information quality based initiatives.
    • Risk Management – a colleague who specialises in Risk Management consulting shared with me recently that “Can’t rely on our information” is a risk that keeps cropping up again and again in his risk workshops with large businesses. This is borne out by an Information Age survey (referenced in Tom Fisher’s new book) which found that 32% of companies who responded cited Risk Management (compliance and regulatory issues) as a key driver of their Information Quality initiatives.
    • Changes in the quality of information only take place through effective management decisions. Either you decide to invest in managing your information quality effectively, or you effectively decide NOT to manage your information assets.
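
    As promised above, a small illustration of the apostrophe problem (my own sketch; the rules are invented): a “letters only” conformity check quietly rejects perfectly good Irish surnames, and those rejected records become holes in the downstream BI numbers.

        # A naive "letters only" validation rule versus one that tolerates
        # apostrophes, hyphens, and spaces in surnames.

        import re

        naive_rule = re.compile(r"[A-Za-z]+")       # too strict
        better_rule = re.compile(r"[A-Za-z' -]+")   # allows O'Brien etc.

        for surname in ["Smith", "O'Brien", "van der Berg"]:
            print(surname,
                  bool(naive_rule.fullmatch(surname)),
                  bool(better_rule.fullmatch(surname)))
        # O'Brien fails the naive rule -- a "conformity" check that is
        # itself the source of the information quality problem.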

    These are just a few areas where there is friction caused by poor quality information – I welcome suggestions for others.

    By removing or reducing the friction, the information quality expert and their team can help businesses seize new ground or at least hold their own as the global economy recovers slowly. By reducing friction, you reduce the amount of wasted energy that is lost in the form of heat.

    Conversely, if you are trying to get your information quality programme jumpstarted, one good way is to figure out how to focus all that lost heat in one place to start a small fire under someone.

    Just answer the question!

    But back to the question at hand… is Information Quality Management a Recession Proof profession?

    I think the answer is yes and no.

    It is yes in that, insofar as any profession can be recession proof, information quality practitioners and vendors have seemed to weather the storm quite well recently. Furthermore, downturns inevitably focus attention on areas of avoidable cost and waste within organisations. Sensible ones look to remove that cost surgically – a process that in and of itself requires sound information. Ultimately, if a business is trading it is creating and consuming information in order to make or deliver its products. Therefore, even in a downturn there is a role for the information quality professional. The relatively high job satisfaction ratings in the IAIDQ’s survey suggest that IQ professionals may have been biding their time in organisations and building their value cases slowly.

    However, the answer is no if we think about what might happen once recovery sets in. In the absence of a crisis, how do we present the value case for continued or renewed investment in information quality? In order to ensure success in the good times we as a profession must convince senior management of the value of reducing information friction in our businesses. While it is easy to point at the pile of rubble and say “If we’d had better quality information we could have avoided that”, it is more challenging to show how those same skills, tools and approaches can build a shiny new edifice on the foundations of that rubble.

    So, information quality management is recession proof, but only if we continue to define and refine the value proposition for better quality information within our organisations and in the wider global context as well.

  • Buzzword Bingo (or “It’s the info quality stupid”)

    One of my fellow Information Quality practitioner-bloggers wrote recently about the emergence of what he labelled “DQ2.0”. You can read Henrik’s original post here.

    While I don’t disagree with many of the points and questions raised by Henrik, I do have a problem with the use of a label like “DQ2.0” to describe what is ultimately the maturing of a profession and the evolution of an industry.

    My issues with the label are based on commercial grounds (I’m looking to develop a business providing consulting services in the Information Quality space), personal grounds (there are things that just feel ‘hinky’ to me), and from my perspective as a Director of the only Professional Association specifically serving Information Quality professionals.

    Commercial Objections – making the tricky sell even trickier

    While it is tempting to apply labels that align with the latest buzzwords to help grease the wheels of conversation, I would suggest that buzzword phrases inevitably fall into the trap of either being hijacked by vendors (“Our tool is DQ2.0”) or dismissed as yet another fad.

    In my planned business venture I’m targeting SMEs and upwards for information quality services. Conversations I’ve had to date with some of my target market, and my experience working on IQ projects in a large corporate in my pre-redundancy days, suggest that many business managers or owner-managers don’t quite get the Information Quality thing. A key challenge is to explain why the advice of their accountant or bank manager or an IT consultant about these things may not be the most informed. Adding another layer of lingo, in my commercial view, might only add further to the confusion (“OK… I understand this DQ thing… but I’ve read about this DQ2.0. Can your company sell me that?“).

    Ultimately, within the “DQ2.0” concept we aren’t presenting anything particularly new. The ability to access more authoritative reference data sets to help validate and improve quality is not a change; it is just reducing the barriers to adoption and (hopefully) reducing the costs of implementing effective quality processes for information. The increasing adoption of SOA is simply serving to make many of the historically invisible issues of poor data definition and sloppy process design unavoidably visible in businesses.

    Ultimately, these things are just making it easier to develop the case for getting the quality of information managed effectively.

    My personal gut feel

    From a personal perspective, I actually think that putting a version number on Data Quality runs the risk of further compounding the “That’s an IT problem” problem.

    Add to that the on-going debate within the profession about whether the correct labelling is “Information Quality” or “Data Quality” or “Information/Data Quality” or “Derek”, and sticking a version number on the end seems, in my view, to be just a bad idea that invites further confusion.

    After all, we are not talking about a massive paradigm shift in the fundamental thinking of how information quality can be managed or improved. The growth in available reference data sets, often with government approval, is simply an evolution of the market. The increasing awareness of the importance of Information Quality to SOA environments is, again, a maturing of the profession (and perhaps a result of business and IT people actually communicating for once). The increasing awareness of the information quality problems caused by cultural biases in data modelling or process design is perhaps just a by-product of me ranting about companies demanding my postcode when my country doesn’t have one (oh, and Graham Rhind may have had an influence too).

    Web2.0 represented a significant shift in the way the internet worked and was interacted with by citizens of the web. However, I don’t see Tim O’Reilly proclaiming a new Web2.x each time a new CMS tool emerges, or a microblogging platform springs to life, or a plugin is released for WordPress.

    But using reference data, understanding the impact of technology platforms on information quality (and vice versa), and avoiding biases in design that undermine the quality of information are not new things or a significant paradigm shift in the Information Quality world. They are some of the fundamental principles and activities that need to be included in any Information Quality project.

    Developing an Information Quality offering for smaller businesses is simply a natural evolution of the profession and a broadening of the market into which professionals seek to offer their services, particularly as there are likely to be a growing number of “hired gun” information quality professionals who have cut their teeth in larger corporates and who will need to work with smaller organisations to develop sustainable businesses. This is not “DQ2.0”, this is simply an evolution of the profession as we reach a critical mass of practitioners who wear the label “Information Quality Professional”.

    From my IAIDQ perspective

    I need to be careful when writing this bit that my words aren’t read as being the definitive IAIDQ position here. This is not an IAIDQ website, it is my personal blog. However, as someone who has been working for the past few years to develop an Association of professionals in the IDQ/IQ/DQ discipline, that role affects my reaction to the “DQ2.0” phrase.

    The fact that we are talking about DQ2.0 indicates that the profession is maturing and we are slowly and steadily creeping into the “mainstream” of thinking. Graham Rhind is correct to point out in his response to Henrik’s post that there are different levels of maturity out there. However, this is true of all professions and represents an opportunity for those of us (practitioners, consultants, and professional organisations) to help the less mature climb the ladder.

    However, applying the label “DQ2.0” may not serve the profession or those who we as practitioners seek to help as it creates yet another potential silo and sub-division in the mindspace of people. As already discussed, many of the illustrations Henrik raises in support of a “DQ2.0” are simply elements of a level of information quality maturity, not a new “fork” of the profession or skillsets.

    However, labelling these issues as a “new” 2.0 version of Information Quality does a disservice to the range of knowledge areas required to be an effective Information/Data Quality professional. And ultimately, it distracts from the fundamental issue, which is the things that need to be done to improve and ensure the quality of data and information.

    It’s the stupid information quality (or words to that effect)

    You can reparse the heading of this section to get either a paraphrasing of Bill Clinton’s famous quote on the US economy or one of the common reasons for 84% of all ERP implementations failing to meet their objectives.

    And this is what it is all about… not whether we are dealing with Data Quality 0.1 or DQ2.0.

    Yes, use version numbers as milestone markers in an internal programme of work to evolve your organisation up the maturity ladder towards smoothly running Information Quality. But please don’t label the discipline in this way.

    My late grandfather was, amongst other things, a master carpenter and master plasterer. When he started his trades his tools were all hand powered. He did not think of his trade as “carpentry 2.0” the day he bought an electric drill. The fundamental principles of carpentry remained the same. When the trade moved from lath-and-horsehair plastering to gypsum drywall plasterboards, it didn’t change the profession to “Plastering 2.0”.

    The tools and new materials just meant he could do things a little faster, and perhaps a little cheaper. As a jobbing plasterer he also did work for big projects and smaller clients. Having good tools helped him meet their needs faster, but having proven skills and experience in the fundamentals of his trade meant he did a good job for those clients.

    Let’s not play buzzword bingo with the profession. Let’s focus on the fundamentals needed to do a good job and improve the quality of information for all information consumers.

  • IQ in the Real World (a leadership return)

    I recently had to spend some time engaging with an Irish Government agency as a result of my voluntary redundancy from my former employer. Now, while I’ll admit I am perhaps over sensitive to information quality issues, having had a lot of experience with them and having written about them a lot over the years, I do find that I am also a magnet for these things.

    So I was not surprised to learn that, according to the Irish government’s computer, my wife was married to me but I was not married to my wife. The Computer Says No.

    While this took only a second for the very nice and personable civil servant to correct, it does beg these questions:

    1. How was one part of the relationship between my wife and me populated but the other wasn’t? (What process failed?)
    2. How was that incomplete relationship not identified? (What checks are performed on the quality/completeness/consistency of information in the Irish Civil Service?)
    3. What downstream systems might have been making incorrect decisions based on that broken relationship? (What processes might fail?)
    4. How far might that error have propagated?

    For example, if my wife died (heaven forbid) would I have had difficulty in claiming a widower’s pension because while the computer says she is my wife, it doesn’t say that I’m her husband?

    I was surprised to hear the civil servant complain then about the quality of the information and how it made life difficult. I was doubly surprised when he told me he’d been trying to explain to his boss about how if you set up a database correctly it can help prevent errors.

    Unfortunately, he works in the real world, in the Civil Service. Having had experience with civil service type cultures in the past, my fear is that the enthusiasm that that young civil servant showed for finding and fixing errors and trying to understand the root causes of the problems and how to prevent them will be ground down by management attitudes of “that’s above your pay grade”.

    And so we return to the theme of leadership versus management in the context of information quality. To achieve quality you need to foster a culture where even the lowest member of staff can make suggestions for improvement and can be empowered to lead on their implementation or to find out more about how the problem can be solved.  Waiting for inspiration to strike from on high and trickle down often leaves the crud problems backing up in the process pipelines as the 2 minutes to fix becomes 10 minutes, or (even worse) becomes “oh, I’m not paid to do that”.

    Environments which rigidly enforce and demand respect for the “chain of command” often only find their bottom up leaders during a significant crisis. Think “battlefield promotion” in the context of military matters and you have the closest parallel I can think of (at the moment). Until then, they promote on seniority rather than merit (“Hey Bob, you’re still not dead, so here’s a promotion”) and newer staff members who have ideas that are going in the direction of a solution often get tagged as the “squeaky wheel”.

    However, even in those type of environments, it is possible for the squeaky wheel to have some influence on the thinking of management. It just takes time and perseverance and not a small amount of pure unadulterated pig headed self belief to keep on pushing the question. Eventually the squeaky wheel gets a little oil and, with every win, the squeaky wheel helps the business move smoother and has to squeak less.

    To the young civil servant who corrected that small error on a government file….. Well done. Thank you for your focus on the customer, your sense of humour about the issue, and your insight into some of the fundamental issues in Information Quality. I doubt you will read this, but if you do, join the IAIDQ, where you can learn from other squeaky wheels how to get the oil you need. By being part of a community populated by people who’ve been there and done that, you’ll get the support you need to be pig-headed about the need to tackle processes, system design and simple governance to ensure the quality of information in key functions of your organisation.

    Quality is not job one. Meeting or exceeding the expectations of your customers is job one.  Or to put it another way…

    Quality is not Job One (from http://gapingvoid.com)
  • #BGas – Bord Gais loses 75,000 customer records

    The Bord Gais story

    First off, I am a Bord Gais (Irish Gas Board, now also an electricity supplier) customer. I switched to them earlier this year to save money. I provided personal details about myself and my wife along with details of the bank account our bills get paid out of. So, my wife and I are almost certainly included in the 75,000 people who have recently heard about how four laptops were stolen from the Bord Gais HQ two weeks ago, one of which had our personal data on it in an unencrypted form.

    Oh… we are assured it was password protected. Forgive me if I don’t feel the love about that assurance. Passwords were made to be broken, and in my experience they are often not very strong. (“P@ssword”).

    Everything reported in the media thus far suggests to me that this incident stems from yet another chronic failure to recognise the value of the “Information Asset” and treat it with the care and respect that it deserves.

    What do we know?

    • The laptops were stolen in a burglary.

    Unless the burglars had ample time to wander around the headquarters of a blue chip company rifling presses looking for laptops, it would seem to me that the laptops were left on desks unsecured. A basic practice for the physical security of laptops is to either lock them away or take them home with you and secure them there. Leaving them sitting on your desk invites larceny.

    • This laptop ‘fell through the cracks’ for installing encryption software

    OK. Mistakes can happen. However, a simple check for the existence of encryption software is an obvious preventative control that would have stopped the unencrypted laptop from being put out into use. Of course, just because there is encryption software on a laptop doesn’t mean that the user will actually encrypt their files in all cases.

    Reliance on policy and technology without ensuring control, culture and people changes are implemented as well (such as changing work practices or giving the lowest techie the right to tell the CEO to bugger off if he wants his laptop before it is encrypted) invites a false and unwarranted sense of security.

    Also, I am aware of one large company which has rolled out encryption on laptops, but only to senior management and primarily to protect documents relating to management strategy. The fact that the proletariat knowledge worker with a laptop can have spreadsheets a-plenty chock full of personal data doesn’t seem to have registered. They are protecting the wrong asset.

    • The file was password protected

    OK. Two points here… is it the file or the operating system? How secure is the password? If the password is on the file might the password be stored in a text file on the laptop, or in an email, or on a post-it note stuck to the lid?

    Even if the spreadsheet (and inevitably it will be a spreadsheet) is password protected, there are a number of free utilities for recovering passwords on Microsoft Office documents. It took me all of 15 seconds to find some on Google.

    MS Access is a little trickier, but where there is a will (and a basic knowledge of Access) there is a way.

    When it comes to securing personal data, passwords should be seen as the last (and weakest) line of defence. Passwords, like promises, are all too easy to break.

    • The break in happened 2 weeks ago

    So, what we know from the media is that the thieves (or the people who eventually wound up with the laptops) have had two weeks to do the Google searches I’ve done to find the tools necessary to crack a password on a file.

    They’ve had two weeks to go to market with their asset to see what price they can get. They’ve had two weeks to start applying for loans or credit cards.

    What I know from the media now is that Bord Gais is more concerned with the Regulator and the Data Protection Commissioner than they are with their customers.

    What I don’t yet know from the media

    • What the fricking hell was my data doing on a laptop?

    OK,  so I’ll accept that there can be reasons for data to be taken onto laptops or local PCs from time to time (migrations, data profiling, reporting, remediation of compliance issues etc.).

    But ALL the records and ALL the fields in those records? That’s just ridiculous.

    And was that purpose consistent with the purposes for which I provided the data in the first place?

    Having ALL the eggs in one unsecured basket invites loss and security breaches.

    • Was the laptop securely stored or locked in any physical way?

    I have to assume no on this one, but who knows… the thieves may just have been very lucky that the first four presses they broke open happened to have laptops in them.

    No amount of software security or business practice will prevent a theft if the actual physical security of the asset is not assured. The asset in this case isn’t the laptop (worth no more than €600); the data is worth a whole lot more.

    75,000 records at around €2.00 a record is an easy €150,000.

    • Will Bord Gais compensate customers who suffer loss or damage through their negligence?

    OOOh. Negligence is a strong word. But leaving unencrypted, unsecured data (yes it is password protected but that’s not much comfort) lying around is negligent. If I suffer loss or injury (such as being liable for a debt I didn’t incur or having my credit rating trashed, or having my identity stolen) will Bord Gais compensate me (without me having to sue them first)? (more…)