Category: The Business of Information

  • GPS, Ambulances, and Data Protection–The CSI Effect

    Last week the Irish Times published an article that I can only describe as poorly researched.

    The gist of the article was that ambulance services were finding it difficult to get to the right addresses in time to save people because Data Protection rules don’t allow them to use GPS location of people’s phones.

Bullshit. Improperly researched. Section 8 of the Data Protection Acts 1988 and 2003 permits the processing and disclosure of data in exactly these circumstances, either under Section 8(d) (where there is a risk to the life or well-being of a data subject) or under Section 8(f). This is a fact so obscured by “The Man” that it can only be uncovered in one of five ways:

    1. Looking up the text of the Data Protection Acts on the DPC website (www.dataprotection.ie)
    2. Putting in a query to the Data Protection Commissioner’s Press Office.
    3. Contacting one of the various Data Protection consultants and trainers who frequent social media (*koff*)
    4. Asking a lawyer with Data Protection expertise
5. Contacting Digital Rights Ireland

    At no time would the journalist have to venture into a multistorey carpark to meet a shadowy figure to uncover the truth.

SI 336 of 2011 (the ePrivacy Regulations) introduced some requirements around consent for the use of location data, but those apply in the context of processing for commercial purposes and would, in practice, be superseded by the Section 8 exemption.

    The Science Bit

    That’s the law bit. Now the science bit. Get ready…

    Life is not like CSI.

Everyone’s experience of GPS is that you have a device in your hand and the device knows where you are. That’s because the device is listening to GPS satellites (and, in the case of a mobile phone, using whatever cell towers are in range and potentially any wifi networks you are connected to) to triangulate your location.

But that takes place ON YOUR DEVICE. The telcos don’t actually have that data (though Apple would if you have an iPhone, and Google does if you run Android). Telcos would need to have an app installed on the handset to access that functionality of the phone and relay it to their data centres.

They can, however, extrapolate your location based on your nearest cell towers when you make or receive calls or texts. But that data is NOT PROCESSED IN REAL TIME, other than at a very low level in the network. Accessing it for law enforcement purposes can be a laborious task taking a number of days. Accessing it for the telco’s own uses happens in batch mode as well. Also, it is NOT ACCURATE – it can be as imprecise as a 5km radius in rural areas (do your sums on that to find the total area) and sometimes it is way off (I’ve been billed for data usage on a device that was in the UK but was not registered as having been roaming). So you stand a reasonable chance of getting the right county, but not always.
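To do those sums (a quick back-of-envelope sketch, nothing more):

```python
import math

# Back-of-envelope: the search area implied by a 5 km location uncertainty radius.
radius_km = 5
area_km2 = math.pi * radius_km ** 2
print(f"A {radius_km} km radius covers roughly {area_km2:.0f} square km")
# => roughly 79 square km in which to find one caller
```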

Yes, location-based targeting of messages can occur, but that happens by the telco setting up a geo-fence based on an area around one or more cell towers and then pinging your details when your phone becomes active within the cell (a basic network-level activity). They are waiting for you based on defined criteria. And it is not real-time monitoring. They can’t easily identify where you are at a point in time. They can identify that you have come into a defined area they are monitoring.
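As a rough illustration of that distinction (a minimal sketch, not how any telco’s network actually implements it): the geo-fence is a standing, event-driven rule, not a live query of where a handset is.

```python
# Hypothetical sketch: a geo-fence defined as a set of cell IDs.
# The check only fires when the network sees activity from a handset on one
# of those cells (a call, a text, a re-registration) - event-driven, not tracking.
GEO_FENCE_CELLS = {"cell_1234", "cell_1235"}  # cells covering the target area

def queue_location_triggered_message(msisdn: str) -> None:
    print(f"{msisdn} has entered the monitored area - queue the targeted message")

def on_network_activity(msisdn: str, cell_id: str) -> None:
    """Called when a handset becomes active on a cell."""
    if cell_id in GEO_FENCE_CELLS:
        queue_location_triggered_message(msisdn)

on_network_activity("+35387123xxxx", "cell_1234")
```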

    Real-time pinging of mobile devices based on cell tower locations is technically possible but it isn’t easily done and can require a lot of resources.

    If the Ambulance services wanted GPS data there is an easy way to get it. Develop a smartphone app for calling the emergency services that will relay device information and GPS co-ordinates and perhaps other data (like photos of the injured party or accident scene). Just like calling a Hailo cab. Of course that will only work for the people with a smartphone who have the app, who remember they have it, and who know how to use it.
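Purely by way of illustration (a hypothetical sketch of the kind of payload such a “999 app” might relay alongside the call – not a real emergency-services API):

```python
import json
from datetime import datetime, timezone

def build_emergency_payload(lat: float, lon: float, accuracy_m: float,
                            caller_msisdn: str) -> str:
    """Assemble the data a hypothetical emergency-call app could relay:
    the device's own GPS fix, its accuracy, and a timestamp."""
    payload = {
        "msisdn": caller_msisdn,
        "latitude": lat,
        "longitude": lon,
        "accuracy_metres": accuracy_m,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(payload)

# Example: a fix taken from the handset's own location services
print(build_emergency_payload(53.3498, -6.2603, 12.0, "+35387123xxxx"))
```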

    But please don’t make up stuff about it being the Data Protection Act’s fault.

    And do bear in mind that not every phone in the country is a smartphone with GPS capability. Landlines still exist and are used, and basic mobile phones lack GPS functionality.

    More Science

Apart from “Hailo for Ambulances” (which I think I’ll have to go and patent really quickly), postcodes (promised since 2008) would be a better solution to avoid ambulances being sent to the wrong places. Or using MPRNs from ESB meters in buildings where available, which allow addresses to be resolved reasonably accurately from the ESB Network.

Each of these has data protection issues, but not in the context of emergency services. Section 8 of the Data Protection Acts takes care of that.

    More Journalism

So, Irish Times, can we please stop reprinting press releases without fact checking? 30 seconds. That’s all it would take. A call, an email, a DM on Twitter…

And it would avoid you being st00ged into printing utter nonsense about Data Protection – nonsense which you WILL be called on publicly by the very people who could have told you to tread carefully if you’d asked.

  • Compliance, Culture, and Tone at the Top

    The Data Protection Commissioner has just published his annual report. It makes (as always) interesting reading. It has only been released in the last 30 minutes but there are elements of it that I will return to in detail in a post on my company website later this week (once digested).

    Over here on my personal blog I thought I’d pick up on a broader question that bubbles up frequently in Data Governance and Compliance, not least in Data Protection. That is the importance of “tone at the top”.

The DPC’s Annual Report cites a number of breaches by way of case studies and a report on an audit follow-up. The audit of An Garda Siochana’s PULSE system is mentioned in dispatches. Which brings me to the troubling topic of this post.

The Minister for Justice and Defence has disclosed into the public domain information about another person (a Data Subject) relating to an alleged (and disputed) “stop and caution”, which came into his possession, one must assume, in the course of his ministerial functions from some as-yet-unconfirmed source. The Minister sees nothing wrong with disclosing personal data, and in this case potentially sensitive personal data, for his own purposes. He has stated in his defence that he felt the disclosure was in the “Public Interest”. His Taoiseach has backed him in his actions.

It brings to mind the argument put forward by Nixon when challenged by David Frost about the legality of certain actions: that if the President does it, it is not illegal. Just because you can do something doesn’t mean you should – this is the long-repeated mantra of Data Protection practitioners worldwide.

    To extrapolate a little: a senior member of the executive management of an organisation has disclosed publicly information about another person that has come into their possession through the course of their professional activities. The disclosure is without a clear lawful purpose but the manager feels it is in the Public Interest to know the kind of person they are dealing with (“Public Interest” and “Of Interest to the Public” are two different concepts). The manager sees nothing wrong with this. His CEO sees nothing wrong with this and backs the manager.

    If this was a private organisation the DPC would be investigating and the executive and CEO would potentially be facing personal liability under section 29 for their consent and connivance in the commission of an offence under the Data Protection Acts.

When the Minister for Justice and Defence, under whose Department the Data Protection Acts reside, cannot recognise that the political win that comes from dropping the other guy in it, because you have information about them that others don’t, runs into conflict with fundamental rights to Personal Data Privacy and with the Data Protection Acts themselves, the tone at the top is resonating with bum notes.

    When the Taoiseach sees no problem with this, the bum notes become cacophonic.

If the Minister is to argue that business owners and public servants should respect the law, then his recent actions inject a diminished minor note into the fanfare he should have around Data Protection, what with him being the Minister in Europe currently charged with shepherding the revised Data Protection Regulation to a final text.

Why should an SME owner or CEO of a large corporate challenged with respecting the Data Protection Acts now seek to act in compliance with that legislation? The Minister can flout it, why not them? Why should a young garda officer on the beat, struggling to make the mortgage payment that month, respect the Data Protection Acts and their code of conduct under the Garda Siochana Act when offered money by external entities for information, when the Minister can unthinkingly ignore the ethics and letter of the legislation in pursuit of political point scoring?

    Over two years ago I wrote about the same issues arising in a political context. One key quote sticks out in the context of the current situation with regard to political leaders:

    If they are promoting a “tough on regulation” policy platform, then they must lead with a clear “tone from the top” of Compliance and good Governance.

    The Office of the Data Protection Commissioner is intended to be independent and is required under the TFEU to be so. However they operate under the auspices of the Department of Justice and Defence. In such a structure there is a significant risk that the clarinet solo of the Commissioner (still grossly under resourced) will be drowned out by the cacophonic discord of the Tone at the Top.

    The DPC has commented today that:

    in general the public sector, including ministers, has a solemn duty to protect any personal data coming into its possession and may only disclose it when it has the consent of the individual concerned or under another basis laid down in law.

    Should Deputy Mick Wallace make, or have made, a complaint it would be interesting to see how the Government and the Commissioner’s office would act to avoid the kind of enforcement actions related to a lack of independence of the Regulatory Authority (the DPC) that have arisen in Hungary. Certainly it would be a matter worthy of mention in the 2013 Annual Report of the DPC. Bluntly it would represent a clear test of the independence of the Office of the Commissioner (and for that matter their resourcing) if they were to have to investigate a Minister rather than just a Minister’s department or agency.

The Data Protection Acts owe their genesis and evolution to the actions of political forces in Europe in the first half of the 20th century. We should be very worried when the Minister responsible for internal and external State Security feels that encroaching on a right to privacy without a clear lawful purpose is an acceptable political tool.

The Tone at the Top is braying some bum notes this week and some conductor needs to bring the orchestra back in tune. Otherwise, as with all great bands, musical differences may trigger the beginning of the end.

I had looked forward to the Minister’s statement today but am underwhelmed by his position that Deputy Wallace’s status as a public personality is a justification for the disclosure. Were this the case, the DPC audits of the Gardaí and the Dept of Social Protection would not have been so concerned about access to data relating to celebrities. The disconnect and discord in the “tone at the top” is palpable!

    (If the use of the phrase “tone at the top” in connection with Fine Gael and Data Protection is familiar to some, this post might refresh your memory)

  • Insolvency Register–some quick thoughts

So, David Hall is challenging the provisions of the Personal Insolvency Act regarding the publication of details on public registers. I’m quoted in this Irish Times article about it. My comments, which I expand on here as an update to my earlier post, were to the effect that:

    • The publication of detailed personal data on a publicly accessible register would invite the risk of identity theft in the absence of any appropriate controls over the access to that data.

Examples of public registers where controls are in place include the Electoral Register (search one name and address at a time), the Companies Registration Office (find out the home addresses of Directors if you pay a small admin fee), and the list of Revenue tax defaulters (publication only over a threshold, with only summary personal data published).

Public does not mean Open. Public means that the data can be accessed, subject to appropriate controls. The requirement to name people who are in an insolvency arrangement needs to be balanced against their right to personal data privacy and the risk of identity theft or fraud through the use of published personal data.

The mock-up Register entries presented on the ISI website may do the organisation a disservice with the level of data they suggest would be included, and I await the publication of further revisions and the implementation of a control mechanism to introduce balance between the requirement to publish a Register and the need to protect personal data privacy. But of course, Section 133 of the Personal Insolvency Act is silent as to what the actual content of the published Registers should be (at least as far as I can see). So there is scope for some haggling over what the final Registers will contain.

    A key question to be considered here is what is the purpose of the Registers and what is the minimum data that would be adequate and relevant to be provided on a Register to meet that purpose.

Section 133(4) allows the public to “inspect a Register at all reasonable times” and to take extracts or copies of entries, and even allows for a small fee to be charged (the “reasonable cost of making a copy”). So there is scope for some form of access control to be put in place, either with a search mechanism like the electoral register and/or the operation of a paywall for the making of copies (e.g. generating a PDF report on headed paper, at €1 a go).
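To make that concrete (a minimal sketch of the kind of control I have in mind – illustrative only, not anything the ISI has proposed): an exact-match search that returns one summary entry at a time, with a fee step before a copy is generated.

```python
# Hypothetical sketch of a controlled register search: exact-match lookup,
# one summary record per query, and a fee charged before a copy is issued.
REGISTER = {
    ("Jane", "Example", "Sampletown"): {"arrangement": "DSA", "commenced": "2013-06-01"},
}

COPY_FEE_EUR = 1.00  # the "reasonable cost of making a copy"

def search_register(first_name: str, surname: str, town: str):
    """Return a single summary entry for an exact name/town match, or None."""
    return REGISTER.get((first_name, surname, town))

def issue_copy(entry: dict, fee_paid: float) -> str:
    if fee_paid < COPY_FEE_EUR:
        raise ValueError("Copy fee not paid")
    return f"EXTRACT FROM REGISTER: {entry}"

print(issue_copy(search_register("Jane", "Example", "Sampletown"), 1.00))
```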

    • Section 186 of the Personal Insolvency Act needs to be interpreted and applied with care.

    Section 186 of the Personal Insolvency Act purports to suspend the operation of Section 4 of the Data Protection Acts in certain circumstances. This is the section which allows a Data Subject to request a copy of their personal data. This is a basic right under the Acts.

However, the Data Protection Acts already contain provisions, in Section 5, which allow for the suspension of Section 4. Specifically, Section 5(1)(d) allows for an exclusion for data which is being processed in the performance of a statutory function intended

    …to protect members of the public against financial loss occasioned by

    i) dishonesty, incompetence, or malpractice on the part of persons concerned in the provision of banking, insurance, investment or other financial services or in the management of companies or similar organisations

    ii) the conduct of persons who have at any time been adjudicated bankrupt

    in any case where the application of that section would be likely to prejudice the proper performance of any of those functions.

    The operation of the Insolvency Service of Ireland would appear to fall under this section. But rather than a blanket exclusion, Section 5 has a more nuanced approach – you can’t have your data if it will prejudice the proper performance of the ISI’s role. Of course, 5(1)(d) only kicks in if there has been dishonesty, incompetence, or malpractice on the part of a bank that has resulted in a financial loss or risk of financial loss to the Data Subject.

    Section 5 gives a number of other grounds for exclusion from the operation of Section 4. Among them are:

• If disclosing the data is contrary to the interests of protecting the international relations of the State (which would raise an eyebrow, I’m sure, if cited in an insolvency situation).
    • If legal privilege attaches to the records in the case of communications between clients and legal advisers.

If the restriction is on disclosure of personal data during the course of an investigation then this would likely be covered under Section 5(1)(a), and there is legislative precedent in the Property Services (Regulation) Act 2011 to extend that to an investigation undertaken by the PSRA under that Act.

    An explanation and clarification?

The ISI has similar powers of investigation and prosecution of offences (Section 180 and Chapter 5 of the Personal Insolvency Act 2012). Therefore the exemption from disclosure under Section 5(1)(a) would apply. A “belt and braces” inclusion of an exemption from Section 4 of the DPA for the investigation of offences would be consistent with the Acts.

    However this would only be the case for the investigation of an offence. The processing of a general complaint would not fall within the scope of an offence under the Insolvency Act or other legislation.

Therefore a blanket opt-out would not exist. If an offence is suspected, Section 186 reinforces the existing provisions of the Data Protection Acts. But general complaints to the Complaints Committee would (based on my reading) not be covered, unless the complaint resulted in an offence being detected. Of course, a Data Subject would only be entitled to their own data.

    A recent case involving the DPC and Dublin Bus made it clear that the potential for civil proceedings or a complaint were not grounds to refuse a Subject Access Request.

    • Excessive Retention of Data on Public Registers is a concern.

This, of course, is another biggie from a Data Protection point of view. How long does this data need to be held for? In the UK similar schemes have the personal data removed from the public register 3 months after the debtor exits the scheme. Here…

    Section 170 of the Personal Insolvency Act indicates that Personal Insolvency Practitioners will need to retain data for 6 years after the “completion of the activity to which the record relates”. This is consistent with the statute of limitations on a debt and makes sense – it would allow people who avail of an Arrangement to get access to information about their arrangement if required. However it is not the same as the Public Registers.

    Section 133 sets out the provisions relating to the Registers of Insolvency Arrangements. It says nothing about the length of time a person’s data will be listed on a Register. Given the purpose is to maintain a searchable register of people who are in Insolvency Arrangements, the principle of not retaining data for longer than it is required for a stated purpose kicks in.

And, as is all too often the case in Irish legislation, we seem to be left looking to the UK for a benchmark retention period: duration of Arrangement plus 3 months… but that may be 3 months longer than required.
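Expressed as a rule of thumb (a sketch only, assuming the UK-style benchmark just mentioned):

```python
from datetime import date, timedelta

def register_removal_date(arrangement_end: date, grace_months: int = 3) -> date:
    """Hypothetical retention rule: remove the entry from the public register
    a fixed grace period after the debtor exits the arrangement
    (the UK-style 'duration of arrangement plus 3 months' benchmark)."""
    return arrangement_end + timedelta(days=30 * grace_months)  # approximate months

# Example: an arrangement completed on 1 June 2016 would drop off the
# register at the end of August 2016 under this rule.
print(register_removal_date(date(2016, 6, 1)))
```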

• Personal Insolvency Practitioners acting as Data Processors, and the implications for security and awareness of obligations under the Data Protection Acts

This is a squeaky wheel issue in many respects. Organisations will often outsource functions, or engage people on contract to perform functions on their behalf, under contracts which set out the purposes of the processing, the role of the Processor, and sanctions for breaching their obligations. The Personal Insolvency Act sets out how Personal Insolvency Practitioners will be appointed, empowers the ISI to set standards re: their level of education and skill, and imposes sanctions for breaches of the standards of conduct of the role.

    The function of a PIP is one which could have been undertaken internally within the ISI but it has been decided to outsource it to these PIPs.

    Therefore a PIP is likely to be viewed as a Data Processor acting on behalf of the Data Controller (ISI) [for more on this read here]. Therefore they need to be taking (at a minimum) appropriate security measures to prevent unauthorised access to data. The concern I expressed in the article was that it is an unknown quantity what level of understanding of their obligations under the Data Protection Acts a PIP will have and what training (if any) will be provided.

Section 161(c) of the Personal Insolvency Act 2012 provides a mechanism for this to be addressed, by prescribing the completion of appropriate Data Protection training from a qualified trainer as one of the requirements for authorisation as a PIP.

    [Disclosure: my company provides an extensive range of Data Protection compliance review and training services]

  • Trust us. We’re the Government

Coverage of some of the structures of the Insolvency Service of Ireland has been rattling through my ears while I work over the past few days. What I’ve heard gives rise to an unsettling feeling that the architects of the scheme have decided that the insolvent are a form of unter-mensch for whom some of the fundamental rights that EU citizens enjoy are either put on hold or entirely foregone.

    Data protection is a fundamental right in Europe, enshrined in Article 8 of the Charter of Fundamental Rights of the European Union, as well as in Article 16(1) of the Treaty on the Functioning of the European Union (TFEU). As a fundamental right, according to the EU Commission it “needs to be protected accordingly”.

    Some of what I have heard I can only hope is half-informed speculation, but I fear it may be grounded in reality.

1. Publication of personal data including name, address, and date of birth on a public register of insolvents. This is problematic as it creates a risk of identity theft in my view. Also – what is the purpose for which this data is being published? How could the same objective be met without putting personal data privacy at risk of unauthorised access? How is this compatible with s2(d) of the Data Protection Acts, which requires appropriate measures to be taken to keep data safe and secure?
    2. Retention of data on the register after a scheme has been exited. It is rumoured that the details of people listed on the register mentioned above would have their details retained indefinitely. Why? How is this compatible with the requirement under the Data Protection Acts (and the underlying Directive) to retain data no longer than is necessary for the purpose? How would it be compatible with the requirement under the proposed General Regulation for Data Protection to give citizens of the EU a “Right to be forgotten”? What is the function/purpose of retaining information once the agreed scheme has been completed?
3. Section 186 of the legislation purports to exempt the Agency from Section 4 of the Data Protection Acts. This is the section that allows individuals to get copies of information held about them by Data Controllers. It is a right that is derived from Directive 95/46/EC. While there are grounds under Article 13 of the Directive for a member state to limit subject access requests where they impact the economic or financial interests of the State, I’m at a loss to see how a response to a Subject Access Request for a single person or class of people might impact our economic and financial interests as a State. The test is that the restriction must be necessary, not merely nice to have. Of course, if things are so precarious that a Subject Access Request will tip the economy into a death spiral, then perhaps the Irish people should be told this.


    There is a significant imbalance in rights and duties emerging here. Particularly when compared with the secrecy of NAMA and the closeness with which the privacy of significant contributors to the exuberance of the Boom times has been guarded by that Agency. There is also a suggestion that Data Protection rights are optional extras that can be mortgaged as part of entering the process.

I really do hope I’m wrong about all of this, that it is not the data black hole it appears to be, and that personal data privacy will continue to be respected as a fundamental right. After all, when you’ve lost everything else, things like that can be very important.

  • Wrong thinking about Devices

    I’m addicted to the think. Every day, when not thoroughly occupied with the challenges of a client strategy or issue, I find myself drawn to hard thinking. Sometimes I even get people plying me with think.

    Like this past few weeks. Lots of think.

    One thing I’ve been asked to think about is the whole area of Bring Your Own Device, colloquially known as “BYOD”. I understand that this emerged as a term because people hoped that enterprise technology management would be a lot like a college house party. You’d bring a bottle and go home with two bottles of something better than you went with. Which in tech terms might be going with an Android JellyBean device and coming home with an iPhone and a Windows 8 slate.

    But everyone is wrong. The focus is wrong. Because we have in effect focussed on the size, colour, shape, and label of the bottles in our BYOD/BYOB thinking. In doing so we’ve missed the importance of what is in those bottles. Which is important if you find out that you’ve arrived home from your party with two bottles of water when you had been expecting vodka.

    From a process and governance perspective what we are actually dealing with is a classically simple issue that has just been obscured because:

1. In the old days the company gave you your bottle and you were damn glad to have one (i.e. they provided the technology you used to do things)
2. We entered the hooplah hype cycle at the time when everyone was jumping up and down like 5-year-olds on Christmas morning who find Santa has left them a bike – “YAY!!!! TOYS!!!!”

    What we are actually dealing with is a problem not of how to allow people to use their devices but rather a problem of how to give people access to resources in a secure and controlled manner when we don’t own the bottles any more. This requires organisations to do some thinking. What can be done to ensure that people are given access to resources in the right way?

    Some thoughts spring to mind:

1. Define standards for the bottle (the device) you will let people bring to the barrel to be filled with yummy data/booze. Provide data in 1 litre chunks, or require 32GB capacity and perhaps limit the OS versions you’ll allow (a minimal sketch of this kind of policy check follows after this list).
2. Put a bottle in their pocket: implement a standard workspace that sits on the device, the parameters of which you can control.
3. Sell them the bottles (i.e. stick with only allowing approved company-issue devices).
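By way of illustration (invented names and thresholds, not a recommendation of specific values), option 1 might boil down to something like this:

```python
# Hypothetical device-standards check for a "bring your own bottle" policy:
# the organisation defines minimum specs and allowed OS versions before a
# personal device is granted access to company resources.
MIN_STORAGE_GB = 32
ALLOWED_OS = {"android": (4, 1), "ios": (6, 0)}  # illustrative minimum versions

def device_meets_policy(os_name: str, os_version: tuple, storage_gb: int,
                        supports_encryption: bool) -> bool:
    minimum = ALLOWED_OS.get(os_name.lower())
    if minimum is None:
        return False  # platform not on the approved list
    return (os_version >= minimum
            and storage_gb >= MIN_STORAGE_GB
            and supports_encryption)

print(device_meets_policy("android", (4, 1), 32, True))   # True
print(device_meets_policy("ios", (5, 1), 64, True))       # False - OS too old
```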

    Of course, the world is a complicated place so when people start using their own device for work purposes it means there is a risk that the red wine you are giving them for work will be mixed with the white wine of their private personal world. That means the practice of giving them a bottle that is marked “WORK” would be sensible.

By reframing the thinking away from the fact that they are bringing a device to the party, and instead looking at how access to data, applications, and other resources will be provided to n variants of platform, the organisation can begin to think strategically without getting bogged down in detail.

    It also gives a great branding opportunity for the strategy. This is a strategy for GIVING ACCESS TO OUR RESOURCES. Abbreviated it is a GATOR Strategy.

    So, does your organisation have a GATOR strategy yet? If not, you should really get one. And make it snappy.

  • Heel Pricks. A short thought

    Yes. It is a pity that Guthrie cards will be destroyed. Yes, there is potentially valuable data held on them. But there is also a fundamental right to Personal Data Privacy under EU Treaties and there is that pesky thing called the Data Protection Acts/Data Protection Directive.

    The DPC investigated the issue of heel prick cards. They negotiated with the HSE to determine a “best fit” solution that struck an uneasy and far from ideal balance between the desire to have a genetic databank and the need to have specific explicit informed consent for the processing of sensitive personal data in that way.

    Comments today from Minister Kathleen Lynch that this needs to be looked at again and efforts are underway to prevent the destruction are baffling. “Efforts are underway”? So the Department is actively working to undermine the role and independence of the DPC? Is new legislation being prepared with retrospective effect that will be passed by the end of next week? Is data being anonymised (tricky with genetic data)? Is the HSE going to do a big push to get people to request the cards relating to them and/or their children from the HSE?

    What needs to be looked at in my view is the culture and ethos around managing personal data that pervades in some areas of political and civil society. For that is where the root and origin of this dismal scenario lies. (A scenario, as an aside, that has faced private sector organisations with their customer databases on a number of occasions: not obtained lawfully, not obtained for that purpose, destroy it.)

    The reason the issue arises with the heel prick tests is that consent was obtained for the processing of blood samples for a very specific purpose – testing for metabolic disorders in neonatal contexts. The consent obtained was for that purpose. No other. Sensitive personal data must be processed on the basis of specific, explicit informed consent. There appears to have been no plan for maintaining the data associated with those samples or for managing the process of obtaining consent for future purposes (or enacting legislation to allow for future purposes without requiring consent). There appears to have been an assumption that these samples could be retained ad infinitum and used for purposes undisclosed, unimagined, or unavailable at the time the samples were originally taken. This was, and is, not the case under Data Protection law.

    As an Information Quality practitioner, I am bemused by the optimism that is expressed that the heel prick data would be useable in all cases. What processes are in place to link the data on the Guthrie card to an identifiable individual? Do those processes take account of the person moving house, their parents marrying, divorcing, remarrying (and the name changes that ensue), or the family emigrating? If the Information Governance in the HSE is such that this is rock solid data then great. I’m running a conference and want good case studies… call me!

    The quality of information angle is important as it raises a second Data Protection headache – adequacy of information. If the information associated with the actual blood tests is not accurate, up to date, and adequate then a further two principles of the Data Protection Acts come into play.

Yes, the destruction of Guthrie cards is a problem (but as Ireland has been doing Guthrie tests since 1966, it has happened before). Yes, it is an unsatisfactory situation (but one that appears unavoidable given the legal situation). But the root cause is not the Data Protection Acts or the DPC. The root cause is a failure in how we (as a society) think about information and its life cycle, particularly in Government and Public sector organisations. A root cause is a failure of governance and government to understand the legal, ethical, and practical trade-offs that are required when processing personal data, particularly sensitive personal data. A root cause is the failure to anticipate the issues and identify potential solutions before a crisis.

    RTE reports that the Minister describes the 12% awareness level of the right to have cards returned to families rather than destroyed as “telling”. But what does it tell us? Does it tell us people don’t care? Or does it tell us that the HSE awareness campaign was ineffective? I would go with the latter. Frankly the lack of information has been stunning and, as always in Irish life, there is now a moral panic in the fortnight before the deadline. And again, the governance of how we communicate about information and information rights is called into question here.

    I haven’t seen any data on how often the Guthrie card data was being used for research purposes. I’m sure some exists somewhere. Those arguing for the records to be saved should go beyond anecdote and rhetoric and present some evidence of just how useful this resource has been. We need to move beyond sound-bite and get down to some evidence based data science and evidence driven policy making.

Storing the samples takes physical and economic resources, two things in short supply in the HSE. Storing them ad infinitum without purpose, “just in case”, creates legal issues. Legally, the purpose for which the samples were originally taken has expired. By giving families the option of having the cards returned to them, the HSE creates the opportunity for specific informed consent to future testing, while removing the other data protection compliance duties for those records from themselves.

    The choice is not an easy one but the Data Protection mantra is “just because you can doesn’t mean you should”. And just because you have to doesn’t mean it is easy or without pain. But by clearly drawing a line in the sand between non-compliant and compliant practices the HSE avoids the risk of future processing being challenged either to the DPC or the ECJ (after all, this is a fundamental human right to data privacy we are dealing with).

“Hard cases make bad law”, as the old saying goes. However, the corollary is that often good laws lead to hard cases where society needs to accept errors of the past, take short term pain, identify medium and long term solutions, and move on in a compliant and valid manner.

    Rather than weeping and gnashing teeth over a decision that is done and past it would behove the Minister and our elected representatives more to focus their efforts on ensuring that the correct governance structures, mind-sets, knowledge, training, and philosophy are developed and put in place to ensure we never find ourselves faced with an unsatisfactory choice arising from a failure to govern an information asset.

  • The sound of one bell clapping

Twitter is great. I found myself this evening discussing the psychology of alarms with Rob Karel of Informatica. He had tweeted that a car alarm outside his office had been going off for an hour but his brain had filtered it out. This is not an uncommon reaction to bells and alarms and is the reason why I have a monitored alarm system in my home, a fact I will return to later.

    Our neuropsychological response to alarms is pretty much the same as our response to any alert to risk. It is influenced by the same basic flaws in information processing in the limbic system of the brain, our “lizard brain”. If the danger is not one we are familiar with and it is not immediate we discount it to the point of ignoring it.

    An alarm going off is an alert that something is happening somewhere else to someone or something else. Without a hook to make it personal it is just noise and it fades into the background. In the absence of a direct effect on us we tune out the distraction so our lizard brain can focus on other immediate risks – to us. An alarm = someone else’s stuff at risk.

This is why a measure of data quality needs to have an action plan associated with it, so that the people in the organisation can tie the metric to a real effect and put a clear response plan into action. Just as when a fire alarm goes off we know to go to the nearest exit and leave belongings behind, or just as we know that if an oxygen mask drops in front of us on a plane we should tug hard and take care of our own mask first.

    There is an alarm stimulus. There is a planned response that makes it personal to us. Alarm, something must be done, this is a thing, let’s do it.
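To sketch the point (invented metric names and thresholds, purely illustrative): the metric only earns its keep if crossing a threshold maps to a named owner and a pre-agreed response.

```python
# Hypothetical: a data quality metric tied to a pre-agreed response plan,
# so the "alarm" always arrives with an action and an owner attached.
RESPONSE_PLANS = {
    "customer_address_completeness": {
        "threshold": 0.95,
        "owner": "Customer Data Steward",
        "action": "Quarantine incomplete records and schedule a re-validation batch",
    },
}

def check_metric(name: str, measured_value: float) -> None:
    plan = RESPONSE_PLANS[name]
    if measured_value < plan["threshold"]:
        print(f"ALARM: {name} at {measured_value:.0%} "
              f"(threshold {plan['threshold']:.0%})")
        print(f"  Owner:  {plan['owner']}")
        print(f"  Action: {plan['action']}")

check_metric("customer_address_completeness", 0.91)
```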

But often Information Quality scorecards are left hanging. The measure of success is the success of measurement. Just as the measure of home security is often whether you have a house alarm. But a ringing alarm that calls nobody to action serves no purpose.

My home has a monitored alarm. If one sensor is triggered I get a phone call to check on me and alert me. If a perimeter sensor and an internal sensor are triggered together, I get a call to let me know that there are police en route. Each time, the alarm is responded to by a stranger with a planned response. My role is to cry halt at any time, gather data about the incident (was there someone calling to the house who forgot the alarm code? Is there a key holder on the way?), and generally coordinate the roll-out of the plan.

    What can we learn from this for how we design DG and IQ strategies? What is your planned response to an alarm bell ringing in your data?

  • Striking a balance in Data Protection Sanctions

    It was reported yesterday that the Irish Government has issued a “discussion paper” on the proposed administrative sanctions under the new Data Protection Regulation.

    EDRI has criticised the proposals with reference to the “warning/dialogue/enforcement” approach taken by the Irish DPC. Billy Hawkes has, in the past, been at pains to clarify that the Irish DPC uses dialogue to encourage compliance and also seeks to encourage organisations to raise questions and issues with the DPC to avoid breaches. There is a belief that the “brand impact” of even being spoken to by the DPC about an issue can prompt “road to Damascus” conversions in organisations.

That is all well and good, but my experience working with organisations is that this can result in management playing a game of “mental discounting” (I’ve written about this before, in response to the original draft DP Regulation). If there is a perception that the probability of an actual penalty is low, there is little leverage in appealing to the intrinsic motivation of a business manager when his extrinsic drivers for behaviour are pushing the decision towards a “suck it and see” approach.

Having re-read the discussion paper and EDRI’s response to it, I can’t help feeling that EDRI may be slightly over-stating the “ask” that is being made here. They cite it as the “destruction of the right to privacy”, citing the Irish DPC’s own experiences with the Garda PULSE system, which has been plagued by reports of Data Protection breaches since its introduction, despite the Gardaí having a statutory Code of Practice for Data Protection. In 2010 the DPC reported that that Code of Practice was not being implemented in the Gardaí.

However, this says as much to me about the attitude to Data Protection in some (but not all) parts of the Irish Public Service as it does about the merits of the Data Protection Commissioner’s approach to encouraging compliance or the specifics of anything that might be discussed on foot of this discussion paper. Furthermore, it raises questions for me about the capability and resources that the Data Protection Commissioner has to execute their function effectively in Ireland, and even suggests that there may be informal barriers to the effective operation of their function in the public sector which need to be urgently considered (given that the Office of the DPC is supposed to be independent).

    Given the extent of the negative findings in the interim report on the 2012 audit of the PULSE system I personally would hope that there would be some level of penalty for the Garda Siochana for failing to follow their own code of practice. But that is a different issue to what the Discussion paper actually raises.

    What is being discussed (and what would I like them to consider?)

    The Discussion Paper that was circulated invites Ministers at an Informal Council meeting to consider (amongst other things):

1. if wider provision should be made for warnings or reprimands, making fines optional or at least conditional upon a prior warning or reprimand;
2. if supervisory authorities should be permitted to take other mitigating factors, such as adherence to an approved code of conduct or a privacy seal or mark, into account when determining sanctions.

It flags the fact that the Regulation, as drafted, allows for no discretion in terms of the levying of a penalty. What is proposed here is a discussion of whether warnings, or making fines optional, would be the mechanism of choice rather than scaring the bejesus out of people with massive fines. This in itself doesn’t kill the right to Privacy, but it does potentially create the environment where the fundamental Right to Privacy will die, starved of the oxygen of effective enforcement.

Bluntly – when faced with a toothless framework of warnings and vague threats, businesses and public sector bodies will (and currently do) play a game of mental discounting where the bottom-line impact (in terms of making money or achieving a particular goal) outweighs the other needs and requirements of society. So an organisation may choose to obtain information unfairly, or process it for an undisclosed secondary purpose, because it will hit its target this quarter and the potential monetary impact won’t emerge for many more months or years, after an iterative cycle of warnings. The big penalty will be seen as something “far away” that can be worried about later. After everyone’s got their bonuses or their promotions etc.

If strict statutory liability is the model that is being proposed, and the discussion is to look at watering it down to a stern talking-to as a matter of formal policy in the Regulation, then I must despair of the wingnuts in my government who thought it would be a good idea to even suggest this. But I do agree that tying the hands of the Regulators to the big-ticket monetary penalties might not work in their interests or in the interests of encouraging compliance with the legislation.

What is needed is a middle ground: a mechanism whereby organisations can make errors of judgement and be warned, but where the warning carries some sanction with it. The sanction needs to be non-negotiable. But it needs to be transparent and obvious that this is what will happen if you ignore DP rules. It needs to be easily enforced and managed. There should be a right of appeal, but appealing the non-negotiable fixed penalty should carry with it the risk of greater penalties. And the ability of an organisation to benefit from iterative small penalties should be removed if they are a recidivist offender.

    There is a system that operates like this in most EU countries – it is the Penalty Points system for motoring offences. Hopefully the discussion will move to looking at how a similar system might be implemented for Data Protection offences. The penalties could be tiered (e.g. no cookies notification – €150 fine and 2 points on first offence, €500 and 4 points on second, failure to document processing €500 fine on first offence and 6 points). The points could be cumulative, with the “optionality” of higher sanctions being removed if you were, for example, an organisation with 100 points against you (congratulations, you’ve failed to up your game and now you are being prosecuted for the full tariff). Organisations bidding for public sector contracts could be required to have a “Data Protection Points” score below a certain level.

This system could be devised in a way that would take account of mitigating factors. If a code of practice was entered into, and was successfully audited against by an appropriate body, then points could be removed from the “scorecard” at the end of a 12-month period. If there were mitigating factors, a lower-level category of offence might actually apply (I’ll admit I’m not sure how that might work in practice and need to think it through myself a little). Perhaps self-notification to the DPC, engagement in codes of practice, mitigating factors or actions etc. would carry a “bonus points” element which could be used to offset the points total being carried by a Data Controller (e.g. “adopted code of practice and passed audit: minus 3 points; introduced training and has demonstrated improved staff knowledge: minus 3 points”).
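To show how the arithmetic of such a scheme might hang together (a sketch only, using the illustrative tariffs above plus invented mitigation values – not a worked proposal):

```python
# Hypothetical "Data Protection penalty points" scorecard.
# Offence tariffs loosely follow the illustrative figures in the post;
# mitigation values and the prosecution threshold are invented for the sketch.
OFFENCE_TARIFF = {
    "no_cookie_notice_first":  {"fine_eur": 150, "points": 2},
    "no_cookie_notice_repeat": {"fine_eur": 500, "points": 4},
    "undocumented_processing": {"fine_eur": 500, "points": 6},
}
MITIGATION = {
    "code_of_practice_audit_passed": -3,
    "staff_training_demonstrated":   -3,
}
PROSECUTION_THRESHOLD = 100  # points at which full-tariff sanctions kick in

class Scorecard:
    def __init__(self) -> None:
        self.points = 0
        self.fines_eur = 0

    def record_offence(self, offence: str) -> None:
        tariff = OFFENCE_TARIFF[offence]
        self.points += tariff["points"]
        self.fines_eur += tariff["fine_eur"]

    def apply_mitigation(self, measure: str) -> None:
        self.points = max(0, self.points + MITIGATION[measure])

    def escalate_to_full_sanctions(self) -> bool:
        return self.points >= PROSECUTION_THRESHOLD

card = Scorecard()
card.record_offence("no_cookie_notice_first")
card.record_offence("undocumented_processing")
card.apply_mitigation("staff_training_demonstrated")
print(card.points, card.fines_eur, card.escalate_to_full_sanctions())  # 5 650 False
```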

    Certain categories of breach might be exempt from mitigation, and certain categories of offence, just like with motoring offences, might be a permanent black mark on the organisation’s Data Protection record (e.g.: Failure to engage with DPC in an investigation, failing to take actions on foot of an audit/investigation).

    The scheme could be administered at an EU level by the EDPB, with the points accumulated by organisations operating in multiple member states either being cumulative or averaged based on a standardised list of key offences. Member States could be free to add additional offences to this list locally, within the spirit and intent of the Regulation.

    That would be an innovative idea, based on a model that has been proven to have an influence on compliance behaviour in motoring. And it would provide a transparent mechanism that would ensure that warnings could be given, advice could be sought, and positive engagement could be entered into by Micro Enterprises, SMEs, and large corporates. It would provide a relatively low impact mechanism for levying and collecting penalties from organisations who are in breach (penalties could potentially be collected as part of annual tax returns as a debt owed to the State), and it could be used to reward organisations who are taking positive actions (“bonus points”).

    Finally, it would give the basis of a transparent scorecard for organisations seeking to evaluate data processors or other service providers (in the same way as Insurance providers use penalty points data for motoring to assess driver risk), and it would give a clear escalation path to the full sanctions in the Regulation (e.g. 100 points and you go straight to full penalties).

    What it does not give is a death spiral of warnings that don’t amount to penalty and as a result give a platform for organisations to ignore the Right to Privacy. It is an evolution of the conciliatory approach to encouraging compliance but one that is given teeth in a manner that can be transparent, easily explained, and standardised across the EU27.

    I’ve written about this in 2010 and 2012. Maybe the time is right for it to be discussed?

  • Call the Tweet Police (a slight return)

    An opinion piece by Joe Humphreys in the Irish Times on the 9th of January (which I can link to here thanks to the great work of McGarr Solicitors) discusses anonymous comment on-line. In doing so he presents an argument that would appear to suggest that persons taking a nom de plume in debate are in some way sinister and not trustworthy.

He suggests three actions that can be taken to challenge “trolling”. I’ve previously addressed this topic on this blog (27th December 2012 and previously), so I thought I’d examine each of Mr Humphreys’ suggestions in turn and provide agreement or counter-argument as appropriate.

    1. Publicly condemn it. Overall I agree with this. However who or what should be condemned? The pseudonymous comment or the pseudonymous commenter? Should you ‘play the man or the ball’, to borrow a metaphor from sports? The answer is that, in an open society the correct course of action is to either ignore the argument or join the argument. Anything else leads to a downward spiral of tit-for-tat trolling and abuse, one of the very behaviours that has sections of our body politic and mainstream media crying “Down with this sort of thing!”

2. “Develop ways of discriminating against it… by technology that helps to authenticate people’s identities”. In my blog post of the 27th of December I address this under the heading of “Bad Idea #1”. The concept of identity is incredibly fluid. As Mr Humphreys appears fond of citing scientists and philosophers, I’m sure he is familiar with Descartes’ writings on the concept of identity.

The idea of an “identity register” is one that raises significant technical, philosophical, and legal issues. South Korea has recently abandoned its attempts to impose a “Real Names” policy on the use of social media due to these issues, and “Real Name” policies in social media have been criticised on Data Protection grounds in Europe. In China, where a “real names” policy is in place for social media, people use fake IDs to register, and the Chinese government has failed to get a significant majority of internet users to comply with the law.

    Describing anonymity as a “market failure” to be fixed by enforced identification equates identity with a tradable commodity. This is, ironically, the business model of Facebook, which Mr Humphreys describes as “an invention of Orwellian proportions”.

    3. “Challenge the anonymous to explain why they are hiding themselves. I’ve yet to hear a good excuse…” In my post of the 27th of December I link to an excellent resource (the GeekFeminism Wiki) which lists a number of reasons why people might not be able to use their real names in on-line comment. Time taken to research this: 30 seconds on Google. They include: survivors of abuse, whistleblowers, law enforcement personnel, and union activists.

    The implication made by Mr Humphreys that people choose to comment anonymously because they don’t want their employer to know they are on social media all day is disingenuous to say the least and belies a biased view of those of us who are active users of modern technologies for communication, discussion, and debate.

Finally, history has a litany of examples of people who, for various reasons, have used pen names to hide themselves. From Leslie Charles Bowyer-Yin (Leslie Charteris, author of The Saint) to Samuel Langhorne Clemens (Mark Twain), to François-Marie Arouet (Voltaire), to Eric Blair (George Orwell), there is a tradition of preparing “a face to meet the faces that you meet” (to borrow a line from T.S. Eliot) for a variety of reasons. See http://en.wikipedia.org/wiki/List_of_pen_names for more examples.

  • Some food for thought

    The Official Twitter Account of the Irish EU Presidency (@eu2013ie) tweeted earlier today about recipes.

    That gave me a little food for thought given the subject matter I posted on yesterday.

    1. Ireland will hold the Presidency of the EU in the first half of 2013.
    2. Part of what we will be tasked with is guiding the Data Protection Regulation through the final stages of ratification
    3. Viviane Reding has been very vocal about the role Ireland will play and the importance of strengthening enforcement of rights to Personal Data Privacy in the EU. 
4. Worldwide media and our European peers will be looking at Ireland and our approach to Data Protection.

In that context I would hope that any Dáil Committee would have regard to the importance of the right to Privacy (as enshrined in EU Treaties and manifested in our current Data Protection Acts and the forthcoming Data Protection Regulation) when reviewing legislation and regulation around Social Media.

While I don’t think that the recipes being tweeted about by the @eu2013ie account contained any Chinese recipes, the news today about changes in the Chinese Social Media regulatory environment is disturbing in the context of the rights to privacy and free speech. One interesting point about China’s approach to control of on-line comment, from the FT article linked to above, is this:

    It has also tried to strengthen its grip on users with periodical pushes for real name registration. But so far, these attempts have been unsuccessful in confirming the identity of most of China’s more than 500m web users

    Food for thought.