Roll Up, Roll Up – see the amazing psychic dog! (minor update)

Roll up Roll Up, meet the new DPC!

Every so often I read things in the newspaper that make me go “Yay!”. More frequently I read things that make me go “Boo!”. Today, as with other days, I read something that made me go “WHAT THE F….?!?!”.

Over the past few weeks the Irish Times has done a bang-up job breaking some excellent stories about Data Protection issues in Ireland. Karlin Lillington, Elaine Edwards, and others have sought to “Tell the Story of Why” and push past the soundbites and bullshit gloss that usually pass for data-related journalism in Ireland.

One great example of this was the work done on a story about how the Dept of Arts, Heritage and the Gaeltacht had erred by exposing data on living people on its website (their data privacy rights are protected under the Data Protection Acts and the Treaty on the Functioning of the European Union, as well as the Irish Constitution – if you want a potted guide to all of that, Gerard Hogan gives a great summary here). This was despite the Department having consulted with the Office of the Data Protection Commissioner and having received guidance on what was and was not acceptable from a Data Protection perspective.

The various pieces written by Elaine Edwards were detailed, explained the core of the issues well, and generally added to the quality of discourse.

On the 23rd of July, in their Online edition, the Irish Times ran this piece of utter nonsense dressed up as journalism. It’s such a poorly researched and written piece that I can understand why the author felt it best to leave their name off the byline [update- unfair to author, it was a leader piece, but if so my comments below are even more relevant – /update].

It is true that the DPC raised issues regarding a property price register. The issue was that the sharing of data between the different entities required to create such a register, while of interest to the public, lacked a legislative basis and therefore risked breaching the Data Protection Acts. Legislation was passed two years ago that provided the “air cover” for the sharing of data to build a property register and, lo and behold, there is a property price register in place now, linked to the LPT process.

Comparisons between Irish law and UK law are often as valid as comparing an apple and orange, and complaining about the bitterness of the orange skin as you try to bite into it, on the basis that they are both fruit.

But the doozy in this article for me is the challenge to the DPC as to why they didn’t spot that the Dept of Arts Heritage and the Gaeltacht were in breach of the Data Protection Acts for a year. The anonymous author of this article asserts that the DPC’s job is to ensure compliance with the Data Protection Acts.

Actually no. That is not their job. To make the Regulator responsible for ensuring compliance breaches a number of concepts in Governance, such as segregation of duties.

Their job is to enforce the Act, to provide advice on how not to be non-compliant (which they did in this case), and to investigate and prosecute offences under the legislation (albeit with a role in relation to education and awareness building as well).

The responsibility for ensuring compliance rests with the Data Controller doing the processing, in this case the Dept of Arts, Heritage and the Gaeltacht, who were non-compliant because they did the very thing they were told not to do by the DPC. Responsibility for ensuring compliance rests with the IT project team who developed interfaces that shared too much data, the testers who didn’t spot it, and the Data Controller in the Dept who didn’t double check that the business rules were followed.

The DPC’s job is to hold the Data Controller ACCOUNTABLE.

The bizarre logic of the writer of the article simply makes no sense. Are the Gardaí responsible for ensuring compliance with the Road Traffic Acts? No. Their job is the detection, investigation, and prosecution of offences. Just like the DPC in this context – when the Office was made aware of a possible breach of the Acts, they investigated and took action immediately. (Ensuring compliance with the Road Traffic Acts is the responsibility of the road user.)

For all the sense that is in the article, the anonymous scribe [update-anonymous as it is a leader piece-/update] might as well have advocated that the soon to depart Mr Hawkes be replaced with a Psychic Dog who would detect all potential future crimes, like the precogs in Minority Report.

Lazy, sloppy, and brain-numbingly dumb hackery dressed up as journalism; an article of this low quality has no place in a paper of merit such as the Irish Times.

Good, informed, and informative journalism on Data Protection issues must be encouraged, however.

An anniversary post (of sorts)

A little under a year ago I wrote two posts on this blog regarding the Irish DPC, Facebook, and Safe Harbor.

The blog posts in question are here and here.

Those posts were written under less than ideal conditions: sitting at train stations or in cramped train carriages, eyes streaming with hayfever (or perhaps I was weeping for the death of privacy… sometimes it’s hard to tell), typing furiously on an iPhone with limited access to the internet. They were rattled off essentially off the top of my head, based solely on the information that was in the public domain at the time.

The gist of what I wrote in those posts was as follows:

  1. The Data Protection Commissioner’s Office has to enforce the law that is in front of them.
  2. The law that is in front of them says that transfers to Facebook are OK under Safe Harbor.
  3. To conduct an investigation would mean the DPC would have to challenge a decision of the European Commission (specifically the Safe Harbor decision).
  4. That was probably the reason why other Data Protection Authorities, while complaining about Facebook, PRISM, and Safe Harbor, hadn’t actually done anything to suspend transfers: they too were not able to directly challenge a decision of the European Commission.

In June we received the judgement of Hogan J. in Schrems vs DPC. This case was initiated as a judicial review of the decision of the DPC not to launch a full-blown investigation into Safe Harbor and Facebook.

In that judgement, Hogan J. held that:

  1. The DPC had correctly interpreted and enforced the law that was in front of them. Transfers from Facebook Ireland to Facebook US were permitted as a result of Safe Harbor.
  2. A question needed to go to the ECJ as to whether the DPC could actually ignore or look beyond the Commission Decision on Safe Harbor when looking at whether processing was lawful. (In essence this is a question that is asking the ECJ to rule on Safe Harbor in light of the changes in EU Data Protection law since it was implemented a decade and a half ago. Since then Data Privacy has become clearly recognised as a fundamental right and the Digital Rights Ireland case has clarified the need for proportionality in data processing, particularly on-line surveillance).

And with that he sent a question to the European Court of Justice that potentially will have echoes as profound as Gavrilo Princip’s pistol shot on a side street in Sarajevo a century ago.

It was particularly heartening to me to read paragraphs 80 and 81 of Hogan J.’s judgement when it came out. In those paragraphs he says essentially what I said a year ago: the EU Commission had decided that Safe Harbor was an appropriate mechanism for cross border data transfer, and the DPC was tied to the findings of the Commission under the Irish Data Protection Acts and the underlying Directive. That’s pretty much what I said in this blog post.

I am loath to engage in precognition on the ECJ case that we are presented with now. However, I will venture the following:

  1. This is no longer a case about an Austrian law postgrad taking on an administrative functionary on the western spiral arm of the EU.
  2. This has become a case about information flows and fundamental rights (thanks in no small part to some deft adjudication by Hogan J).
  3. This has become a question of information society (the ethics, rights, rules, and benefits of information processing) versus information economy (individuals as units of production, and surveillance of the drones by Big Brother). It will have a profound impact no matter what the outcome.
  4. While Max Schrems has taken his case against the Irish Data Protection Commissioner, ultimately it is the Safe Harbor mechanism that is on trial now at the ECJ.
  5. If Safe Harbor is found to be not fit for purpose as a result of the disproportionate threats to data privacy rights of EU citizens, we will move into a very interesting era. If it turns out that national Data Protection Authorities can second guess decisions of the EU Commission when the surrounding laws or social environment changes, that will have ripples out far beyond the world of Data Protection law and practice.

The role of Digital Rights Ireland as amicus curiae in this case is to be welcomed. They add no baggage to the wagon train, and having been to the ECJ already on a data protection issue they are familiar with the winding trail ahead.

It is to be hoped that politicians and functionaries in the civil services of Member States and the Commission, as well as the media and the general public, wake up to the issues here and start paying attention. In the absence of a global drive to establish functioning and balanced frameworks for effective cross border data transfer we may find ourselves with exactly the same problems that gave rise over three decades ago to the need for the OECD Guidelines, and in turn Council of Europe Convention 108 and the entire framework of EU Data Protection laws in the first place.

Interesting times indeed.

Arise DeskZilla

I use a standing desk when working in my office (and if I could find a light weight portable option I’d use one on client sites as well). Many of the greatest leaders have used standing desks.

There are proven medical benefits to getting off your backside when working. It’s worth bearing in mind that sitting for a living is an invention of the late 19th and early 20th century. Prior to that most people did have to move around a lot. But standing desks can be expensive. So a theme has developed over the past few years of hacking functional standing desks that are ergonomically aligned using a low capital investment model (for which we must read “it don’t cost much if you make a mess of it”). The source of raw materials is a certain Swedish home improvements store famed for their meatballs that I won’t name here because they are very protective of their brand name. But a good source of ideas for how to repurpose their stuff can be found here.

About 18 months ago, after a flare up of back trouble, I did a bit of research (using the hacks site linked to above and a few others) to see how I might best build a standing desk on a near-zero budget. I started with a few basic design principles:

  1. Aim for “minimum viable product” – it had to meet ergonomic requirements for me and my height, but I guessed that how I worked, how I laid out my work, and how the desk would need to function would all evolve as I changed from sitting on my ass to moving around.
  2. Reuse or recycle things I already had – I had a desk already. I wasn’t going to junk that. I also had a pretty cool laptop stand with cooling fan and USB ports.
  3. Kaizen principles – I’d look to find ways to reduce waste of effort and time when working, and accept that the desk would not be perfect as I’d always find something else to improve how it works for me and with me.
  4. MacGyver rocks.


Some basics. If you don’t have your standing desk set up correctly you will simply make things worse for yourself. Do some research. Buy a measuring tape. Think about posture, stance, and positioning. I train (as often as I can, which isn’t often enough) in Aikido, so I am very conscious of my centre point (hara) and the need to have hips and back aligned correctly for good movement and energy flow.

Some good resources for standing desk ergonomics I found during my research are here, here, and here. A recent resource that covers off some good “dos and don’ts” can be found here.

Introducing DeskZilla

DeskZilla was the result of my research and my design principles. It was built entirely from parts purchased from Ikea (oops I’ve named them), with a few extra bits thrown in to make minor adjustments.

First iteration of DeskZilla.

The parts I used were:

  1. A Vika/Amon desktop (no longer available). It is 100cm wide and 50 cm deep. For alternative table tops, see here: Ikea TableTops
  2. An Ekby Jarpen shelf for the monitor and laptop level, with three Ekby Tore clamp brackets (3 ensures shelf doesn’t bow in the middle). Ikea actually illustrate the use of the brackets on a desktop on their website now.
  3. Capita legs for the desk (which required a little MacGyvering with a drill to make some new screw holes, as they are not meant as desk legs). I went for these as they can be adjusted up to 17cm high. Note that the Capita legs aren’t MASSIVELY extendy; they’re adjustable to compensate for uneven floors under the furniture they are supposed to be used on. But a centimetre or two can make all the difference.
  4. Two power blocks from Aldi that bolted onto the desk. I put them on the rear edge to stop DeskZilla from sliding backwards.

Total cost, a little over €70.

A key point… it is really important to measure your existing desk and the height/depth of each component to make sure things are going to be at the right height.
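To make the “measure everything first” point concrete, here is a rough sketch of the sums involved. The anthropometric ratios (elbow at roughly 0.63 of your height, eyes at roughly 0.94) are common approximations I am assuming for illustration, not figures from my build – always measure your own elbow and eye height rather than trusting a formula.

```python
# Rough sketch: given your height and the height of the existing desk,
# estimate how much lift the keyboard surface and the monitor shelf need.
# Ratios are approximate population averages -- measure yourself instead.

def standing_desk_lifts(stature_cm: float, existing_desk_cm: float,
                        monitor_centre_offset_cm: float = 15.0):
    elbow_height = 0.63 * stature_cm   # keyboard surface target (approx.)
    eye_height = 0.94 * stature_cm     # eye level target (approx.)
    keyboard_lift = elbow_height - existing_desk_cm
    # monitor centre should sit a little below eye level
    shelf_height = (eye_height - monitor_centre_offset_cm) - existing_desk_cm
    return round(keyboard_lift, 1), round(shelf_height, 1)

# e.g. a 180 cm user working off a standard 74 cm desk
print(standing_desk_lifts(180, 74))  # (39.4, 80.2)
```

Running the numbers like this before buying parts tells you whether a given shelf-and-bracket combination can actually reach the heights you need.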

What I have with DeskZilla is a modular system: I can move the monitor and laptop down onto the lower level, move the keyboard and mouse down to the lower desk, and use it as a sitting desk. The monitor is almost perfectly positioned for a sitting desk when on the first level.

I had to add a pencil box under the keyboard to move it up a centimetre and a half or so for better ergonomics when typing. The monitor is now raised up on a hardback book to improve positioning (more on that in a moment).

Evolution, Phase 1

Almost immediately DeskZilla began to evolve. While the monitor was almost perfectly aligned, I found that video conferencing was a great way to double check.

Rule of thumb: if you have a webcam in your monitor your eyes should be in line with the lens. A hardback book fixed that.

After a few weeks of use I noticed I was getting stiffness. Some gym mats from Argos on the floor provide an anti-fatigue feature, and I still have my chair, so I can switch to sitting if I get too stiff and sore on a given day. The body is a bugger: some days you can stand without issue for hours (I pulled a 27 hour straight working day on a project last year… standing almost the entire time) and other days it hurts like heck after a few hours.

Second rule of thumb: listen to your body and adapt each day.

Evolution Phase 2

DeskZilla will evolve again soon. Experience with the monitor, and the hassle of bending to get pens, post-it notes etc., tells me that it might make sense to swap the Ekby shelf for one with drawers that has the same length and a bit more height. The Ekby Alex shelf looks like a contender. The only reason I rejected it in Phase 1 was cost – it would have added over 50% to the budget.

I also need to think about raising the desktop a little to remove the need for a pencil case under the keyboard. That could be achieved through castor guards or something like that (the things you put on furniture that is going on a wooden floor); another option is some half-inch wooden blocks between the desktop and the Capita legs to give a small height boost. That last option would be a good call for anyone over 6ft 2in who wanted to use this recipe, and could be a way to incrementally tweak the height to what you need rather than relying on just the leg extendibility.

Finally, I’ll probably invest in a folding bar stool type chair, or an ironing board chair to use when fatigue kicks in to take the weight off my ankles and knees.

Some key lessons about standing while working

  1. Think zen and do yoga. Simple stretching movements keep fatigue at bay and help strengthen your core.
  2. Don’t stand still… move around and shift posture.
  3. Get used to working in shorter bursts and then changing position. I used to sit motionless for hours; now I work in 10 to 15 minute bursts and then switch posture or position… any longer and I stiffen up, which can hurt and break concentration anyway. Movement keeps the brain awake!
  4. Two monitors make a massive difference, but only if you aren’t having to crane your neck to see them.
  5. Your workspace will evolve around you. Find a natural movement and flow for you and settle into it. If you force it you’ll find it just doesn’t click for you.
  6. Breathe. Take advantage of your posture and position to take deep breaths and relax into your work.
  7. Each day you will need to improvise something to tweak a factor to improve comfort and flow. Accept that and get on with it.
  8. The desk is NEVER finished if you are building your own (and it’s never perfect if you bought it off the shelf).

TV Licences, Data Protection, and the comments of the DPC

It was great to hear the Data Protection Commissioner on Newstalk this afternoon explaining the situation regarding the proposed TV License data slurp. I’ll post a link to the podcast when it is available.

A quick summary of key points that he made is as follows:

  1. The Government must pass legislation to allow for any access to data.
  2. The accessing of subscriber data is an interference with fundamental rights, so, while there is a Public Interest (e.g. maximising revenue from the TV licence to keep Fair City on the air), the Government must convince the Oireachtas that the levels of access proposed are justified. The DPC specifically said that “the Oireachtas need to think about this”.
  3. He went on later to restate that the Public Interest needs to outweigh and justify the interference in fundamental rights.
  4. He specifically flagged that whatever mechanism and process is proposed in legislation, it needs to be a “reasonable and proportionate measure”
  5. An Post should only have access to the minimum amount of information necessary to confirm if there is use of a TV service.

Hmmm.. I’ve heard comments like that somewhere else recently

A slight difference of opinion…

The DPC compared the proposed access to data from TV service providers to the legislation that was brought in to establish the Property Register for the LPT.

I respectfully have to disagree a little on this. The LPT register required a completely new database to be created from scratch for the purposes of effectively, efficiently, and fairly levying a new tax. Data was drawn from multiple State and private sector data sets to create the best possible register for that purpose [disclosure: my company was involved in some preliminary work around the establishment of the LPT Register].

What is proposed in the case of the TV licence is to supplement an existing private sector database (An Post’s) with data from potential competitors for the purpose of detecting non-compliance with an existing tax/levy. It is a subtle difference and should affect the determination of what is proportionate. There is already an investigation and detection function for TV licence enforcement. Any level of access other than on a case by case basis for the investigation of and prosecution of non-payment would require a clear justification in my view to pass a proportionality test. Rather than comparing to the LPT establishing something new, a more appropriate comparison would be to existing Revenue powers to request data from banks in the course of an investigation, not as a general blanket bulk extraction.

The Thin End of the Wedge

The DPC is “conscious of making sure that this won’t be the thin end of the wedge”. In that case attention needs to be paid to how the legislation evolves. As I pointed out yesterday, Sky and UPC are both also providers of telecommunications services. In defining what data is being accessed for what purpose, it needs to be clarified whether this legislative data grab will be constrained just to television service packages or will extend to a wider range of product offerings. And within that there then needs to be consideration of how An Post would verify that a broadband subscriber was or was not using their service to stream TV to a laptop or handheld device, a scenario that is currently not covered by the TV licence, but is proposed to form part of a Household Broadcasting Charge in the not too distant future.

This is where there is another key difference between this proposed legislation and the LPT. The LPT legislation, from the very beginning, made clear that data would be obtained from private sector organisations to enrich and validate data on the Register obtained from existing State sources. While some thought that it was the tightening of Big Brother’s grubby mitts around our data, it was at least an open and transparent initiative.

If the intent here is to build a Household Broadcasting Charge Register by enriching the existing An Post data sets with 3rd party data, then the Minister and Department should come out and state that and place the Public Interest question around this proposed legislation on a more transparent footing, which in turn may affect the consideration of what form of mechanisms and measures would be reasonable and proportionate to achieve that end. That will ensure that the legislation that the Oireachtas may eventually pass will be fit-for-purpose, that the correct balance of rights between the individual, the organisation, and the State will be considered, and there can be a proper debate and provision of information about what constitutes a “reasonable and proportionate measure” in that context.

If the data is required to support existing investigation and detection processes for the current TV licence, I would suggest that what is reasonable and proportionate is more in line with Revenue’s powers of access to bank records on a case by case basis then the mass integration of data required to create the infrastructure for an entirely new tax head, and it is on that basis that the assessment of “reasonable and proportionate” should be made.

The de minimis principle

The DPC was clear that only the minimum necessary amount of information for the specific purpose could or should be shared. Hear hear!

Of course, his comment presumes a bulk sharing obligation is required or is proportionate. As I wrote yesterday, and as I mention above, if the proportionate response is to improve evidence gathering in investigation of suspected non-payment of a licence fee then An Post (or any other collecting agency) could simply ask, on a case by case basis, “Does X address have a television service” and receive a simple yes or no response.

The Commissioner’s comments don’t rule that approach out, however.

Of course, de minimis is a principle that applies to the purpose and intent of the processing. If the intent or purpose is to ensure that everyone who has a Sky or UPC subscription has paid their TV licence, it would be quicker, easier, and cheaper, to make them collecting authorities for their customers and leave An Post with the rump, with the Department managing a reconciliation process on an annual basis. It would add €13 or so to a Sky TV subscription, and it would ensure that every location where a single customer had a Sky TV box installed was paying the fee.
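The “€13 or so” figure above is just the annual licence fee spread across monthly subscription payments; a quick back-of-envelope check (assuming the €160 annual fee cited later in this post):

```python
# Back-of-envelope for the "€13 or so" figure: spreading the annual
# TV licence fee across twelve monthly subscription payments.

ANNUAL_LICENCE_FEE = 160.00  # euro, per the fee cited in this post

monthly_addition = ANNUAL_LICENCE_FEE / 12
print(f"€{monthly_addition:.2f} per month")  # €13.33 per month
```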

The Prickly Problem of Proportionality

It is good to see the DPC making positive comments about how the Oireachtas needs to reflect on how any legislation that might emerge would impact on fundamental rights. The Government must convince the Oireachtas (though with a Government majority, that is a bit of a fudge), but the Oireachtas has to act in accordance with the Constitution and with our obligations under EU Treaties. The ECJ has ruled on the Data Retention Directive and made it clear that, even for serious offences, the interference in data privacy rights through retention of, or bulk access to, communications data must be proportionate. Digital Rights Ireland have yet to return to the High Court for the next round of their challenge to the Communications (Retention of Data) Act 2011, which defines a “serious offence” as one carrying a prison sentence of at least 5 years.

For a €160 licence fee and a summary offence with a €1,000 fine on first offence or €2,000 on subsequent offences (people go to jail for non-payment of the fine, not non-payment of the TV licence), it will be interesting to see how proportionality will be established.

It may be that the Government will need to consider alternative mechanisms for enforcement of the TV Licence (or future Broadcasting Charge) that do not require the sharing of data. The key objective, after all, is to maximise the cash inflow for the State to support development of indigenous broadcasting, while at the same time minimising enforcement costs and minimising the extent to which data is shared and processed between private sector organisations, albeit on behalf of the State.

Of course, any reliance on full and frank debate in the Oireachtas has to recognise that the Government has a majority and we operate a whip system in our parliament. Government TDs will vote with the Government line. Which means that legislation might get passed that is actually a disproportionate response to the problem. Gerard Cunningham (@faduda) kindly reminded me of this on twitter.

Ultimately, the Minister needs to be clear in his Problem Statement before rushing to a solution, and the Oireachtas needs to think outside the box when assessing the reasonableness and proportionality of the legislative response to the realities of the telecommunications and broadcasting markets.

TV Licence checks and “Data Protection Principles” [updated]

This morning’s Irish Times reports that the (current) Irish Communications Minister is seeking cabinet approval for powers to enable the agency that collects TV Licences (currently An Post, the Irish post office) to access subscriber data from subscription TV providers such as Sky or UPC to crack down on TV licence evasion. We are assured by the Minister that the whole thing will be done “in accordance with strict data protection guidelines”. Ignoring for a moment that “Data Protection” is not a guideline but a fundamental right of EU citizens, enshrined in law, derived from both the TFEU and the EU Charter of Fundamental Rights, and implemented in Irish law as a result of an EU Directive (ergo… not a guideline but kind of a big thing to keep an eye on), what might those guidelines be?

[Update] It is being reported that this proposal has passed the Cabinet. The mechanism to be applied is reported as being:

“An Post will be allowed access the subscription data held by the likes of UPC and Sky to cross-reference their subscriber databases with its own data on TV licence fee payers”

I address the implications of this below in an update paragraph inserted in the original text. [/update]


In general Data Protection terms, once there is a statutory basis for processing (and access to data is processing), the processing is lawful. What appears to be proposed here is legislation that will allow the subscriber data of one group of companies to be accessed by another company for the purposes of checking if someone is getting moving pictures on a telly box or similar device. So that’s the box ticked and we can move on, right? Oh, so long as we have protocols around the how, when, and why of access to the data, right (because they are always followed)? And of course, the legislation will prevent scope creep in terms of the use of the data and the potential sources of data that might be accessed using the legislation (e.g. telecommunications service providers who might have broadband going into a home or onto a device). Well, since April (and thanks to the great work of Digital Rights Ireland) we actually have some guidance from the Court of Justice of the European Union.

This is guidance that Minister Rabbitte’s department should be distinctly aware of, as it affected legislation that they are responsible for: the Communications Data Retention Directive (from which the Irish Communications Data Retention Act got its authority). In that case, the ECJ was very clear: any processing of personal data needs to be proportionate to the outcome required. In the Digital Rights Ireland case, the ECJ felt that requiring the retention of call traffic and internet usage data on the off chance it might be useful to authorities to counter terrorism was a disproportionate response. Access to specific data would not be disproportionate, but wholesale data slurping was a breach of the fundamental right to data privacy as enshrined in the EU Charter of Fundamental Rights. This reasoning was followed by Hogan J in the recent case of Schrems vs The Data Protection Commissioner in the High Court, where Hogan deftly summarises the constitutional, statutory, and EU Treaty bases for Data Privacy rights in Ireland and the EU.

The upshot is that, regardless of the existence of a statutory authority to do a particular piece of processing, the processing itself must be a proportionate invasion of an individual’s right to Personal Data Privacy and their right to Privacy – two distinctly separate rights now under EU law. So, what would be a proportionate response in this context? How big is the problem?

The Proportionality Conundrum

According to the Minister, 16% of households don’t pay for a TV licence. According to ComReg, 73% of households receive TV services via a subscription service. So 27% of households don’t pay for a TV service subscription and 16% don’t have a TV licence – more households lack a paid TV subscription than lack a TV licence. It is not outside the bounds of possibility that the ENTIRETY of the 16% that the Minister seeks to pursue is contained in the 27% that Sky and UPC would also love to separate from their subscriptions. Perhaps these people don’t have a television at all?
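The set arithmetic behind that paragraph is simple enough to sketch. With 16% of households unlicensed and 27% without a subscription (100% minus ComReg’s 73%), basic overlap bounds show the unlicensed group could fall anywhere from entirely outside to entirely inside the no-subscription group – meaning the subscriber lists might contain none of the evaders at all:

```python
# Sketch of the overlap argument: the minimum and maximum possible
# overlap (as % of households) between two groups of given sizes.

def overlap_bounds(pct_a: float, pct_b: float):
    """Min and max possible overlap of two groups, in % of households."""
    max_overlap = min(pct_a, pct_b)                 # one inside the other
    min_overlap = max(0.0, pct_a + pct_b - 100.0)   # forced intersection
    return min_overlap, max_overlap

unlicensed = 16.0
no_subscription = 100.0 - 73.0  # 27%, from the ComReg figure
print(overlap_bounds(unlicensed, no_subscription))  # (0.0, 16.0)
```

The maximum overlap of 16% is exactly the “entirety of the 16%” scenario: every unlicensed household could be one with no subscription, in which case trawling Sky and UPC’s lists finds nobody.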

Even assuming that the two groups are unrelated, the question remains whether allowing An Post access to the subscriber lists of UPC and Sky is a proportionate response. It’s not. If it is not proportionate, for serious offences under the now defunct Data Retention Directive, to allow law enforcement blanket access to telecommunications call history and internet usage data, it is probably not proportionate for a private company to have access to the subscriber lists of potential competitors (who knows what An Post might want to pivot into, given they are in the telecommunications business) for the purposes of detecting where people don’t have a TV licence.

[Update] Based on a report, it appears that what is proposed is an en masse cross checking of data between An Post’s TV Licence database and the databases of Sky and UPC. This borders, in effect, on a form of mass surveillance. It is, in my opinion, unlikely that this would be seen as a proportionate response to the problem. This is particularly the case where alternatives to bulk access to the data can achieve the same overall objective without the need for the data to be processed in this way. [/update]

What would be proportionate would be for An Post to be able to make a request, on a case by case basis, for confirmation if a property which does not have a TV license is in receipt of a subscription TV service, once there was a detection that there was someone resident at the address or a business operating at the address which had a receiving device (i.e. a TV). Sky or UPC would simply need to respond with a “Yes they have service” or “No they do not” with no other data being accessed.

A wrinkle though…

One wrinkle is that Sky and UPC are not just TV service companies. They are telecommunications service providers as well, providing home phone and broadband services. So the scope of the potential legislation is to allow one telecommunications company (An Post) access to the subscriber data of other telecommunications companies. This raises significant issues from a Data Protection perspective under SI 336, where telecommunications providers have very serious security obligations to their subscribers: they must notify subscribers of potential security issues on their network, and notify both subscribers and the Data Protection Commissioner where there has been a breach of data security.

It also raises the spectre of other telecommunications companies being required to provide the same data, depending on how the legislation is drafted.

Almost inevitably, the telecommunications providers would be asked to provide data to An Post about users who were accessing particular types of services or IP addresses (e.g. RTE online services or TV3 Player, or Netflix, or similar). This is EXACTLY the type of data that the ECJ has ruled on in the Digital Rights Ireland case. Proportionality raises its head again, along with the need to avoid information security breaches on the part of the telecommunications companies being asked to provide access to their data.

The Upshot

At this remove I can identify a few mechanisms that would be a proportionate interference in personal data privacy rights, and would minimise the risks of unauthorised access to or disclosure of subscriber data by a telecommunications service provider.

  1. An Post would need to make their requests as part of an investigation of a specific instance of an offence with a view to prosecution. Each request would need to relate to the investigation of a specific offence (“Mr X, at address Y, has no TV license but has a receiving apparatus he claims is not connected to any service, please verify he is not a subscriber”). The subscription TV service providers or Telecommunications service providers would simply respond back with a “Yes” or “No” to the specific question. But that answer may not confirm if they use their broadband to access streamed broadcast services. It is very easy to mask internet usage by using VPN tunnelling services, so the net may not catch all the fishes the Minister is trawling for.
  2. Another option would be to simply add the cost of the TV license to the subscription fee for Sky or UPC television services and, potentially, to the cost of broadband services in the State.  This would require zero sharing of data and a single annual transaction between the service providers and the State. It would also avoid entirely the risk of unauthorised access to or disclosure of subscriber data as a result of An Post (or any other entity) having access to subscriber data.

(Of course, just because you have a broadband connection doesn’t mean you are watching TV programmes on your device. I have a good friend who has a very large computer monitor and watches DVDs streamed from a laptop. They have broadband. For email, internet access, and work stuff. Their TV and movie viewing is entirely DVD boxed set driven.  A mechanism would be required for people in that category to opt-out, unless this is a flat-rate tax on telecommunications services flying under a false flag. That is a matter for a different blog post.)

Whatever approach is ultimately taken, it will need to constitute an invasion of data privacy that is proportionate to the problem that presents itself. THAT is the Data Protection requirement that must be met. It is not a guideline. It is the law, and it is a matter of fundamental rights.

For the Minister to view Data Protection as a “guideline” further evidences the horridly discordant tone at the top in the Irish State about Data Protection (which I’ve written about here and here and here and here).


So, within hours of me blogging about data protection consent issues in the Facebook mood manipulation study, the Register has the EXCLUSIVE that Facebook is being investigated by the Irish DPC, with specific questions around the consent relied upon.

I’m not saying anyone in an office above a Centra in Portarlington reads this blog, but it is a serendipitous coincidence.

And it may turn out that manipulating user timelines to provoke emotional responses could make Facebook management very sad.

Facebook, Manipulation, and Data Protection – part 2

Right. Having gotten some day job work out of the way I return to this topic to tease out the issues further.

One aspect that I didn’t touch on in the last post was whether or not Data Protection exemptions exist for research and if those exemptions apply in this case. This discussion starts from the premise that EU Data Protection law applies to this Facebook research and that Irish Data Protection law is the relevant legislation.

The Exemption

Section 2(5) of the Data Protection Acts 1988 and 2003 provides an exemption for processing for research purposes:

(a) “do not apply to personal data kept for statistical or research or other scientific purposes, and the keeping of which complies with such requirements (if any) as may be prescribed for the purpose of safeguarding the fundamental rights and freedoms of data subjects.”


(b) “the data or, as the case may be, the information constituting such data shall not be regarded for the purposes of paragraph (a) of the said subsection as having been obtained unfairly by reason only that its use for any such purpose was not disclosed when it was obtained, if the data are not used in such a way that damage or distress is, or is likely to be, caused to any data subject.”

The key elements of the test therefore are:

  1. The data is being processed for statistical or scientific purposes
  2. And the processing of the data complies with requirements that might be prescribed for safeguarding fundamental rights and freedoms

This means that research undertaken for scientific purposes, with an appropriate ethics review that has identified appropriate controls to safeguard the fundamental rights of Data Subjects, can avail of the exemption. Since the enactment of the Charter of Fundamental Rights in the EU, those rights include a distinct right to personal data privacy, reaffirmed by the Digital Rights Ireland case earlier this year.

The question arises: was the Facebook study for a scientific purpose? It would appear to be so, and in that context we need to examine whether any processing requirements were set out to safeguard the fundamental rights and freedoms of Data Subjects. That is a function of the IRB or Ethics Committee overseeing the research. Cornell University are clear that the issues of personal data processing were not considered in this case: their scientists were engaged in a review and analysis of processed data, and they did not believe that human research was being undertaken.

Whether you consider that line of argument to be Jesuitical bullshit or not is secondary to the simple fact that no specific requirements were set out by any entity regarding the controls that needed to be put in place to protect the fundamental rights and freedoms (such as freedom of expression) that the Data Subject should enjoy.

Legally this means that the two-stage test is passed. Data is being processed for a scientific purpose and there has been no breach of any provision set down for the processing of the data to safeguard fundamental rights, so consent etc. is not required to justify the processing and the standard around fair obtaining is looser.

Apparently if your review doesn’t consider your research to be human research then you are in the clear.

Ethically that should be problematic as it suggests that careful parsing of the roles of different participants in research activity can bypass the need to check if you have safeguarded the fundamental rights of your research subjects. That is why ethics reviews are important, and especially so when it comes to the ethics of “Big Data” research. Rather than assessing if a particular research project is human research we should be asking how it isn’t, particularly when the source of the data is identifiable social media profiles.

A Key Third test…

The third part of the test is whether or not the data is being used in a way that would cause damage or distress to the data subject. This is a key test in the context of the Facebook project and the design of the study. Consent and fair obtaining requirements can be waived where there is no likelihood of damage or distress being caused to the research subject.

However, this study specifically set out to create test conditions that would cause distress to data subjects.

It may be argued that the test is actually whether the study caused an additional level of distress over and above the normal level the subject might suffer anyway. But given that the Facebook study was creating specific instances of distress to measure a causation/correlation relationship between status updates and emotional responses, it’s hard to see how this element of the exemption would actually apply.

Had Facebook adopted a passive approach to monitoring and classifying the data rather than a directed approach then their processing would not have caused distress (it would have just monitored and reported on it).

The Upshot?

It looks like Facebook/Cornell might get off on a technicality under the first two stages of the test. They were conducting scientific research and there was no prerequisite from any Ethics committee to have any controls to protect fundamental rights. However that is simply a technicality and it could be argued that, in the absence of a positive decision that no controls were needed, it may not be sufficient to rely on that to avail of the Section 2(5) exemption.

However, it may be that the direct nature of the manipulation and the fact that it was intended to cause distress to members of the sample population might negate the ability to rely on this exemption in the first place, which means that consent and all the other requirements of the Data Protection Acts should apply and be considered in the conduct of the research.

The only saving grace might be that the level of distress detected was not found to be statistically large. But to find that out, they had to conduct the questionable research in the first place.

And that brings us back to the “wibbly-wobbly, timey-wimey” issues with the consent relied upon in the published paper.

Ultimately it highlights the need for a proactive approach to ethics and data privacy rights in Big Data research activities. Rather than assuming that the data is not human data or identifiable data, Ethics Committees should be invoked and required to assess whether it is, and to ensure that appropriate controls are defined to protect fundamental rights. Finally, the question of whether distress will be caused to data subjects in the course of data gathering needs to be a key ethical question, as it can trigger Data Protection liability in otherwise valuable research activities.

Facebook Research, Timeline Manipulation, & EU Data Protection Law

This is an initial post based on the information I have to hand today (1st July 2014). I’ve written it because I’ve had a number of queries this morning about the Data Protection implications of Facebook’s research activity. I’m writing it here and not on my company’s website because it is a work in progress and is my personal view. I may be wrong on some or all of these questions.

Question 1: Can (or should) the Data Protection Commissioner in Ireland get involved?

Facebook operates worldwide. However, for Facebook users outside the US and Canada, the Data Controller is Facebook Ireland, based in Dublin. Therefore EU Data Protection law, in the form of the Irish Data Protection Acts 1988 and 2003, applies to the processing of personal data by Facebook. As a result, the Irish Data Protection Commissioner is the relevant regulator for all Facebook users outside the US and Canada. The key question then is whether or not Facebook constrained their research population to data subjects (users) within the US and Canada.

  • If yes, then this is not a matter for investigation by EU data protection authorities (i.e. the Data Protection Commissioner).
  • If no, then the Irish Data Protection Commissioner and EU Data Protection laws come into play.

If Facebook didn’t constrain their population set, it is therefore possible for Facebook users outside of the US and Canada to make a complaint to the DPC about the processing and to have it investigated. However, the DPC does not have to wait for a complaint. Section 10 of the Data Protection Acts empowers the Commissioner to undertake “such investigations as he or she considers appropriate” to ensure compliance with legislation and to “identify any contravention” of the Data Protection Acts 1988 and 2003.

[update] So, it is clear that the data was obtained from a random sample of Facebook users. Which raises the question of the sampling method used: was it stratified random sampling (randomised within a sub-set of the total user base) or random sampling across the entire user base? If the former, the sample might have been constrained. If the latter, the sample will inevitably contain data subjects from outside the US/Canada region. [/update]
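The difference between the two sampling approaches can be sketched as follows (hypothetical data and field names, not Facebook’s code): a simple random sample across the whole user base will, to a statistical near-certainty, pull in users from outside the US/Canada, while a sample drawn from a constrained stratum cannot.

```python
import random

random.seed(1)  # reproducible illustration

# Hypothetical user base spread across four regions
users = [{"id": i, "region": random.choice(["US", "CA", "EU", "ROW"])}
         for i in range(100_000)]

# Option 1: simple random sampling across the entire user base
simple_sample = random.sample(users, 1_000)

# Option 2: sampling constrained to a US/Canada stratum
north_america = [u for u in users if u["region"] in ("US", "CA")]
regional_sample = random.sample(north_america, 1_000)

# The unconstrained sample is all but guaranteed to contain non-NA
# subjects; the constrained one cannot contain any, by construction.
print(any(u["region"] not in ("US", "CA") for u in simple_sample))
print(any(u["region"] not in ("US", "CA") for u in regional_sample))  # False
```

Which of the two was used determines whether EU data subjects were in scope, and therefore whether the Irish DPC has a role.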

Answer: If Facebook hasn’t constrained their population to just North America (US/Canada) then… Yes.

Question 2: If Irish/EU Data Protection Law applies, has Facebook done anything wrong?

Tricky question, and I wouldn’t want to prejudge any possible investigation by the Data Protection Commissioner (assuming the answer to Question 1 would get them involved). However, based on the information that is available, a number of potential issues arise, most of them centred on the question of consent. Consent is a tricky issue in academic, market, or clinical research. The study which was conducted related to the psychological state of data subjects. That is categorised as “Sensitive Personal Data” under the Data Protection Acts. As such, the processing of that data requires explicit consent under Section 2B of the Acts. Beyond the scope of the Data Protection Acts, clinical research is governed by ethical standards such as the Nuremberg Code, which also requires a focus on voluntary and informed consent:

The voluntary consent of the human subject is absolutely essential… and should have sufficient knowledge and comprehension of the elements of the subject matter involved as to enable him to make an understanding and enlightened decision. This latter element requires that before the acceptance of an affirmative decision by the experimental subject there should be made known to him the nature, duration, and purpose of the experiment

Question 2A: Was Consent Required? Consent is required for processing of sensitive personal data. For that data to be sensitive personal data it needs to be data that is identifiable to an individual and is sensitive in nature. However, if the data being processed was anonymised or pseudonymised then it falls outside the scope of personal data, assuming appropriate controls are in place to prevent re-identification. The Irish Data Protection Commissioner published guidance in 2007 on Clinical Research in the Healthcare Sector, which provides some direction on the question of consent, albeit from a purely clinical healthcare perspective. A key point in the guidance is that while anonymising data may remove the Data Protection question around consent, it doesn’t preclude the ethical questions around conducting research using patient data. These kinds of questions are the domain of Ethics Committees, known in the US as Institutional Review Boards (IRBs), in universities or commercial research organisations.
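For illustration, here is one common pseudonymisation technique (my own sketch, not anything the researchers are known to have used): replace the identifier with a keyed hash, so records can still be linked over time. Note that whoever holds the key can re-identify the subjects, which is why this is pseudonymisation rather than anonymisation, and why controls remain necessary.

```python
import hashlib
import hmac

# Keyed hashing (HMAC) as a pseudonymisation sketch. The key stays
# with the data controller; without it the pseudonym cannot be linked
# back to a user ID, but the controller CAN re-identify, so this is
# pseudonymisation, not anonymisation.

SECRET_KEY = b"held-only-by-the-data-controller"  # hypothetical key

def pseudonymise(user_id: str) -> str:
    """Return a stable, keyed pseudonym for a user identifier."""
    return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()

# Deterministic: the same user always maps to the same pseudonym,
# so longitudinal analysis of the research data still works.
assert pseudonymise("user-12345") == pseudonymise("user-12345")
assert pseudonymise("user-12345") != pseudonymise("user-54321")
```

The point for the consent analysis is that pseudonymised data of this kind only falls outside the Data Protection net if re-identification is genuinely controlled; the ethical questions remain either way.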

Apparently Cornell University took the view that, as their researchers were not actually looking at the original raw data and were basing their analysis on results produced by the Facebook Data Science team, they were not conducting human research, and as such the question of whether consent was required for the research wasn’t considered. The specifics of the US rules and regulations on research ethics are too detailed for me to go into here. There is a great post on the topic here which concludes that, in a given set of circumstances, it is possible that an IRB might have been able to approve the research as it was conducted, given that Facebook manipulates timelines and algorithms all the time. However, the article concludes that some level of information about the research, over and above the blanket “research” term contained in Facebook’s Data Use Policy, would likely have been required (but not to the level of biasing the study by putting all cards on the table), and it would have been preferable if the subjects had received a debrief from Facebook, rather than the entire user population wondering if it was them who had been manipulated. Interestingly, the authors of the paper point to Facebook’s Data Use Policy as the basis of their “informed consent” for this study:

As such, it was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.

Answer: This is a tricky one. For the analysis of aggregate data no consent is required under DP laws and, it appears, it raises no ethical issues. However, the fact that the researchers felt they needed to clarify that they had consent under Facebook’s Data Use policy to conduct the data gathering experiments suggests that they felt they needed to have consent for the specific experimentation they were undertaking, notwithstanding that they might have been able to clear ethical hurdles over the use of the data once it had been obtained legally.

Question 2b: If consent exists, is it valid? The only problem with the researchers’ assertion that the research was governed by Facebook’s Data Use Policy is that, at the time of the study (January 2012), there was no such specified purpose in Facebook’s Data Use Policy. This has been highlighted by Forbes writer Kashmir Hill.

The text covering research purposes was added in May 2012. It may well have been a proposed change that was working its way through internal reviews within Facebook, but it is impossible for someone to give informed consent for a purpose about which they have not been informed. Therefore, if Facebook are relying on a term in their Data Use Policy which hadn’t been introduced at the time of the study, then there is no valid consent in place, even if we can assume that implied consent would be sufficient for the purposes of conducting psychological research. If we enter into a degree of speculation and assume that, through some wibbly-wobbly timey-wimey construct (or Kashmir Hill having made an unlikely error in her analysis), there was a single word in the Data Use Policy for Facebook that permitted “research”, is that sufficient?

For consent to be valid it must be specific, informed, unambiguous, and freely given. I would argue that “research” is too broad a term and could be interpreted as meaning just internal research about service functionality and operations, particularly in the context in which it appears in the Facebook Data Use Policy where it is lumped in as part of “internal operations”. Is publishing psychological and sociological research part of Facebook’s “internal operations”? Is it part of Facebook’s “internal operations” to try to make their users sad? Interestingly, a review of the Irish Data Protection Commissioner’s Audit of Facebook in 2012 reveals no mention of “Research” as a stated purpose for Facebook to be processing personal data. There is a lot of information about how the Facebook Ireland User Operations team process data such as help-desk queries etc. But there is nothing about conducting psychometric analysis of users through manipulation of their timelines. Perhaps the question was not asked by the DPC?

So, it could be argued by a Data Protection regulator (or an aggrieved research subject) that the consent was insufficiently specific or unambiguous to be valid. And, lest we forget it, processing of sensitive personal data such as psychological health, philosophical opinions, etc. requires explicit consent under EU law. The direct manipulation of a data subject’s news feed to test if it made them happier or sadder, or had no effect, might therefore require a higher level of disclosure and a more positive and direct confirmation of consent than “they read the document and used the service”. People use Facebook for reasons other than to be residents of a petri dish.

Does this type of research differ from A/B testing in user interface design or copywriting? Arguably no, as it is a tweak to a thing to see if people respond differently. However A/B testing isn’t looking for a profound correlation over a long term between changes to content and how a person feels. A/B testing is simply asking, at a point in time, whether someone liked presentation A of content versus presentation B. It is more functionally driven market research than psychological or sociological analysis.

Answer: I’d have to come down on the negative here. If consent to the processing of personal data in the manner described was required, it is difficult for me to see how it could be validly given, particularly as the requirement is for EXPLICIT consent. On one hand, it appears that the magic words relied upon by the researchers didn’t exist at the time the research was conducted. Therefore there can be no consent. Even assuming some form of fudged retroactivity of consents given to cover past processing, it is still difficult to see how “research” for “internal operations” purposes meets the requirement of explicit consent necessary for psychological research of this kind. Such research differs from user experience testing, which is more “market research” than psychological, and is therefore arguably subject to a higher standard.

Question 3: Could it have been done differently to avoid Data Protection risks?

Short answer: yes. A number of things could have been done differently.

  1. Notification of inclusion in a research study to assess user behaviours, with an option to opt-out, would have provided clarity on consent.
  2. Analysis of anonymised data sets without directed manipulation of specific users timelines would not have raised any DP issues.
  3. Ensure validity of consent. Make sure the text includes references to academic research activities and the potential psychological analysis of user responses to changes in Facebook environment. Such text should be clearly highlighted and, ideally, the consent to that element should be by a positive act to either opt-in (preferred) or to opt-out
  4. Anonymise data sets during study.
  5. Restrict population for study to US/Canada only – removes EU Data Protection issues entirely (but is potentially a cynical move).

Long Answer: It will depend on whether there is any specific finding by a Data Protection Authority against Facebook on this. It does, however, highlight the importance of considering Data Protection compliance concerns as well as ethical issues when designing studies, particularly in the context of Big Data.

There have been comparisons between this kind of study and other sociological research, such as researchers walking up to random test subjects and asking them to make a decision subject to a particular test condition. Such comparisons have merit, but only if we break them down to assess what is happening. In those studies there is a test subject who is anonymous, about whom data is recorded for research purposes, often in response to a manipulated stimulus that creates a test condition. The volume of test subjects is low. The potential impact is low. And the opportunity to decline to participate exists: the test subject can walk on by, as I often did when faced with undergrad psychology students in University.

With “Big Data” research, the subject is not anonymous, even if they can be anonymised. The volume of test subjects is high. Significantly (particularly in this case), there is no opportunity to decline to participate. By being a participant in the petri-dish system you are part of the experiment without your knowledge. I could choose to go to the University coffee shop without choosing to be surveyed and prodded by trainee brain monkeys. I appear to have no such choice with Data Scientists. The longer answer is that a proper consideration of the ethics and legal positioning of this kind of research is important.

Examples of poor Data Protection Practice in Public Sector

Earlier this week the Data Protection Commissioner bemoaned the lack of attention to detail and the poor culture of Data Protection compliance practices in the Irish Public Service.

He was right to do so. My experience as both a service user and as a consultant has been that there are excellent people making excellent efforts to swim against a tide of indifference and short-cutting that results in breaches of legislation and puts personal data of citizens at risk.

In a “brain fart” moment yesterday I googled the words “Election”, “Training” and “Ireland” by accident. It brought back a website announcing itself to be the “Official Presiding Officer Online Training”. Apparently Presiding Officers in this year’s Local and European Elections are required to complete this training, which I understand consists of a series of videos. It’s actually a rather good idea.

However it has been badly implemented from a Data Protection perspective.

  1. It requires a PPS Number to login. This is not a permitted use of the PPS Number. For a start, ElectionTrainingIreland is not registered as a Registered User of the PPSN under the 2005 Social Welfare Consolidation Act.
  2. Using PPS Numbers as a login is not good information security practice.
  3. As I understand it, Presiding Officers receive a letter that contains their PPS Number and a password for this site – which suggests that passwords are stored somewhere in an unencrypted plaintext format (again, BAD information security practice)
  4. There is no information about who Election Training Ireland are. They are NOT an official state body or division of the Department of the Environment. There is no privacy statement on the website that explains the purposes of processing of data, the retention of data, or (for that matter) where Election Training Ireland got the PPSN that they are using in the background to verify your identity.
  5. The website, which asks you to key in your PPS Number, does not have a valid SSL certificate. There is no encrypted transfer of information. Given the value of the PPS Number, that’s simply not good enough from a Data Protection point of view.

Looking at the process from the outside, armed only with nearly two decades of experience in designing and reviewing information management processes for regulatory compliance, I suspect that this might be the underlying process:

  1. A list of all people who registered to be Presiding officers was provided to Election Training Ireland. This list included PPS Numbers, names, and home addresses. [Issue #1 below]
  2. This list was used to create a database of Presiding Officers which in turn was used to create a list of user logins for the website. These user logins used PPSN as the user id [issue #2 below]
  3. This list was used to generate a mailmerge to each Presiding Officer at the address provided by them for contact (which is almost inevitably a home address) which contained their password [Issue #3 below]
  4. The website is not encrypted. [Issue #4 below]
  5. This list was provided to and processed by Election Training Ireland, who are an external contractor working (one assumes) for the Department of the Environment [See: “Who are ETI?” below]

Issue #1: Transfer of data about candidate Presiding Officers

Data constituting a significant portion of what is defined in the 2005 Social Welfare Consolidation Act as the “Public Service Identity” has been transferred to a 3rd party by local authorities and/or the Dept of Environment. What was the lawful basis for this transfer? Was there a statutory basis (which is the DPC’s favoured basis for data transfers in the public sector)? What were the protocols regarding security of transfer, retention, security of storage, restrictions on use, etc.? Is there a Data Processor contract in place (if there is, it will be a doozy IMHO, because of questions under “Who are ETI?” below)?

As ETI is not registered as a User of the PPSN with the Department of Social Protection, issues potentially arise with the legality of transfer here. And even assuming that ETI has a valid contract etc. with either EVERY local authority or the Dept of Environment, the PPS numbers would have been obtained originally from Presiding Officers for the purposes of processing their payments and making appropriate deductions for taxation and PRSI etc. Not for using them as a unique identifier in a system hosted by a 3rd party.

Issue #2: Creation of lists and user logins

As mentioned above, the creation of a central database of presiding officers and the use of their PPS Number as an identifier in that database constitutes a new purpose within the context of the Data Protection Acts. Using PPS Number as a login is just dumb (a proxy could easily have been created). This database has PPS Numbers, names, and addresses of Presiding Officers. Where is it being stored? Under what security protocols? Who has access to it? How long will it be retained for? (Please don’t let them have saved it to Google Docs, otherwise I’ll have to get cross).
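The proxy identifier mentioned above is trivial to build. A hedged sketch (my own illustration, not anything ETI or the Department implemented): the data controller issues a random token per officer and shares only the tokens with the contractor; the PPSN-to-token mapping never leaves the controller.

```python
import secrets

# Sketch of a proxy login identifier. The data controller keeps the
# PPSN-to-token mapping; the training contractor only ever sees tokens.

def issue_proxy_ids(ppsns):
    """Map each PPSN to a random, non-derivable login token."""
    return {ppsn: secrets.token_urlsafe(12) for ppsn in ppsns}

# Dummy PPSNs for illustration only
mapping = issue_proxy_ids(["1234567A", "7654321B"])
tokens_for_contractor = sorted(mapping.values())  # no PPSN in sight
print(tokens_for_contractor)
```

Because the tokens are random rather than derived from the PPSN, a breach at the contractor’s end would expose nothing about the Public Service Identity.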

Issue #3 Mail merge and posting out passwords

Passwords must be stored in plaintext if they could be mailed out in a mail merge. Being able to do a mail merge means that whoever sent the letters to Presiding Officers has their PPS Number, name, and address. That’s a heck of a lot of personal data. And if they are not thinking of the implications of storing passwords in an encrypted form and not sending them out in unsecured plain text, what’s the level of confidence in back-end security and the security of related data transfers?
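For contrast, here is a minimal sketch of how the passwords could have been held instead, using standard salted key derivation from Python’s standard library (the iteration count and flow are illustrative, not a production recommendation): store only the salt and derived hash, and post a one-time set-password token rather than the password itself.

```python
import hashlib
import hmac
import os
import secrets

# Store a salted, stretched hash instead of a plaintext password.

def hash_password(password: str):
    """Derive a salted hash; only (salt, digest) is ever stored."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, digest)

# Instead of posting a password, post a one-time token that lets the
# officer set their own password on first login:
reset_token = secrets.token_urlsafe(32)

salt, digest = hash_password("chosen-by-the-user")
assert verify_password("chosen-by-the-user", salt, digest)
assert not verify_password("guess", salt, digest)
```

With this flow nothing sensitive ever travels in the post, and a leak of the stored data yields only salted hashes rather than usable credentials.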

Issue #4 No SSL on the site

Using PPSN as a login is not great. Doing it in a way that could result in the data being intercepted by anyone minded to do so compounds the issue. Some might say it’s overkill for “just a login”, but the PPS Number is a significant identifier for people using public services in Ireland.

Who are ETI?

The site is not owned or operated by any Local Authority or Government Department. It is owned and operated by a partnership of three people based in Co. Meath. It took me 90 seconds to find that information on the CRO website, a basic due diligence test. If they are a partnership, each individual member of that partnership is a Data Processor acting on behalf of the Data Controller that engaged them (which might wind up being EVERY local authority or the Dept of the Environment – that is still unclear to me). There is nothing on the website identifying that the holder of the data doing this processing is not a government body.

So a private sector organization has been given a chunk of the Public Service identifier for a defined population of people, has implemented a solution that is arguably illegal in its design and is certainly not good information security practice. There is a questionable lawful basis for the transfer of data to this 3rd party. (I haven’t looked for the tender for this training solution, I’m assuming it went to tender and there was some specification of information security and data protection standards in that document. But I’ve got a day job to do).

What could be done better/differently?

Lots of stuff.

  1. Use a proxy identifier. If the data controller holding the PPSN had created a proxy identifier (an alphanumeric string that was NOT the PPSN) and provided that to ETI to use as a login, the PPSN issue would not arise.
  2. Ensure proper contracts are in place with the data processor.
  3. Use SSL by default.
  4. Rather than sending a plaintext password in the post, send users a link that brings them to a page where they set their own password, which is then stored salted and hashed.
  5. Improve transparency and communication about who the actors are in the process and their roles.

Those are just my first five. Depending on the detail of the processes etc. there could be a LOT more. But none of them would cost a heck of a lot of money, and they would result in more compliant and less insecure processing of personal data.
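To illustrate the first suggestion: the data controller keeps the PPSN-to-proxy mapping in-house and hands the processor only the opaque proxy. A hedged sketch (the dict standing in for the controller’s own lookup table is an illustrative assumption, not a description of any actual system):

```python
import secrets

def issue_proxy_id(ppsn: str, mapping: dict[str, str]) -> str:
    """Create an opaque alphanumeric login ID for a PPSN.

    The mapping dict stands in for a table held by the data controller;
    only the returned proxy would ever be shared with the third-party
    processor, so a breach at the processor exposes no PPSNs.
    """
    proxy = secrets.token_hex(8)  # 16 random hex characters, reveals nothing
    mapping[proxy] = ppsn
    return proxy
```

The processor authenticates users against the proxy alone; only the controller can ever map it back to a PPSN.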

The Strife of Reilly (Tone at the Top revisited)

Scanning twitter over my post-breakfast intra-planning pre-work coffee this morning I noticed tweets that were agog at a Minister for Health who is a medical doctor asking non-medical doctor political colleagues for lists of people who should have been given a medical card. The agogness was triggered by this news story on the RTE website.

Yes. It is a cause for agogness.

However my gog was a’d by one line in the middle of that story that actually links into a story covered (briefly) by Irish media yesterday. Minister Reilly has also asked for a list of names of people who have given information to the Primary Care Reimbursement Service who have had their information “misplaced”.

Only yesterday the Data Protection Commissioner was scathing in his comments about the level of “sloppiness” around the handling of personal and sensitive personal data in the Public Sector.

Today, buried in a story that was likely sourced from the Office of the Minister for Health himself, we find a disclosure that sensitive personal data and potentially personal financial data have been “misplaced” by a unit of the HSE.

However, the Minister is asking his colleagues for the names of people who might be affected. So that’s OK then.

No. It’s not.

If the PCRS has “misplaced” information that was provided to them in either electronic or hard copy form, this constitutes a breach of Section 2(d) of the Data Protection Acts 1988 and 2003. Under the Voluntary Code of Practice for Data Security Breach Notification, the HSE is required to notify the Data Protection Commissioner where there is a risk that Personal Data, Sensitive Personal Data, or Personal Financial Data have been lost, or accessed or disclosed without authorisation. The affected Data Subjects are supposed to be notified (unless the breach affects fewer than 100 people and doesn’t relate to Sensitive Personal Data or Personal Financial Data). The HSE, as Data Controller, is required to maintain a Data Breach Register for any reported incidents where the security of personal data has been put at risk. If the Minister effectively has to do a ring-around of his mates in the Dáil to find out what the scale of the problem is, that should be a bit of a worry.
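The subject-notification threshold described above boils down to a simple decision rule, which could be sketched as follows (my paraphrase of the Voluntary Code of Practice, not official text):

```python
def must_notify_subjects(affected: int, sensitive: bool, financial: bool) -> bool:
    """Paraphrase of the Voluntary Code threshold: affected Data Subjects
    are notified UNLESS the breach affects fewer than 100 people AND
    involves neither Sensitive Personal Data nor Personal Financial Data."""
    return not (affected < 100 and not sensitive and not financial)
```

By that rule, any loss of sensitive or financial data triggers notification regardless of how few people are affected.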

So. Riddle me this…

  1. Why is the Minister asking for a list of names of people whose data has been “misplaced”?
  2. Why is he asking for this list if the HSE PCRS has been maintaining a Register of incidents of reported loss of data?
  3. Why has the Minister not referred the issue to the Data Protection Commissioner?

The answer is, as ever, the Tone at the Top in the Public Service in Ireland. Unerringly it is a discordant “BLAAARRRRRRPPPPP” when it comes to matters of Data Protection. Organisational restructurings are undertaken without consideration for effective Data Governance, Information Quality, or Data Protection controls. Training in these things is seen as an overhead, not an investment. Kludged manual processes are put in place without documentation or standardisation (sure documentation takes AGES), and Ministers give undertakings to do things RIGHT NOW (immediately) rather than doing them RIGHT NOW (error proofed, proper controls, designed for efficiency, consistently executed).

This problem is not confined to the Public Sector. However the Public Sector is, as Billy Hawkes has pointed out many times, the one actor that processes our personal data who can REQUIRE us to provide information by law and which requires us to provide information to avail of key Public Services and functions.

“BLLLAAAAARRRRRPPPPP” is an insufficient response from the leadership.