Category: The Business of Information

  • TV Licence checks and “Data Protection Principles” [updated]

    This morning’s Irish Times reports that the (current) Irish Communications Minister is seeking cabinet approval for powers to enable the agency that collects TV Licences (currently An Post, the Irish post office) to access subscriber data from subscription TV providers such as Sky or UPC to crack down on TV licence evasion. We are assured by the Minister that the whole thing will be done “in accordance with strict data protection guidelines”. Ignoring for a moment that “Data Protection” is not a guideline but a fundamental right of EU citizens enshrined in law, derived from both the TFEU and the EU Charter of Fundamental Rights and implemented in Irish law as a result of an EU Directive (ergo… not a guideline but kind of a big thing to keep an eye on), what might those guidelines be?

    [Update] TheJournal.ie are reporting that this proposal has passed the Cabinet. The mechanism that is to be applied is reported as being:

    “An Post will be allowed access the subscription data held by the likes of UPC and Sky to cross-reference their subscriber databases with its own data on TV licence fee payers”

    I address the implications of this below in an update paragraph inserted in the original text. [/update]

    Guidelines

    In general Data Protection terms, once there is a statutory basis for processing (and access to data is processing) then the processing is lawful. What appears to be proposed here is legislation that will allow the subscriber data of one group of companies to be accessed by another company for the purposes of checking if someone is getting moving pictures on a telly box or similar device. So that’s the box ticked and we can move on, right? Oh, so long as we have protocols around the how, when, and why of access to the data, right (because protocols are always followed)? And of course, the legislation will prevent scope creep in terms of the use of the data and the potential sources of data that might be accessed using the legislation (e.g. telecommunications service providers who might have broadband going into a home or onto a device). Well, since April (and thanks to the great work of Digital Rights Ireland) we actually have some guidance from the Court of Justice of the European Union.

    This is guidance that Minister Rabbitte’s department should be distinctly aware of, as it affected legislation that they are responsible for: the Communications Data Retention Directive (from which the Irish Communications Data Retention Act got its authority). In that case, the ECJ was very clear: any processing of personal data needs to be proportionate to the outcome required. In the Digital Rights Ireland case, the ECJ felt that requiring the retention of call traffic and internet usage data on the off chance it might be useful to authorities to counter terrorism was a disproportionate response. Access to specific data would not be disproportionate, but wholesale data slurping was a breach of the fundamental right to data privacy as enshrined in the EU Charter of Fundamental Rights. This reasoning was followed by Hogan J in the recent case of Schrems vs The Data Protection Commissioner in the High Court, where Hogan deftly summarises the constitutional, statutory, and EU Treaty bases for Data Privacy rights in Ireland and the EU.

    The upshot is that, regardless of the existence of a statutory authority to do a particular piece of processing, the processing itself must be a proportionate invasion of an individual’s right to Personal Data Privacy and their right to Privacy – two distinctly separate rights now under EU law. So, what would be a proportionate response in this context? How big is the problem?

    The Proportionality Conundrum

    According to the Minister, 16% of households don’t pay for a TV licence. According to ComReg, 73% of households receive TV services via a subscription service. So 27% of households don’t pay for a TV service subscription, while 16% don’t have a TV licence: there are more households without a paid TV subscription than without a TV licence. It is not outside the bounds of possibility that the ENTIRETY of the 16% the Minister seeks to pursue is contained in the 27% that Sky and UPC would also love to separate from their subscriptions. Perhaps these people don’t have a television at all?
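    The point can be checked with simple inclusion-exclusion arithmetic. This sketch assumes nothing beyond the two quoted figures (16% unlicensed, 73% subscribed) and computes the possible range of overlap between the unlicensed group and the subscribing group:

```python
# Bounds on the overlap between two household groups, using only the
# Minister's 16% (no TV licence) and ComReg's 73% (subscription TV) figures.

no_licence = 0.16               # households with no TV licence (Minister's figure)
subscribed = 0.73               # households with subscription TV (ComReg's figure)
no_subscription = 1 - subscribed  # 27% have no subscription TV service

# Smallest possible share of unlicensed households that ALSO hold a
# subscription (inclusion-exclusion lower bound):
min_evaders_with_subscription = max(0.0, no_licence - no_subscription)

# Largest possible share: every unlicensed household could be a subscriber.
max_evaders_with_subscription = min(no_licence, subscribed)

print(min_evaders_with_subscription)  # 0.0  -> the entire 16% may lie inside the 27%
print(max_evaders_with_subscription)  # 0.16
```

    Because the lower bound is zero, the subscriber databases are consistent with catching none of the evaders at all, which is exactly the proportionality problem: bulk access to the data is sought without any demonstration that the data contains the target population.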

    Even assuming that the two groups are unrelated, the question arises of whether allowing An Post access to the subscriber lists of UPC and Sky is a proportionate response. It’s not. If it is not proportionate, even for serious offences, for the now defunct Data Retention Directive to allow law enforcement blanket access to telecommunications call history and internet usage data, it is probably not proportionate for a private company to have access to the subscriber lists of potential competitors (who knows what An Post might want to pivot into, given they are in the telecommunications business) for the purposes of detecting where people don’t have a TV licence.

    [Update] Based on a report on TheJournal.ie, it appears that what is proposed is an en masse cross-checking of data between An Post’s TV Licence database and the databases of Sky and UPC. This borders, in effect, on a form of mass surveillance. In my opinion, this would be unlikely to be seen as a proportionate response to the problem. This is particularly the case where alternatives to bulk access to data can achieve the same overall objective without the need for the data to be processed in this way. [/update]

    What would be proportionate would be for An Post to be able to make a request, on a case by case basis, for confirmation of whether a property which does not have a TV licence is in receipt of a subscription TV service, once it had been detected that someone resident at the address, or a business operating there, had a receiving device (i.e. a TV). Sky or UPC would simply need to respond with a “Yes they have service” or “No they do not”, with no other data being accessed.
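    To make concrete how minimal that data flow could be, here is an illustrative sketch (the names `Provider` and `has_service` are hypothetical, invented for this example, not any real system): the provider exposes only a single yes/no answer per queried address, and its subscriber list never leaves its hands.

```python
# Hedged sketch of the data-minimisation approach described above: the
# licence authority asks one yes/no question per investigated address,
# and the provider reveals nothing beyond that boolean.

class Provider:
    def __init__(self, subscriber_addresses):
        # The provider keeps its full subscriber list private.
        self._addresses = set(subscriber_addresses)

    def has_service(self, address: str) -> bool:
        # Only a yes/no leaves the provider; no other subscriber data
        # (names, packages, phone or broadband details) is exposed.
        return address in self._addresses

provider = Provider({"1 Main St", "5 Oak Rd"})

# Case-by-case requests, each tied to a specific investigation:
print(provider.has_service("1 Main St"))  # True  -> "Yes they have service"
print(provider.has_service("9 Elm Ave"))  # False -> "No they do not"
```

    The design choice is the point: a per-address boolean query is a far smaller interference with subscribers' data privacy than handing over, or cross-matching, an entire database.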

    A wrinkle though…

    One wrinkle is that Sky and UPC are not just TV service companies. They are telecommunications service providers as well. They provide home phone and broadband services. So the scope of the potential legislation is to allow a telecommunications company (An Post) access to the subscriber data of other telecommunications companies. This raises significant issues from a Data Protection perspective under SI 336, where telecommunications providers have very serious security obligations to their subscribers around notifying them of potential security issues on their network, and also notifying subscribers and the Data Protection Commissioner where there has been a breach of data security.

    It also raises the spectre of other telecommunications companies being required to provide the same data, depending on how the legislation is drafted.

    Almost inevitably, the telecommunications providers would be asked to provide data to An Post about users who were accessing particular types of services or IP addresses (e.g. RTE online services or TV3 Player, or Netflix, or similar). This is EXACTLY the type of data that the ECJ has ruled on in the Digital Rights Ireland case. Proportionality raises its head again, along with the need to avoid information security breaches on the part of the telecommunications companies being asked to provide access to their data.

    The Upshot

    At this remove I can identify a few mechanisms that would be a proportionate interference in personal data privacy rights, and would minimise the risks of unauthorised access to or disclosure of subscriber data by a telecommunications service provider.

    1. An Post would need to make their requests as part of an investigation of a specific instance of an offence, with a view to prosecution. Each request would need to relate to the investigation of a specific offence (“Mr X, at address Y, has no TV licence but has a receiving apparatus he claims is not connected to any service; please verify he is not a subscriber”). The subscription TV service providers or telecommunications service providers would simply respond with a “Yes” or “No” to the specific question. But that answer may not confirm whether they use their broadband to access streamed broadcast services. It is very easy to mask internet usage by using VPN tunnelling services, so the net may not catch all the fishes the Minister is trawling for.
    2. Another option would be to simply add the cost of the TV licence to the subscription fee for Sky or UPC television services and, potentially, to the cost of broadband services in the State. This would require zero sharing of data and a single annual transaction between the service providers and the State. It would also entirely avoid the risk of unauthorised access to or disclosure of subscriber data as a result of An Post (or any other entity) having access to subscriber data.

    (Of course, just because you have a broadband connection doesn’t mean you are watching TV programmes on your device. I have a good friend who has a very large computer monitor and watches DVDs streamed from a laptop. They have broadband for email, internet access, and work stuff. Their TV and movie viewing is entirely DVD boxed-set driven. A mechanism would be required for people in that category to opt out, unless this is a flat-rate tax on telecommunications services flying under a false flag. That is a matter for a different blog post.)

    Whatever approach is ultimately taken, it will need to constitute an invasion of data privacy that is proportionate to the problem that presents itself. THAT is the Data Protection requirement that must be met. It is not a guideline. It is the law, and it is a matter of fundamental rights.

    For the Minister to view Data Protection as a “guideline” further evidences the horridly discordant tone at the top in the Irish State about Data Protection (which I’ve written about here and here and here and here).

  • Facebook, Manipulation, and Data Protection – part 2

    Right. Having gotten some day job work out of the way I return to this topic to tease out the issues further.

    One aspect that I didn’t touch on in the last post was whether or not Data Protection exemptions exist for research and if those exemptions apply in this case. This discussion starts from the premise that EU Data Protection law applies to this Facebook research and that Irish Data Protection law is the relevant legislation.

    The Exemption

    Section 2(5) of the Data Protection Acts 1988 and 2003 provides an exemption for processing for research purposes:

    (a) “do not apply to personal data kept for statistical or research or other scientific purposes, and the keeping of which complies with such requirements (if any) as may be prescribed for the purpose of safeguarding the fundamental rights and freedoms of data subjects.”

    And

    (b) “the data or, as the case may be, the information constituting such data shall not be regarded for the purposes of paragraph (a) of the said subsection as having been obtained unfairly by reason only that its use for any such purpose was not disclosed when it was obtained, if the data are not used in such a way that damage or distress is, or is likely to be, caused to any data subject”

    The key elements of the test therefore are:

    1. The data is being processed for statistical or scientific purposes
    2. And the processing of the data complies with requirements that might be prescribed for safeguarding fundamental rights and freedoms

    This means that the exemption applies to research which is being undertaken for scientific purposes with an appropriate ethics review that has identified appropriate controls to safeguard the fundamental rights of Data Subjects, which, since the enactment of the Charter of Fundamental Rights in the EU, include a distinct right to personal data privacy. This was reaffirmed in the Digital Rights Ireland case earlier this year.

    The question arises: was the Facebook study for a scientific purpose? It would appear to be so, and in that context we need to examine whether any processing requirements were set out to safeguard the fundamental rights and freedoms of Data Subjects. That is a function of the IRB or Ethics committee overseeing the research. Cornell University are clear that the issues of personal data processing were not considered in this case, as their scientists were engaged in a review and analysis of processed data and did not believe that human research was being undertaken.

    Whether or not you consider that line of argument to be Jesuitical bullshit or not is secondary to the simple fact that no specific requirements were set out from any entity regarding the controls that needed to be put in place to protect the fundamental rights and freedoms (such as freedom of expression) that the Data Subject should enjoy.

    Legally this means that the two-stage test is passed. Data is being processed for a scientific purpose and there has been no breach of any provision set down for the processing of the data to safeguard fundamental rights, so consent etc. is not required to justify the processing and the standard around fair obtaining is looser.

    Apparently if your review doesn’t consider your research to be human research then you are in the clear.

    Ethically that should be problematic as it suggests that careful parsing of the roles of different participants in research activity can bypass the need to check if you have safeguarded the fundamental rights of your research subjects. That is why ethics reviews are important, and especially so when it comes to the ethics of “Big Data” research. Rather than assessing if a particular research project is human research we should be asking how it isn’t, particularly when the source of the data is identifiable social media profiles.

    A Key Third test…

    The third part of the test is whether or not the data is being used in a way that would cause damage or distress to the data subject. This is a key test in the context of the Facebook project and the design of the study. Consent and fair obtaining requirements can be waived where there is no likelihood of damage or distress being caused to the research subject.

    However, this study specifically set out to create test conditions that would cause distress to data subjects.

    It may be argued that the test is actually whether or not the distress would be measured as an additional level of distress that would be caused over and above the normal level of distress that the subject might suffer. But given that the Facebook study was creating specific instances of distress to measure a causation/correlation relationship between status updates and emotional responses, it’s hard to see how this element of the exemption would actually apply.

    Had Facebook adopted a passive approach to monitoring and classifying the data rather than a directed approach then their processing would not have caused distress (it would have just monitored and reported on it).

    The Upshot?

    It looks like Facebook/Cornell might get off on a technicality under the first two stages of the test. They were conducting scientific research and there was no prerequisite from any Ethics committee to have any controls to protect fundamental rights. However that is simply a technicality and it could be argued that, in the absence of a positive decision that no controls were needed, it may not be sufficient to rely on that to avail of the Section 2(5) exemption.

    More fundamentally, the direct nature of the manipulation, and the fact that it was intended to cause distress to members of the sample population, might negate the ability to rely on this exemption in the first place, which means that consent and all the other requirements of the Data Protection Acts should apply and be considered in the conduct of the research.

    The only saving grace might be that the level of distress detected was not found to be statistically large. But to find that out, they had to conduct the questionable research in the first place.

    And that brings us back to the “wibbly-wobbly, timey-wimey” issues with the consent relied upon in the published paper.

    Ultimately it highlights the need for a proactive approach to ethics and data privacy rights in Big Data research activities. Rather than assuming that the data is not human data or identifiable data, Ethics committees should be invoked and required to assess whether it is, and to ensure that appropriate controls are defined to protect fundamental rights. Finally, the question of whether distress will be caused to data subjects in the course of data gathering needs to be a key ethical question, as it can trigger Data Protection liability in otherwise valuable research activities.

  • Facebook Research, Timeline Manipulation, & EU Data Protection Law

    This is an initial post based on the information I have to hand today (1st July 2014). I’ve written it because I’ve had a number of queries this morning about the Data Protection implications of Facebook’s research activity. I’m writing it here and not on my company’s website because it is a work in progress and is my personal view. I may be wrong on some or all of these questions.

    Question 1: Can (or should) the Data Protection Commissioner in Ireland get involved?

    Facebook operates worldwide. However, for Facebook users outside the US and Canada, the Data Controller is Facebook Ireland, based in Dublin. Therefore EU Data Protection laws, in the form of the Irish Data Protection Acts 1988 and 2003, apply to the processing of personal data by Facebook. As a result, the Irish Data Protection Commissioner is the relevant regulator for all Facebook users outside the US and Canada. The key question then is whether or not Facebook constrained their research population to data subjects (users) within the US and Canada.

    • If yes, then this is not a matter for investigation by EU data protection authorities (i.e. the Data Protection Commissioner).
    • If no, then the Irish Data Protection Commissioner and EU Data Protection laws come into play.

    If Facebook didn’t constrain their population set, it is therefore possible for Facebook users outside of the US and Canada to make a complaint to the DPC about the processing and to have it investigated. However, the DPC does not have to wait for a complaint. Section 10 of the Data Protection Acts empowers the Commissioner to undertake “such investigations as he or she considers appropriate” to ensure compliance with legislation and to “identify any contravention” of the Data Protection Acts 1988 and 2003.

    [update] So, it is clear that the data was obtained from a random sample of Facebook users. Which raises the question of the sampling method used: was it stratified random sampling (randomised within a sub-set of the total user base) or random sampling across the entire user base? If the former, the sample might have been constrained; if the latter, it will inevitably contain data subjects from outside the US/Canada region. [/update]
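    The sampling distinction can be sketched as follows. This is an illustrative Python sketch with invented user records and regions, not Facebook's actual method: a sample constrained to the US/Canada stratum cannot contain EU data subjects, while a simple random sample across the whole base almost certainly will.

```python
# Stratified (constrained) sampling vs simple random sampling across a
# user base where most users are outside US/Canada. Records are invented.

import random

random.seed(42)
users = [{"id": i,
          "region": random.choice(["US/CA"] * 2 + ["EU"] * 5 + ["other"] * 3)}
         for i in range(10_000)]

# Simple random sampling across the entire user base:
simple = random.sample(users, 500)

# Stratified sampling constrained to one stratum (US/Canada only):
us_ca_stratum = [u for u in users if u["region"] == "US/CA"]
constrained = random.sample(us_ca_stratum, 500)

# The constrained sample can never contain non-US/CA subjects; the simple
# sample almost surely does, since ~80% of the base lies outside US/CA.
print(any(u["region"] != "US/CA" for u in simple))       # almost surely True
print(any(u["region"] != "US/CA" for u in constrained))  # False
```

    In other words, unless Facebook deliberately sampled within the US/Canada stratum, the probability that a 689,003-user random sample contained no EU data subjects is vanishingly small, which is what drags the Irish DPC into scope.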

    Answer: If Facebook hasn’t constrained their population to just North America (US/Canada) then… Yes.

    Question 2: If Irish/EU Data Protection Law applies, has Facebook done anything wrong?

    Tricky question, and I wouldn’t want to prejudge any possible investigation by the Data Protection Commissioner (assuming the answer to Question 1 would get them involved). However, based on the information that is available, a number of potential issues arise, most of them centred on the question of consent. Consent is a tricky issue in academic research, market research, or clinical research. The study which was conducted related to the psychological state of data subjects. That is categorised as “Sensitive Personal Data” under the Data Protection Acts. As such, the processing of that data requires explicit consent under Section 2B of the Acts. Beyond the scope of the Data Protection Acts, clinical research is governed by ethical standards such as the Nuremberg Code, which also requires a focus on voluntary and informed consent:

    The voluntary consent of the human subject is absolutely essential… and should have sufficient knowledge and comprehension of the elements of the subject matter involved as to enable him to make an understanding and enlightened decision. This latter element requires that before the acceptance of an affirmative decision by the experimental subject there should be made known to him the nature, duration, and purpose of the experiment

    Question 2A: Was Consent Required? Consent is required for the processing of sensitive personal data. For data to be sensitive personal data it needs to be identifiable to an individual and sensitive in nature. However, if the data being processed was anonymised or pseudonymised then it falls outside the scope of personal data, assuming appropriate controls are in place to prevent re-identification. The Irish Data Protection Commissioner published guidance in 2007 on Clinical Research in the Healthcare sector which addresses the question of consent, albeit from a purely clinical healthcare perspective. A key point in the guidance is that while anonymising data may remove the Data Protection question around consent, it doesn’t preclude the ethical questions around conducting research using patient data. These kinds of questions are the domain of Ethics Committees in universities or commercial research organisations. Research of this kind is governed by Institutional Review Boards (IRBs), aka Ethics Committees.

    Apparently Cornell University took the view that, as their researchers were not actually looking at the original raw data and were basing their analysis on results produced by the Facebook Data Science team, they were not conducting human research, and as such the question of whether consent was required for the research wasn’t considered. The specifics of the US rules and regulations on research ethics are too detailed for me to go into here. There is a great post on the topic here which concludes that, in a given set of circumstances, it is possible that an IRB might have been able to approve the research as it was conducted, given that Facebook manipulates timelines and algorithms all the time. However, the article concludes that some level of information about the research, over and above the blanket “research” term contained in Facebook’s Data Use Policy, would likely have been required (but not to the level of biasing the study by putting all cards on the table), and it would have been preferable if the subjects had received a debrief from Facebook, rather than the entire user population wondering if it was them who had been manipulated. Interestingly, the authors of the paper point to Facebook’s Data Use Policy as the basis of their “informed consent” for this study:

    As such, it was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.

    Answer: This is a tricky one. For the analysis of aggregate data no consent is required under DP laws and, it appears, it raises no ethical issues. However, the fact that the researchers felt they needed to clarify that they had consent under Facebook’s Data Use policy to conduct the data gathering experiments suggests that they felt they needed to have consent for the specific experimentation they were undertaking, notwithstanding that they might have been able to clear ethical hurdles over the use of the data once it had been obtained legally.

    Question 2b: If consent exists, is it valid? The only problem with the assertion by the researchers that the research was governed by Facebook’s Data Use Policy is that, at the time of the study (January 2012), there was no such specified purpose in Facebook’s Data Use Policy. This has been highlighted by Forbes writer Kashmir Hill.

    The text covering research purposes was added in May 2012. It may well have been a proposed change that was working its way through internal reviews within Facebook, but it is impossible for someone to give informed consent for a purpose about which they have not been informed. Therefore, if Facebook are relying on a term in their Data Use Policy which hadn’t been introduced at the time of the study, then there is no valid consent in place, even if we can assume that implied consent would be sufficient for the purposes of conducting psychological research. If we enter into a degree of speculation and assume that, through some wibbly-wobbly timey-wimey construct (or Kashmir Hill having made an unlikely error in her analysis), there was a single word in the Data Use Policy for Facebook that permitted “research”, is that sufficient?

    For consent to be valid it must be specific, informed, unambiguous, and freely given. I would argue that “research” is too broad a term and could be interpreted as meaning just internal research about service functionality and operations, particularly in the context in which it appears in the Facebook Data Use Policy where it is lumped in as part of “internal operations”. Is publishing psychological and sociological research part of Facebook’s “internal operations”? Is it part of Facebook’s “internal operations” to try to make their users sad? Interestingly, a review of the Irish Data Protection Commissioner’s Audit of Facebook in 2012 reveals no mention of “Research” as a stated purpose for Facebook to be processing personal data. There is a lot of information about how the Facebook Ireland User Operations team process data such as help-desk queries etc. But there is nothing about conducting psychometric analysis of users through manipulation of their timelines. Perhaps the question was not asked by the DPC?

    So, it could be argued by a Data Protection regulator (or an aggrieved research subject) that the consent was insufficiently specific or unambiguous to be valid. And, lest we forget, the processing of sensitive personal data such as psychological health, philosophical opinions etc. requires explicit consent under EU law. The direct manipulation of a data subject’s news feed to test if it made them happier or sadder, or had no effect, might therefore require a higher level of disclosure and a more positive and direct confirmation of consent than “they read the document and used the service”. There are reasons people use Facebook other than to be residents of a petri dish.

    Does this type of research differ from A/B testing in user interface design or copywriting? Arguably no, as it is a tweak to a thing to see if people respond differently. However A/B testing isn’t looking for a profound correlation over a long term between changes to content and how a person feels. A/B testing is simply asking, at a point in time, whether someone liked presentation A of content versus presentation B. It is more functionally driven market research than psychological or sociological analysis.

    Answer: I’d have to come down on the negative here. If consent to the processing of personal data in the manner described was required, it is difficult for me to see how it could be validly given, particularly as the requirement is for EXPLICIT consent. On one hand, it appears that the magic words being relied upon by the researchers didn’t exist at the time the research was conducted. Therefore there can be no consent. Even assuming some form of fudged retroactivity of consents given to cover past processing, it is still difficult to see how “research” for “internal operations” purposes meets the requirement of explicit consent necessary for psychological research of this kind. It differs from user experience testing, which is more “market research” than psychological, and is therefore arguably subject to a higher standard.

    Question 3: Could it have been done differently to avoid Data Protection risks?

    Short answer: yes. A number of things could have been done differently.

    1. Notification of inclusion in a research study to assess user behaviours, with an option to opt-out, would have provided clarity on consent.
    2. Analysis of anonymised data sets without directed manipulation of specific users timelines would not have raised any DP issues.
    3. Ensure validity of consent. Make sure the text includes references to academic research activities and the potential psychological analysis of user responses to changes in the Facebook environment. Such text should be clearly highlighted and, ideally, consent to that element should be by a positive act, either opt-in (preferred) or opt-out.
    4. Anonymise data sets during study.
    5. Restrict population for study to US/Canada only – removes EU Data Protection issues entirely (but is potentially a cynical move).

    Long Answer: It will depend on whether there is any specific finding by a Data Protection Authority against Facebook on this. It does, however, highlight the importance of considering Data Protection compliance concerns, as well as ethical issues, when designing studies, particularly in the context of Big Data.

    There have been comparisons between this kind of study and other sociological research, such as researchers walking up to random test subjects and asking them to make a decision subject to a particular test condition. Such comparisons have merit, but only if we break them down to assess what is happening. With those studies there is a test subject who is anonymous, about whom data is recorded for research purposes, often in response to a manipulated stimulus to create a test condition. The volume of test subjects will be low. The potential impact will be low. And the opportunity to decline to participate exists (the test subject can walk on by… as I often did when faced with undergrad psychology students in University).

    With “Big Data” research, the subject is not anonymous, even if they can be anonymised. The volume of test subjects is high. Significantly (particularly in this case), there is no opportunity to decline to participate. By being a participant in the petri-dish system you are part of the experiment without your knowledge. I could choose to go to the University coffee shop without choosing to be surveyed and prodded by trainee brain monkeys. I appear to have no such choice with Data Scientists. The longer answer is that a proper consideration of the ethics and legal positioning of this kind of research is important.

  • Stand up for Digital Rights, Ireland.

    In the Western world our rights are under attack. In the UK, for example, the policy of the Tory party is to abolish the Human Rights Act (http://www.bbc.co.uk/news/uk-politics-21726612). In the fast-changing world of data and information, private companies and governments alike go to great lengths to peer inside our digital lives, in a manner often disproportionate to or ineffective for the stated purposes of ‘national security’ or copyright enforcement. The revelations over the summer from Edward Snowden, and a variety of other stories relating to the use, misuse, and abuse of our private personal data by companies and governments alike, have resulted in Dictionary.com making “Privacy” its Word of the Year for 2013 (http://blog.dictionary.com/privacy/).

    Last year saw the Irish Government, in its presidency of the European Union, preside over a significant watering down of rights and protections for individual data privacy in the proposed EU Data Protection Regulation. This regulation was subject to 4,000 proposed amendments and one of the most intrusive lobbying campaigns by organisations seeking to reduce the protections over personal data privacy afforded to EU citizens. But last year also saw Digital Rights Ireland punch significantly above its weight on the European stage, with their appeal to the ECJ on the retention of telephone, SMS, and internet usage data by telecoms companies on behalf of governments – precisely the same information that was at the centre of Snowden’s PRISM disclosures.

    Digital Rights Ireland plays a valuable role in the evolution of our personal digital rights, particularly as we struggle to define where the line must be drawn between an Information Economy, where the users of services are the means of production, and an Information Society, where powerful tools for communication and interaction allow us to engage, but also to wear a mask or withdraw to our personal fortresses of solitude, where we can define and redevelop our sense of self as people. Not as products.

    However, DRI suffered one setback in 2013 which puts at risk their ability to stand up for our rights, your rights, in an Information Society. They were on the losing side in litigation about copyright issues. Their role in the case was to be a counterpoint voice for the people and to bring additional information and perspective to the Court. The impact: the music industry sought costs of the guts of €30,000 against DRI for one day in Court. This was reduced to €13,000 on appeal to the Taxing Master. No other party to the case is seeking costs against DRI.

    The risk now is that DRI might be liquidated by the music industry representatives. For standing up and suggesting that alternative solutions might be needed, for pointing out how easily web filtering is circumvented, and for basically being a devil’s advocate on the side of the individuals who make up our society.

    Money must be found. DRI runs on a shoestring, favours, and jellybabies. There is no salary for its directors, no top-ups, no big dinners or extravagant radio adverts. Just people who care and give up time from their day jobs to provide a voice for Digital Rights. That voice will fall silent if they cannot raise the €13,000 needed as soon as possible.

    It is time to stand up for Digital Rights, Ireland. Rather than buying a data-slurping tablet in the sales, or downloading another privacy-invading smartphone app/tracking device, go to www.digitalrights.ie and check out what they do for you. Then go here (http://www.digitalrights.ie/support-us-in-2014/) to learn more about their problem. Then go here (http://www.digitalrights.ie/support/) to donate, either a once-off payment or a recurring donation.

    And if you don’t, you risk waking up one day as just another unit of production in an Orwellian dystopia.

  • DPC, Prism, Safe Harbor and stuff

    The Irish DPC has come under fire in the international media on foot of their failure to act on a complaint by Europe v Facebook about US multinationals with bases in Ireland allowing data to be accessed by the NSA.

    The gist of EVF’s complaint is that this access invalidates Safe Harbor and therefore makes the transfer of data by these companies to the US illegal.

    EVF may indeed be right. The key two-legged test to be passed is whether the access by law enforcement/national security agencies to the data that is being transferred is necessary for the national security/law enforcement purpose, and whether the access/processing is in turn proportionate to the objective when balanced against the fundamental right to privacy.

    Prism and similar programmes quite probably fail either or both legs of that test. Certainly the ECJ seemed to be very concerned with whether European governments had done enough to demonstrate necessity and proportionality with regard to EU communications data retention (http://www.contentandcarrier.eu/?p=435).

    This is the ECJ case that the Irish DPC refers to in the written response to Europe-v-Facebook.

    Safe Harbor is a scheme entered into by the European Commission and the US Dept of Commerce to facilitate transfers of data to the US. It is decidedly imperfect and has been the subject of criticism since it was introduced in 2000.

    It is one of the mechanisms under which organisations can transfer personal data outside the EEA (28 EU member states plus Norway, Iceland & Liechtenstein) under S11 of the Data Protection Acts.

    S11 does give the DPC the power to prohibit such transfers in certain circumstances. The DPC needs to be of the view that data protection rules are likely to be contravened and individuals are likely to be harmed as a result. This power is limited in that it does not apply where the transfer is required or authorised by law.

    And here’s the rub:

    • Safe Harbor is a scheme that authorises the transfer. So the DPC can’t unilaterally prohibit the transfer of data where Safe Harbor is being applied.
    • The Irish DPC does not have statutory authority to second-guess the EU Commission on the legality of Safe Harbor.
    • PRISM is, at this time, understood to have a statutory basis in the US, and no court has yet ruled on the necessity and proportionality of its data gathering, so there is no breach of Data Protection rules per se. If the ECJ gives guidance on similar EU laws this could alter things.

    In short, the Irish DPC’s hands are probably tied by the law.

    Billy Hawkes lacks the legal authority to rule on the validity of Safe Harbor, so while transfers under Safe Harbor are valid in the EU Commission’s eyes he probably can’t prohibit a transfer that is based on Safe Harbor. That is probably for the EU Commission to do.

    Nor is he empowered to make a finding of fact against the NSA regarding the necessity and proportionality of their processing (that’s for the US courts, or for the EU Commission to adopt as part of their review of Safe Harbor). He will, however, be bound by whatever principles of proportionality and necessity for communications metadata processing emerge from the ECJ Data Retention Directive case. That case, in my view, is likely to be more of a steer to the EU Commission regarding the controls that would be required in a “Son of Safe Harbor” than an empowerment of the DPC to torpedo Safe Harbor himself.

    I suggest that it is this reasoning which the German DPAs have applied in their action: it has had the effect of prohibiting transfers in scenarios where they had direct competence, but otherwise served only to send up a warning flare that Safe Harbor and Model Contract Clauses might be broken. The DPAs lack the statutory competence to actually do anything about that; it must be addressed by the Commission.

    Rather than “regulator fails to enforce law”, this story is more correctly “regulators hampered by a broken law unsuited to the modern age”.

  • The DPC, Prism, and the Tech Giants (updated)

    Europe v Facebook has issued a press release today decrying the failure of the Irish DPC to find fault with the reliance on Safe Harbor by US technology companies in the transfer of personal data of EU citizens to the US where it fell into the net of PRISM.

    The soundbite-friendly position EVF is taking is that the Irish DPC is kowtowing to economic interests in not pulling the plug on Safe Harbor as German DPAs have done.

    However, I would suggest that the position is slightly more nuanced than that. The key test that needs to be met for the national security/law enforcement exemptions to Safe Harbor is one of necessity and proportionality of the invasion of privacy, set against the national security/law enforcement requirement.

    The EU currently has a Data Retention Directive. It is law in most EU member states, but is currently subject to an action in the Irish High Court, which has referred questions to the ECJ. Those questions ultimately rest on issues of necessity (i.e. is it necessary to retain the metadata of every call, web access, email, and SMS sent over a comms provider in the EU, and if it is necessary, is it proportionate to do so for EVERYONE compared to the actual risk/objective?).

    This ECJ action is referred to explicitly by the DPC in their response to EVF.

    In the absence of a ruling in that case, or a decision by the EU Commission that PRISM constitutes an unnecessary and disproportionate intrusion under Safe Harbor, the DPC is acting, in my view, in line with the law that is in front of him.

    But the Germans have pulled the plug, I hear you cry! Yes. They have – to a point. But the German Constitutional Court has also struck down their national implementation of the EU Data Retention Directive. So the law in Germany is slightly, but significantly, different.

    But this awkward disjunction of laws highlights the need for improved standardisation of Data Protection laws in Europe and an improved collegiate operating structure for DPAs. This is part of what the revised General Data Protection Regulation was to deliver.

    It also highlights the questionable justification for double standards for law enforcement, as illustrated by the existence of the parallel revised Directive on Data Protection for EU law enforcement agencies, which differs from the draft Regulation.

    As a childhood (and adult) fan of the classic TV show “Yes, Minister” I’m minded to give the DPC some benefit of the doubt in their position, as it would be preferable for there to be an EU bloc position on Safe Harbor rather than piecemeal action. That requires either EU Commission termination of Safe Harbor, on the grounds of inappropriate and unnecessary intrusion, or a ruling from the ECJ that defines those rules in an EU context in respect of our own data-sucking activities.


    After a little digging, it turns out that the position of the German DPAs doesn’t differ all that much from the Irish position. They actually haven’t suspended Safe Harbor, just called on the European Commission to clarify how PRISM and similar programmes are compatible with EU privacy principles.
    http://www.huntonprivacyblog.com/2013/07/articles/german-dpas-halt-data-transfer-approvals-and-consider-suspending-transfers-based-on-safe-harbor-eu-model-clauses/
    What is suspended are transfers on any basis other than model contract clauses or Safe Harbor.

    So, in effect, the German DPAs have kicked the ball back to the European Commission in a manner similar to the Irish DPC, though without mentioning the significant pending ECJ case.

    That is not to say that I am thrilled with how it has been handled. The DPC should have issued a formal decision setting out their position, so that EVF could appeal against it in Court. That would be an interesting case to see, and I suspect many of the arguments that would need to be put forward have already been drafted in respect of Digital Rights Ireland’s High Court and ECJ actions.

    Of course, I don’t rule out the possibility of an overworked, under-resourced Data Protection authority making an error in their assessment of the legal position. And, unfortunately, given the discordant “tone at the top” from Alan Shatter on matters Data Protection, the political landscape Billy Hawkes must navigate is challenging.

    This will get very interesting I suspect.

    (And I’ve left the question of whether the Irish DPC even has the powers under the domestic legislation to do what EVF are requesting for another day.)

  • Buying back the mortgaged off

    Today’s Irish Times has a ‘news’ story about a man who, during the boom, sold his home and land for €3 million and has just bought it back for €215,000.

    Fair play to him. He sold a property and home he loved and made a profit. Now he can have his cake and eat it, returning wealthier to the same home and hearth.

    The same, unfortunately, is not true of protections for fundamental human rights. In the current economic turmoil it is tempting to mortgage them or sell them off in the interests of supporting business and reducing red tape. However, when the economy recovers it will probably be impossible to push the pendulum back towards respecting the rights we have forgone in the interests of economic expedience. We will have a recovered economy but a diminished society.

    This is what is happening with the EU Data Protection Regulation. Earlier this month the Irish Government, in one of the last acts of their EU Presidency, trumpeted their ‘victory’ on the first four chapters of the Regulation, securing a quasi-agreement to introduce a level of protection that has been watered down to near homeopathic levels. Whatever good is in some of the proposals is horribly undermined and hollowed out by the move to a purely “risk based” model of regulation (similar to that which has worked so well in Financial Services), amongst other things.

    I’ve written about that in detail here with Fergal Crehan.

    Principles diluted do not retain the memory of the principle. Homeopathic regulation doesn’t work. The parts of the Regulation that might have served to retain focus and concentration were the sections around enforcement and penalties.

    Today we learn via a leaked document that these sections have likewise been diluted to homeopathic levels by the Irish EU Presidency (again, annoyingly, in tandem with some good and positive changes).

    • The specific levels of fines to be levied have been omitted from the document (Dr. Chris Pounder on the Hawktalk blog suggests this may be due to there being no agreement; my view is that if it has been taken out, whatever is put back in will be a lot less attention-focussing than the 2% of global turnover levels previously proposed).
    • A range of mitigating factors and considerations have been introduced which must be considered by a Data Protection Authority before levying a penalty of any amount. 13 different factors to be considered. One for every tooth a Regulator might have had. One more line of defence to be argued over before enforcement can commence.

    So, errant Data Controllers may now be in a position where they can self-assess their risks based on their own perception of the risk and impacts of their actions (just like people of a certain generation used to self-assess whether they were sober enough to drive), but just in case they get it horribly wrong the hoops a Regulator will have to jump through before being able to levy any form of meaningful penalty have grown in number and vagueness.

    This is the textbook definition of light-touch regulation. History has shown repeatedly, and at great cost, that this simply does not work.

    The man in the newspaper today bought back his old family home and made a tidy profit because of a catastrophic failure of culture, governance, and regulation. Rules around due diligence and proper management of lending were set aside or worked around because it was “good for business”.

    We must learn the lessons of history or we will have mortgaged our rights to be “left alone” in the interests of economic expedience and only those who held on to their financial muscle in this crisis will be able to make the payment needed to buy back that right through the Courts.

    An appropriate balance must be struck between the economy and the society.

  • An Op-Ed about Data Protection

    Fergal Crehan and I drafted the original version of this op-ed piece on the evening of the 5th of June, completing it on the 6th and submitting it immediately to the Irish Times as a topical opinion piece. The article was originally drafted in response to the EU Council of Ministers publication of proposed amendments to the EU General Data Protection Regulation that would significantly undermine the protections awarded to individuals and their data under EU law.

    It wasn’t published (but them’s the breaks as they say).

    I’ve updated it to include reference to the PRISM and Tempora stories that were just beginning to break the week the original piece was drafted. I’ve also included references to some anti-data protection stories that have appeared in the Irish Times since the beginning of June, and a nod to the legacy of light-touch regulation and associated attitudes that has recently emerged in the Irish press.

    I took the decision in consultation with Fergal to publish this here as the points that are raised are important ones regarding the nature of the society we want to live in. The failure of the Irish Times to fact check recent stories raises a further question as to the role of a neutered as opposed to neutral press in the definition of and shaping of that society.

    Journalists more than anyone should be alert to, and resistant to, any efforts to dilute or invade privacy, because it is only where there is privacy that there is the freedom for sources and whistleblowers to express privately (to journalists) facts that should be made public by the media. The logging of data about what numbers you dial, when, and where from, and the uses that data can be put to, could conceivably jeopardise sources, result in stories that need to be told being silenced, and force public and private conformity with a “party line” regardless of consequences. “All the President’s Men” would have been a significantly different movie if Nixon had had access to a Minority Report level of analytics about who called whom, and who was where, when – which is possible today.

    A Free Press should be concerned in equal measure about attacks on the freedom of expression and on the right to Privacy. This is why Data Protection should be a hot topic of relevance, not a dry techie story of limited interest. Responsible journalists need to inform themselves of the rights that exist, the ways those rights are being undermined, and how those rights are under threat.

    A skewed balance struck

    For some years now, the EU has been preparing a regulation to update and standardise data protection law in Europe. The expectation was that the rules would be strengthened, giving citizens more protection against misuse of their information. It was a shock, then, when the Irish Presidency brought forward a draft regulation which not only dilutes many of the original proposals of the EU Commission, but represents a neutering of many data protection rights enjoyed up until now.

    Data protection is a human right, closely bound up with privacy, and is unsurprisingly taken especially seriously by those European countries whose citizens suffered under Nazi or Soviet police states, or even both. It is the right not to have your personal information hoarded, sold, disclosed or otherwise misused. “Data Protection” may not stir passions like other rights do, but in an increasingly data-driven world its importance cannot be overstated. We are already at risk of a two-tier privacy system, where the rich and famous can go to court for super-injunctions, while Joe Citizen cannot sit peacefully at home without the phone ringing with unwanted direct marketing calls.

    Ireland has had the privilege of shepherding the revised Data Protection rules through the process of negotiation and agreement. The vision set out by the European Commission in its initial drafts was to provide a simplified regulatory structure for business and strengthened rights for individuals over how, where, and why information about them is processed, and by whom. This vision became the subject of one of the most intensive lobbying campaigns by US firms ever seen in the EU.

    In February it emerged that amendments tabled by a group of MEPs that diluted the protection of personal data were copied verbatim from the submissions of these lobbyists. Sean Kelly, the Irish MEP responsible for those amendments, recently received an award from an advertising industry group for his work. The Council of Ministers recently issued a set of proposed changes to the Regulation that are being touted by Alan Shatter, the outgoing President of the Justice and Home Affairs Council, as providing “better protection for citizens” while also “providing a better strategy and architecture for business”.

    However, privacy advocates have highlighted that while the proposed changes are good for business they are a serious weakening of protections EU citizens have historically enjoyed. Advocates in favour of the proposed changes cite the importance of data in the modern economy and the potential for jobs.

    But are we building an economy or a society? In a speech this week President Michael D. Higgins told us that the EU is a “union of citizens” and that the institutions of the EU must work to protect those citizens. The proposed Regulation weakens those very protections.

    The proposed changes introduce a “risk based”, self-regulation approach. This seems not unlike the “light touch” regulation which was adopted in order to attract financial services companies to Ireland, and which fuelled the financial services boom. With our government now keen to attract more data-based firms like Facebook and LinkedIn to Ireland, it seems lessons of recent history are not being learned. And in the week of the Anglo Tapes it is more important than ever that we learn these lessons.

    This approach has been hailed as “non-prescriptive”. But a regulation that doesn’t prescribe anything is a mere suggestion, which can and will be ignored unless there are adverse consequences. Ireland’s Data Protection Commissioner is chronically underfunded, but he can and does bring prosecutions for breach of the Data Protection Acts. It is difficult to see how these kinds of criminal convictions could be achieved under the proposed regulation.

    Under the proposed Regulation, if your personal data is lost or stolen, the decision about whether to tell you will be left in the hands of the people who lost the data. This effectively means that there will be no right to know when your personal information is lost.

    Last year Target, the US supermarket, broke the news to a father that his teenage daughter was pregnant by sending her unsolicited targeted adverts for baby products. Current laws make this potentially illegal in Europe. However, direct marketing rules are to be changed under the proposed Regulation. Companies would no longer need your permission to market to you once they have obtained your data. This is an extraordinary win for the marketing lobby, a turn from a right to privacy, to a right to invade privacy. The telemarketer, a scourge familiar to any American with a phone, is set to become an unwelcome part of our daily life too.

    The recent revelations of unfettered and covert surveillance of the private communications of every individual in every country by US and UK intelligence services have highlighted the risks of the Panopticon. Some argue that if you have nothing to hide you have nothing to fear. But that flies in the face of our fundamental values: everyone has a right to a place where they can have private thoughts and private communications. These rights are under attack and must be defended.

    But at a smaller scale, recent articles in the Irish Times have linked Data Protection rules with inefficiencies in the Ambulance service which have contributed to deaths. ‘Data Protection rules mean we can’t use GPS for ambulances’ was the claim. Bunkum is the answer. Such processing is permissible under Section 8 of the Data Protection Acts. ‘Data Protection rules will curtail genealogy’ was another claim. Again, bunkum. The draft Regulation will likely apply only to living persons, Public Registers will have certain exemptions, and the Right to be Forgotten is not a right to be airbrushed from history, as has been made clear by Commissioner Reding on many occasions, and has been made clear by the ECJ in the past week.

    Data is hailed as “the new oil”. “Big data” is mined to predict everything from musical taste to voting habits. It is disturbing when rights, once considered uncontroversial, are watered down or neutralised because it has become profitable to do so. What is proposed in this draft of the Regulation is something unprecedented in the history of the EU – the effective abolition of a human right enshrined in EU Treaties. As citizens, we can only wonder and worry which other human rights will become inconvenient to big business, and what their fate will be.

  • My email to Irish Times Editor, sent 25th June

    Below is the text of an email I sent to the Irish Times editor on the 25th of June. The email was received by the Irish Times systems but I have had no response. Hugh Linehan engaged on Twitter, but only to refer me to the Editor. I’ve published the letter here for wider reference. Readers might want to check out posts by Fergal Crehan and myself here and on fergalcrehan.com.

    [update] I feel that this email raises important questions, particularly in light of the article on lobbyists and astroturfing and the EU Data Protection Regulation in today’s Financial Times.[/update]

    Dear Mr O’Sullivan

    Over the past few days a number of stories have appeared in the Irish Times purporting to highlight important Data Protection issues. In all cases the reporting has been at best incomplete, with no validation of claims made or any attempt to present counterpoints or other relevant facts, and at worst a simple retreading of a press release without any apparent fact checking or questioning of the information being spoonfed to the correspondent.

    On the 22nd June the Irish Times ran a story headlined “Ambulances unable to use GPS tracking” which drew a connection between alleged Data Protection restrictions and the death of a child. http://www.irishtimes.com/news/ambulances-unable-to-use-gps-tracking-1.1438980. The statement of data protection law contained in this article was incorrect. A number of sections of the Data Protection Acts specifically allow for processing of and disclosure of personal data, particularly where there is a risk to the safety, life, or health of an individual.

    A cursory Google search or request for comment to either the Office of the Data Protection Commissioner or a specialist in Data Protection law and practice such as myself could have clarified this. Specifically disclosure of/processing of GPS data would be permitted under Section 8(d) and Section 8(f) of the Data Protection Acts in the case of an emergency services requirement.

    As someone with experience in the telecommunications sector and in Data Protection issues, I can say there are other, more fundamental, problems with real-time GPS tracking; unfortunately, life is not like an episode of CSI, where there is perfect information available in perfect real-time with perfect accuracy. This could and should have been reflected in the article. The real barrier to accurate dispatch of ambulances is the failure of successive governments to roll out a post-code system or equivalent address identification system that would allow for more granular and accurate location of addresses. An Post’s GeoDirectory (which is the de facto standard for address validation) is designed for postal delivery, not ambulance dispatch. Post codes were to be implemented in 2008.

    On the 24th June the Irish Times ran a story headlined “EU Regulation could restrict genealogical research” http://www.irishtimes.com/news/eu-regulation-could-restrict-genealogical-research-1.1440075, which reported that the revised EU General Regulation on Data Protection could restrict access to parish records and other genealogical data such as registers of births, marriages, and deaths.

    Again, this is utter bunkum. The EU Data Protection Regulation is unlikely to apply to deceased persons (as is the case with the current Irish Data Protection legislation, which excludes the deceased, though this is not the case in some other EU countries). Furthermore, the Right to be Forgotten has been defined and discussed thus far in a manner circumspect enough to exclude Public Registers such as parish records or Registries of Births, Marriages, or Deaths. Yes, Data Protection rules will and do apply to genealogists working with data relating to living people, but only insofar as requiring that the data not be used for other purposes and that obligations to keep data safe and secure are met.

    While I acknowledge that the EU Data Protection Regulation is not yet finalised, I would submit that this makes it even more important that reporting on actual or potential future trends in EU Data Protection law and the rights of citizens be balanced, with facts and assertions checked and validated.

    Today (25th June) the Irish Times business section ran a story heralding that the ASAI would be introducing rules requiring organisations using online advertising behaviour tracking to provide notice of this from September. http://www.irishtimes.com/business/sectors/media-and-marketing/firms-to-give-notice-if-collecting-online-data-for-ads-1.1440129 This appears at first glance to be a good news story about self-regulation in the Internet Advertising industry, with the Interactive Advertising Bureau holding a consumer awareness campaign from today.

    However the ASAI’s rules merely reflect what the law of the land ACTUALLY IS AS OF JULY 2011. Under SI336 organisations making use of cookies or similar on-line tracking are required to disclose this fact and secure explicit consent, particularly where that tracking will take place across multiple websites. Unlike the ASAI’s non-statutory enforcement powers, SI336 is enforced by the Data Protection Commissioner’s Office with breaches warranting penalties of up to €5000 on summary conviction or €250,000 on indictment.

    Again, a simple fact check on this story would have highlighted the existence of this legislation and raised questions about why the ASAI is suddenly taking an interest in cookies. It would, of course, have highlighted that the Irish Times was one of a number of organisations contacted by the Data Protection Commissioner last year with regard to compliance with SI336 http://www.dataprotection.ie/viewdoc.asp?m=&fn=/documents/press/listwwebsites.htm

    So is the real story here not why the ASAI, with limited enforcement powers, feels it is important to step into the policy and enforcement role of the Office of the Data Protection Commissioner, rather than simply ensuring its members comply with what is required under a law that is two years old? Is the DPC grinding to a halt? Is the Advertising Industry attempting to put lipstick on the pig that is self-regulation? Why is the ASAI seeking to confuse people about whom to complain to about breaches of the Cookies Regulations (them, the DPC, or both)? Why?

    There is a worrying pattern in these stories. The first two decry the Data Protection legislation (current and future) as being dangerous to children and damaging to the genealogy trade (a Fr Ted-like “Down with this sort of thing” positioning). The third sets up an industry “self-regulation” straw man and heralds it as progress (when it is decidedly not, serving only to further confuse consumers about their rights).

    If I were a cynical person I would find it hard not to draw the conclusion that the Irish Times, the “paper of record”, has been stooged by organisations who are resistant to the defence and validation of the fundamental rights to privacy enshrined in the Data Protection Acts and EU Treaties, and in the embryonic Data Protection Regulation. That these stories emerge hot on the heels of the pendulum swing towards privacy concerns triggered by the NSA/PRISM revelations is, I must assume, a coincidence. It cannot be the case that the Irish Times blindly publishes press releases without conducting cursory fact checking on the stories contained therein?

    Three stories over three days is insufficient data to plot a definitive trend, but the emphasis is disconcerting. Is it the Irish Times’ editorial position that Data Protection legislation and the protection of fundamental rights are a bad thing, and that industry self-regulation operating in ignorance of legislation is the appropriate model for the future? It surely cannot be that press releases are regurgitated as balanced fact and news by the Irish Times without fact checking and verification? If I were to predict a “Data Protection killed my Puppy” type headline for tomorrow’s edition, or another later this week, would I be proved correct?

    Attached is an updated copy of an op-ed piece on Data Protection reform I submitted in collaboration with Fergal Crehan BL earlier this month (06/06/2013). It remains unpublished. If it helps, I’ll dress it up as a Press release and send it to the news desk instead.

    Yours

    Daragh O Brien

  • More unchecked Data Protection guff in the media

    Today’s Irish Times carried a story in the Business section that the ASAI, self-described on their website as the "self regulatory body" for the advertising industry in Ireland, have issued guidelines on the use of cookies in behavioural advertising which will come into effect from September.

    Great news, but for a few minor facts that seem to have eluded the fact checking doubtless done by the journalist taking the by-line.

    • The ASAI is a voluntary self-regulatory body. It is not a statutory agency.
    • The use of cookies, especially for online behavioural advertising or tracking, is covered under SI 336, a piece of Data Protection legislation that came into effect in July 2011 (i.e. 23 months ago).
    • The DPC has already begun enforcement proceedings to encourage compliance. Among the organisations written to late last year was the Irish Times.

    So, the ASAI is essentially claiming credit for encouraging its members to comply with the law of the land more than two years late. This is presented unquestioningly in the article as a "good thing" being done by a responsible self-regulating body. But the ASAI is simply moving to bring its members into line with the law. Late.

    In doing so they muddy the waters for consumers by making it seem that they are the entity to complain to (they’re not – it’s the DPC, which can pursue actual criminal penalties and fines). While the ASAI’s move to regulate the online data-gathering practices of its members is laudable, responsible journalism would have pointed out that this is simply what the law already requires, not a proactive industry response.

    "Look at us! Self regulation can work!" is the implied message. (That’s exactly the message by the way that has emerged from lobbyists who campaigned to dilute the protections for individual rights in the Draft EU Data Protection Regulation, and also the message that was trotted out in other industries in recent years with less than stellar results).

    Taken in combination with a number of "data protection kills puppies" stories that the Irish Times has been running recently, one can’t help but form the view that, in the absence of proper fact checking by journalists, someone is st00ging the Irish Times and distorting the paper of record.

    After all, this publication of unchecked errors as fact couldn’t possibly be editorial policy (could it?)