Doorstep Dispensaree

This entry is part 1 of 1 in the series Lessons Learned

Since it’s a New Year, and an opportunity presented itself, I’m trying something new. In this series, if it continues, I will be looking at various incidents and pulling out the lessons we can, should, or must learn in security. This first article looks at the first penalty levied by the ICO under GDPR, against Doorstep Dispensaree.

There are a few lessons to pull out of this case, but for those who want to look at the details themselves, the full ICO penalty notice is available on the ICO’s website.

The ICO identified a number of failings after being called in by the MHRA (the Medicines and Healthcare products Regulatory Agency), who had found some curious containers while executing a search warrant for their own investigation. In a courtyard behind the premises they found:

  • 47 unlocked crates
  • 2 disposal bags
  • 1 cardboard box

These unsecured containers, stored outside, contained around half a million documents – some water damaged – holding personal information, including medical and prescription information, dating from 2016 to 2018. Shortly after being informed, the Commissioner sent a request for additional information, a list which will be familiar to anyone who has banged their head against the GDPR enough: data retention and disposal policies, the privacy notice, an explanation of why some data had been retained since 2016 and stored in this way, and various other standard pieces of evidence.

In response, Doorstep Dispensaree did not cover themselves in glory, denying all knowledge of the matter. Things escalated when they then refused to answer questions following a second request, apparently confusing the ICO and MHRA investigations. After an Information Notice requiring the information was issued, appealed, and upheld, they provided about half of the information, claiming protection under the DPA 2018 on the grounds that providing the rest would expose them to prosecution by the MHRA.

As part of their documentation, they did kindly provide the National Pharmacy Association template Data Protection Officer Guidance and Checklist, and Definitions and Quick Reference Guide. Other documents did not mention the GDPR at all, and the templates were the unmodified originals from the National Pharmacy Association.

Lessons

In all, Doorstep Dispensaree were found to have contravened Articles 5, 13, 14, 24 and 32 to some degree or another. It’s a good thing for them that no information was stolen, as the ICO would have been unlikely to look kindly on them if anything had happened – especially since, given the serious compliance failures, it is unlikely any notification of the breach would have been forthcoming until the MHRA investigation flagged it up, and certainly not within the required 72 hours.

You cannot delegate responsibility for data you control

Doorstep Dispensaree fairly clearly failed to dispose of the data securely, as some of it dated from 2016. They attempted to have the penalty assigned to a waste disposal company, blaming it for not having picked up the waste, an argument the ICO dismissed. Not only was no evidence provided that the company had actually been contracted, but Doorstep Dispensaree had also failed to implement their own claimed shredding procedures – and in any case, as the data controller, they are ultimately accountable for the breach.

Information security is more than just confidentiality

Apparently storing sensitive personal information in unlocked boxes in an accessible yard is not considered secure storage or processing. As this is fairly obvious, the main point of interest here is that the ICO picked up on water damage to the documents being another failing. In information security terminology, there was a failure to ensure integrity and availability along with confidentiality. There is more to information security than simply locking data away.

Policies do not work retroactively

The ICO commented that Doorstep Dispensaree did eventually provide a more comprehensive set of policies. Unfortunately, many were still in template form and had been acquired in response to the investigation. The lesson here is that once an investigation has started, it’s probably too late to start downloading policy templates – best to put a framework in place beforehand.

Only keep data that needs to be kept

While the ICO were generous enough to consider only ongoing infringement from May 2018, when the GDPR came into force, they remarked that the age of the data caused some concern about retention.

Be clear on your privacy notice

Your privacy notice must state who the controller is and how to contact them, the nature of the processing and its lawful basis (and, for special category data, the additional basis for processing), the categories of personal data you collect and work with, all parties involved in processing the data, how long it is retained, the rights of the data subjects, and where the personal data is collected from. It must also be written in clear, unambiguous language and be freely available to data subjects.
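
As a purely illustrative sketch, those requirements can be treated as a checklist when reviewing a draft notice. The section names below are my own shorthand, not legal terms:

    # Illustrative only: what a privacy notice must cover, per the
    # requirements above. Section names are my own shorthand.
    REQUIRED_SECTIONS = [
        "controller_identity_and_contact",
        "processing_nature_and_lawful_basis",  # plus the additional basis for special category data
        "categories_of_personal_data",
        "parties_involved_in_processing",
        "retention_period",
        "data_subject_rights",
        "where_data_is_collected_from",
    ]

    def missing_sections(draft_sections: set[str]) -> list[str]:
        """Return any required sections a draft notice has not yet covered."""
        return [s for s in REQUIRED_SECTIONS if s not in draft_sections]

    # A draft covering only two of the sections still has five gaps to fix.
    print(missing_sections({"controller_identity_and_contact", "retention_period"}))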

A GDPR breach does not require that data has been lost or stolen

What’s particularly interesting about this case, compared to many of the headlines about cyber-related data protection incidents, is that no data was stolen. The fact that the failings could have led to distress and damage to individuals is sufficient.

Trying to improve counts

The improvements Doorstep Dispensaree claims to be making were taken into account, even though some of the policy documents presented were still in template format.

If you are found in breach, co-operate

It is very clear that the lack of early co-operation did more to hurt than help Doorstep Dispensaree’s case. Later co-operation and attempts to improve were taken into account, and the original proposed penalty of £400,000 was finalised as £275,000.

‘Insider’ Threat with Deepfakes

I have been neglecting this blog recently as finishing off my master’s thesis (now done, passed and everything), working on various tasks in a new role, and finishing off a contributed book chapter have been priorities. With two of those three taken care of I am hoping to have time to write a bit more on here again.

In that spirit, and as a nice piece of synergy with the book chapter, a quick word about an incident reported today. I predict it will be the first of many.

Impersonation of senior executives via e-mail has become a well-established, and profitable, practice among both organised and disorganised cyber criminals. Some of the most advanced attempts go to extreme lengths of surveillance and co-ordination, while even the most basic mass attempts have seen some success. Whether to include this as an insider threat is more a philosophical debate than one where we can draw a clear line, but I choose to include it.

With the advent of Deepfake algorithms it was inevitable that impersonation would go beyond email. Deepfake pornography created for blackmail or revenge purposes appeared shortly after the technology became available. Using it for impersonation has mainly been for entertainment purposes, until recently. As far as I know, this is the first published case outside of fiction of an attacker going this far in impersonation. That they did not think to spoof the source phone number, leading to early detection, is surprising.

There are possible solutions to this. But since we have yet to effectively solve the problem of impersonation by changing only the display name on e-mails, I suspect they are a long way from widespread adoption.
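
As a minimal sketch of the display-name problem, here is the kind of naive check a mail gateway might apply: flag messages where the display name matches a known executive but the address sits outside the corporate domain. The names and domain are hypothetical:

    # Minimal illustrative sketch, not a production control. Flags messages
    # where the From header's display name matches a known executive but the
    # actual address is outside the (hypothetical) corporate domain.
    from email.utils import parseaddr

    EXECUTIVES = {"jane doe", "john smith"}  # hypothetical executive names
    CORPORATE_DOMAIN = "example.com"         # hypothetical domain

    def looks_like_display_name_spoof(from_header: str) -> bool:
        display_name, address = parseaddr(from_header)
        if display_name.strip().lower() not in EXECUTIVES:
            return False
        domain = address.rsplit("@", 1)[-1].lower()
        return domain != CORPORATE_DOMAIN

    print(looks_like_display_name_spoof("Jane Doe <jane.doe@example.com>"))      # False
    print(looks_like_display_name_spoof("Jane Doe <urgent.payment@gmail.com>"))  # True

Even this trivial check produces false positives (executives mailing from personal accounts) and misses lookalike names and domains – part of why the e-mail version of the problem remains unsolved, and why I don’t expect voice or video equivalents any time soon.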

Security Myths: Trading Functionality for Security

This entry is part 1 of 1 in the series Security Myths

One of the more common myths in security is that it will always compete with the functionality of a system. There is some element of truth in the idea, but it’s often touted as a reason for not putting security in place, which is something we need to move past as an industry.

The problem is that it is true, to a degree. Once a system is built, layering security protections on top will always have some impact on functionality and usability, whether by requiring additional authentication from users, restricting connectivity, or simply impacting performance. And therein lies the problem – it’s true when layering security on top of an existing system.

If a system is designed with security considerations from the ground up, placed on an equal footing with the other functional and non-functional requirements at the beginning of requirements gathering, then there need be no significant impact on the functionality of the system. In fact, given that one of the best ways to apply security at design time is to simplify and automate, designing for both security and function will usually result in a more secure, more robust, and more functional system.

Yes, it does involve more up-front work, and some additional expense. But the effort saved by not having to put web application firewalls, additional incompatible authentication mechanisms, external monitoring systems, and many other point solutions in front of a system quickly pays back that initial investment.

So, more accurately: security will always be a trade-off with functionality when it is applied too late to a system. When it is incorporated into requirements and built in as a core part of the design, this idea no longer needs to apply.

Going After the Big Phish

A while ago I did a presentation for one of the CTG events on Executive Security and Close Protection Technology. It was good to be able to go in with a slightly different take on things: I approached it as basic awareness of how people carry out whaling and phishing attacks targeting senior executives, and of what awareness an exec security team might want or need in order to protect their principal from this sort of decidedly non-physical attack.

For various reasons I hadn’t got around to making this one available until recently, after having heard James Linton, the man who phished the White House as a prank, speak. It prodded my memory and so, in case it’s useful to anyone, here’s the presentation.

GDPR and Consent

Despite what the non-stop wave of e-mails, text messages, and notices popping up everywhere would have you believe, GDPR is not about consent.

In fact, as a rule of thumb, what the legislation suggests is that consent should only be relied on to collect, store, or process personal data if you do not have any other legitimate reason to do so.

A quick look at the lawful bases provided for under the GDPR, and the rights that each provides, shows this quite well.

  • Consent: the subject has given clear consent to process their data for a specific purpose
  • Contract: processing is necessary to fulfil a contract with the subject, or to prepare a contract that they have requested
  • Legal obligation: processing is necessary to comply with the law (exception being contractual obligations)
  • Vital interests: processing is necessary to protect someone’s life (not necessarily the subject)
  • Public task: processing is necessary to perform a task in the public interest, or for your official functions
  • Legitimate interests: processing is necessary for legitimate interests of yourself or a third party (this does not apply to public authorities carrying out official tasks)

The only reason all of these consent notices are popping up everywhere is that none of the other bases apply. Mostly it’s about getting consent to profile you in order to better target advertising. So, when ticking those consent boxes, or giving consent, it’s worth remembering that it isn’t down to the GDPR that you’re now seeing them everywhere – it’s down to companies being desperate to use your personal information to more effectively persuade you to buy things or think a certain way, without having any reason in your interest for doing so.
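
To make the rule of thumb concrete, here’s a purely illustrative sketch. Choosing a lawful basis is a legal judgement rather than an algorithm, and the GDPR does not rank the bases – the ordering below is mine:

    # Illustrative only: encode "consent as a last resort" -- fall back to
    # consent only when no other lawful basis applies. The ordering of the
    # other bases is arbitrary; the GDPR does not rank them.
    NON_CONSENT_BASES = [
        "legal_obligation",
        "contract",
        "vital_interests",
        "public_task",
        "legitimate_interests",
    ]

    def choose_lawful_basis(applicable: set[str]) -> str:
        """Return a non-consent basis if one applies, otherwise consent."""
        for basis in NON_CONSENT_BASES:
            if basis in applicable:
                return basis
        return "consent"  # no other legitimate reason to process

    # A site whose only reason to process is advertising profiling lands on consent:
    print(choose_lawful_basis(set()))         # consent
    print(choose_lawful_basis({"contract"}))  # contract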

Santa and the GDPR

There’s a meme that’s been doing the rounds for a while, claiming that Santa’s list-keeping puts him in breach of Article 4 of the GDPR, and with Christmas approaching it has become especially popular.

Firstly, Article 4 is made up of definitions (https://gdpr-info.eu/art-4-gdpr/), so it is difficult to be in breach of, but let’s ignore that and run with the joke.

Lawful Basis

The first thing to do is to establish the lawful basis under which the processing is taking place. I’d argue that contract, rather than legitimate interests, applies here, since there is a standing agreement between the parties (those who believe in Santa) and Claus Corp. that belief and good behaviour will bring gifts (bad behaviour, obviously, bringing coal or similar). While the parties in question are, in the vast majority of cases, under the age of capacity and cannot enter contracts themselves, their guardians can give consent to enter into and perform the gift/behaviour exchange contract on their behalf (https://gdpr-info.eu/art-8-gdpr/).

Now this does raise the question of children or other parties who do not believe. In this instance there is no mechanism for Claus Corp. to collect or process any data about these parties, so I see no problems here. Data is only collected on those who believe in Santa, and who therefore also agree to the gift/behaviour contract (or rather have guardians willing to do so on their behalf).

There is an argument that legitimate interests could apply as a lawful basis, but the additional responsibility of the balancing test and other requirements make it less suitable in this instance.

Right to Object

Under the contract lawful basis data subjects do have the right to object (https://gdpr-info.eu/art-21-gdpr/) to the processing of their data, meaning the processing must cease unless the controller can demonstrate compelling legitimate grounds which override the interests of the data subject. Of course, objecting would raise questions about whether gifts would later be received, meaning that if an objection is received processing would cease in any case. It is important that in any correspondence with the data subject, or their guardian, Claus Corp. notify them of their right to object and the mechanism for doing so, but this is a relatively simple requirement.

Notification Obligation and Restriction of Processing

Claus Corp. does have a responsibility to communicate any rectification or erasure of personal data, or restriction of processing (https://gdpr-info.eu/art-19-gdpr/). Of course, the contract basis does not provide an automatic right to erasure; we have already established that the processing is lawful, and Claus Corp. famously keeps their data worryingly up to date. The only instance that could apply is one in which the controller, Claus Corp., no longer needs the data for the purposes of processing but the subject wishes the data to be retained for legal purposes. Given the unlikelihood of any legal case depending on Claus Corp. data, and their extraterritoriality with respect to any treaties allowing legal authorities access to their databases, this is not a scenario that needs to be examined in depth.

Automated decision-making and profiling

This is an area that could raise issues for Claus Corp. While the details of their decision-making process are obviously proprietary, it is safe to assume at least some level of automation is involved, given the sheer volume of data gathered (https://gdpr-info.eu/art-22-gdpr/). However, in this instance not only are no legal effects produced, but the processing is necessary to enter into and perform the gift/behaviour exchange contract. Whether the data subject (or their guardian(s)) gives explicit consent is a more challenging question, though by believing, and communicating that belief to their guardians, they are making a clear and active decision to enter into the contract. I suspect this would be seen as explicit consent, but without test cases there is still some uncertainty here.

As such, the best decision for Claus Corp. would be to provide a means of appeal to obtain human intervention (elf intervention is not counted as sufficient under the GDPR, and indeed elf processing of data in the first place may be considered automated decision-making rather than manual – another instance where a test case would help to establish precedent).

So long as Claus Corp. makes some minimal changes to their working practices, and applies appropriate security for the data being processed (there is no indication or suggestion that they are not doing so), I see nothing to suggest they are in breach of any article of the General Data Protection Regulation.

Having said that, I do have some concerns about the Tooth Fairy agency, who are clearly processing sensitive medical data and provide none of the clear channels for communication that Claus Corp. does, but that will wait for another article.

Initiative Q and Social Proof

Initiative Q is tomorrow’s payment network, an innovative combination of cryptocurrency and payment processor which will take over the world and make all of its early investors rich beyond their wildest dreams. Even better, those early investors don’t need to invest any of their money, just give an e-mail address and register. Then they can even invite friends and family (from a limited pool of invites) to take part too.

Since they’re not asking for anything (there have been suggestions it’s an e-mail harvesting scheme, but there are much easier, cheaper, and quicker ways to simply grab e-mails and social data), there’s nothing for those initial investors to lose, so why not get involved?

Well, that’s the pitch anyway. I’m not going to comment on whether or not I think Initiative Q is malicious (I don’t) or a scam (that’s a longer discussion), just on what that initial investment actually involves and on the social engineering being used to expand the initial user base. It’s important to note that the initial investors are definitely putting a stake into the game when they sign up, especially when they start inviting friends and family, and this is something people should be aware of before jumping in.

I can’t imagine anyone not seeing the cynical manipulation of the front page, with its estimated future value of the next spot, so I won’t dwell on it other than to say it’s a rather clever way to apply a time limit to decision-making as the value ticks down (and it does at least make clear that only 10% of the future-value tokens are available after registration, with the rest depending partly on inviting five friends and on unspecified future tasks). I’ll be honest, this type of manipulation does tend to set my alarm bells ringing.

A false sense of urgency and/or scarcity (and this has both – urgency in the value ticking down, scarcity in the small number of invites per person) is one of the classic pillars of social engineering. It tends to convince by short-circuiting the decision-making process: there’s no time to consider carefully, so instead you must buy NOW!

The more insidious part, to me, is the idea that signing up commits nothing and is therefore safe. I blame social media for this one. What’s being invested isn’t financial worth, but individuals’ social capital, as they are used to convince friends and family. This is social proof in action – endorsement by large numbers of people, and by smaller numbers of close friends and family, is another way of letting our lazy brains off the hook when examining a proposition, so that we act on autopilot and sign up.

Worse, once someone sees themselves as an investor with a substantial future stake, they are much more likely to aggressively defend the system if anyone tries to cast doubt on it (there have been a few examples of this recently), and to refuse to accept any evidence that it might not be everything that was promised. By encouraging people to invite five friends, the scheme also exploits consistency – no one wants to go back on what they said in public to five of their friends (or, in a lot of cases, strangers), and so any actual evidence against the system will be ignored or discredited.

So the stakes to be weighed aren’t financial – by signing up you are committing your personal integrity and social capital to a system which has explicitly stated that all it is after at this point is growth. It’s fair to say there is nothing particularly innovative about the Initiative Q technology, despite their claims, and this particular model has been tried before, with less-polished marketing. Before signing up, please think carefully, and remember that they aren’t promising anything (the initiative is very careful to keep its language conditional) while you’re putting a piece of your own identity into the scheme: committing to defend it against sceptics, to carry out unspecified future tasks, and to drag your friends in using your own credibility.

Why Work in Cyber Security? (Again)

Last year I gave a couple of presentations to some college students about why they really, really should look to work in cyber security, and how to get into it. That time has rolled around again, and I realised I’d lost the original source document for the slides, so had to do a quick update from scratch. This is the result.