InfoSec view of the DSG Retail ICO fine
DISCLOSURE: I used to work for Dixons Carphone Group (DCG) in a senior security role around the time of this second breach. However, I had resigned the week before further commotion about an incident arose internally, so I wasn’t kept updated with any privileged information and was learning more from the press than from inside the company, which is also why I’m comfortable assessing the ICO notice.
This is going to be an overall view of the ICO’s 33-page fine notice, which can be found here. I think there are both good and not-so-good things in this document that I have opinions on, so I’m writing them down and would love to exchange ideas. At the end, I’ll offer some strategic considerations for other organisations given this new piece of case law.
No assertion was made regarding the origin of the attack because “the sophisticated nature and duration of the attack limited evidence collection” (14).
An interesting argument put forth by DSG relates to cardholder data: because EMV (the standard for chip-based card transactions) cards only capture the PAN (Primary Account Number) and expiry date, DSG argued this doesn’t constitute personal information, as it wouldn’t enable identification of the account holder (which would reduce the breach numbers from 5.5M to 53k records). The ICO referred to the Article 29 Working Party to assert that the PAN alone constitutes personal data, so now we know.
DSG couldn’t determine an exact number but “estimated that in total they affected approximately 14 million data subjects”. This is huge.
The timeline suggests that DSG learned about the breach through external intelligence on 5th April, but the ICO was only notified on 8th June.
In its investigation, the Commissioner found an older pen-test report on POS terminal security, dated 2017, which highlighted deficiencies in technical and organisational measures that “created real risks of such data breaches, and that they played an essential causal role in this particular incident”. The lesson: we really need to make sure management pays attention to pen-test results and that they trigger an appropriate risk assessment and risk treatment plan, as this was a contributing factor to the Commissioner’s concerns.
I found it particularly interesting that the ICO (21) decided to take account of PCI-DSS compliance status when identifying suitable “technical and organisational measures” and the “appropriate measure of security” for that data.
“Although compliance with the PCI-DSS is not necessarily equivalent to compliance with the GDPR’s security principle, if you process card data and suffer a personal data breach, the ICO will consider the extent to which you have put in place measures that PCI-DSS requires particularly if the breach related to a lack of particular control or process mandated by the standard”
From (22) onwards, there’s the actual deliberation and the listing of the factors which led to the issuing of the highest possible fine, given that this case still fell under the Data Protection Act 1998 rather than the Data Protection Act 2018, which is generally aligned with the GDPR.
Lack of segregation — POS systems should be segregated from the rest of the environment. The assessment suggests both network segregation and Active Directory forest segregation, pointing to a 2014 best-practices document by Microsoft as justification for the latter.
My commentary: whilst it’s good that resources are being checked for appropriateness, I have mixed feelings about the ICO suggesting particular technical measures and making “blind references” to best practices, as opposed to assessing the performance and follow-up of the risk assessments that should have been done and addressed. The measures are sensible, but I wonder what training and expertise the ICO has in this particular field.
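To make the segregation point concrete, this kind of check can be automated. A minimal sketch, assuming a hypothetical zone-tagged ruleset (none of the zone names or rules come from the notice), that flags rules allowing traffic directly between a POS zone and the corporate zone:

```python
# Hypothetical ruleset audit: flag any rule that bridges the POS zone
# and the corporate zone, which segregation should prevent.
RULES = [
    {"src": "pos", "dst": "payment-gateway", "port": 443},
    {"src": "corporate", "dst": "pos", "port": 3389},  # should be flagged
    {"src": "corporate", "dst": "internet", "port": 443},
]

def segregation_violations(rules, zone_a="pos", zone_b="corporate"):
    """Return rules that allow traffic directly between the two zones."""
    return [r for r in rules if {r["src"], r["dst"]} >= {zone_a, zone_b}]

for rule in segregation_violations(RULES):
    print(f"segregation breach: {rule['src']} -> {rule['dst']}:{rule['port']}")
```

In practice you would feed this from exported firewall or flow data rather than a hand-written list, but the principle (assert the absence of cross-zone paths, continuously) is the same.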
No local firewall configured — This is the most convoluted of the interactions between DSG and the ICO. Although there were “firewalls enabled and running in the wider system”, the notice points to some guidance, which I traced back here, as justification. My concern is that it isn’t clear whether this refers to network vs. local firewalls, or to other local firewalls too. If the latter, it would be a security anti-pattern as identified by the NCSC (Anti-Pattern 3: back-to-back firewalls): not only is it not required, but duplication of controls is generally to be avoided, given both the technical operations and the skills required to effectively manage controls.
Patching gaps — This is mostly expected, as it affects many different organisations, but it does highlight a particular aspect: one vulnerability required not only applying a patch but also reapplying Group Policies, which wasn’t done and so left the vulnerability exploitable for circa four years. It’s really important not to just blindly apply patches, but to actually understand what they do and what else is required to fix the vulnerability.
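The patching lesson can be modelled very simply: a vulnerability isn’t remediated until every required step is complete, not just the patch install. A hedged sketch (the step names are illustrative, not taken from the notice):

```python
def remediation_complete(required_steps, completed_steps):
    """A vulnerability is only closed when *all* required steps are done."""
    return set(required_steps) <= set(completed_steps)

def outstanding_steps(required_steps, completed_steps):
    """What still needs doing before the vulnerability can be marked fixed."""
    return sorted(set(required_steps) - set(completed_steps))

# Illustrative case: the patch was applied, but the Group Policy
# reapplication the fix depended on never happened.
required = ["apply_patch", "reapply_group_policy"]
completed = ["apply_patch"]

print(remediation_complete(required, completed))   # still exploitable
print(outstanding_steps(required, completed))
```

A vulnerability-management workflow that tracks only “patch deployed” would mark this closed; tracking the full remediation checklist is what catches the gap.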
Vulnerability scanning — None existed in the scope of the POS systems, which was considered a gap that “materially exacerbated” data security risks.
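One way to avoid that kind of blind spot is to reconcile scan scope against the asset inventory. A minimal sketch, with entirely made-up inventory and scope data:

```python
def unscanned_assets(inventory, scan_scope):
    """Return assets present in the inventory but absent from every scan scope."""
    covered = set().union(*scan_scope.values()) if scan_scope else set()
    return sorted(set(inventory) - covered)

# Hypothetical data: the POS estate is in the inventory but in no scan scope.
inventory = ["pos-001", "pos-002", "web-01", "db-01"]
scan_scope = {"external": ["web-01"], "internal": ["db-01"]}

print(unscanned_assets(inventory, scan_scope))  # ['pos-001', 'pos-002']
```

The hard part in real estates is keeping the inventory itself accurate; the reconciliation is only as good as the asset list you feed it.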
Lack of consistent application whitelisting — Here the ICO notes that only some of the tested systems actually had this control enabled, suggesting inconsistent application of controls. So a reference not only to the control itself but also to how consistently organisations deploy and manage it.
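Consistency of deployment is measurable. A sketch that reports where a named control is missing across a fleet (the host names and control flags are hypothetical):

```python
def control_gaps(fleet, control):
    """Return hosts where the named control is not enabled."""
    return sorted(host for host, controls in fleet.items()
                  if not controls.get(control, False))

# Hypothetical fleet state, e.g. exported from endpoint management tooling.
fleet = {
    "pos-001": {"app_whitelisting": True},
    "pos-002": {"app_whitelisting": False},
    "pos-003": {},  # control never deployed at all
}

gaps = control_gaps(fleet, "app_whitelisting")
print(f"{len(gaps)}/{len(fleet)} hosts missing whitelisting: {gaps}")
```

Reporting coverage as a ratio per control is exactly the evidence of “consistent application” a regulator would look for.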
No effective system of logging and monitoring — Self-explanatory: alerting and, ideally, correlation of security events would have been expected to help identify breach activity.
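To illustrate why correlation matters beyond bare log collection, here is a minimal sketch of the simplest possible correlation rule, flagging sources with repeated authentication failures (event shapes and thresholds are made up for illustration):

```python
from collections import Counter

def brute_force_alerts(events, threshold=3):
    """Flag sources with repeated auth failures: a basic correlation
    that collecting logs without analysis would never surface."""
    failures = Counter(e["src"] for e in events if e["event"] == "auth_failure")
    return {src: n for src, n in failures.items() if n >= threshold}

# Illustrative event stream: one noisy source, one benign one.
events = (
    [{"src": "10.0.0.9", "event": "auth_failure"}] * 4
    + [{"src": "10.0.0.5", "event": "auth_failure"},
       {"src": "10.0.0.5", "event": "auth_success"}]
)
print(brute_force_alerts(events))  # {'10.0.0.9': 4}
```

Real deployments add time windows and richer event types, but the point stands: without some rule evaluating the events, logging alone detects nothing.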
Lifecycle management — This one possibly has wider implications. There’s a reference to outdated POS software running eight-year-old Java versions, which the ICO “considers would place the POS terminals at increased risk of compromise”, which is hard to argue with. Other similar businesses would be right to assess their obsolescent IT used for personal-information processing and review their previous risk assessments on the subject.
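That assessment can start as a simple comparison of installed versions against a minimum-supported baseline. A sketch, with invented component names and versions (versions as integer tuples, e.g. (1, 6) for “Java 1.6”):

```python
def obsolete_software(installed, minimum_supported):
    """Return components running below their minimum supported version."""
    return sorted(name for name, version in installed.items()
                  if version < minimum_supported.get(name, version))

# Illustrative estate snapshot vs. a lifecycle baseline.
installed = {"java": (1, 6), "pos-app": (4, 2)}
minimum_supported = {"java": (1, 8), "pos-app": (4, 0)}

print(obsolete_software(installed, minimum_supported))  # ['java']
```

The output is the starting list for a lifecycle risk review: for each hit, either upgrade, or document the compensating controls and the risk acceptance.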
P2PE (point-to-point encryption on terminals) — This wasn’t in place but was in the process of being deployed. The wording is particularly interesting: DSG cited the high cost of deploying a P2PE solution, to which the ICO responded that the “cost of implementation of P2Pe was proportionate to the size of the business, the nature and volume of personal data being processed and the current standard of security at the relevant time”. This may signal that big organisations are, or may be, expected to a) ensure their risk assessments are clearer on potential impacts (proper DPIAs) and b) make security budget available for big transformations of this kind.
References to own policies not being met — Particularly in (9), there’s the comment that DSG “failed to adhere to its own policies in respect of access permissions and passwords”. This has been a bugbear of mine for many years now. InfoSec policies created in isolation, without a real understanding of technical capabilities and the impact of a control on the use of the asset, establish expectations that organisations can’t or won’t live up to, creating increased liability. Security teams need to stop writing security policies that are verbatim copies of security standards and/or uncontextualised best practices.
Lack of standard builds — Mentions that DSG “failed to implement standard builds for all system components based on industry standard hardening guidance”. With the rise of compliance-as-code and the growing adoption of cloud services, configuration assurance is becoming ever more relevant as something you should be doing consistently.
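In compliance-as-code terms, the core operation is detecting drift between a hardened baseline build and a host’s actual configuration. A minimal sketch (setting names and values are illustrative, not from any specific hardening guide):

```python
def baseline_drift(baseline, actual):
    """Return settings where a host differs from the hardened baseline build."""
    return {key: {"expected": expected, "actual": actual.get(key)}
            for key, expected in baseline.items()
            if actual.get(key) != expected}

# Illustrative baseline vs. one host's observed configuration.
baseline = {"smbv1_enabled": False, "local_admin_disabled": True}
host = {"smbv1_enabled": True, "local_admin_disabled": True}

print(baseline_drift(baseline, host))
# {'smbv1_enabled': {'expected': False, 'actual': True}}
```

Tools like desired-state configuration or policy engines do this at scale, but the evidence they produce, expected vs. actual per setting per host, is exactly what “standard builds” compliance looks like when audited.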
Another interesting point to highlight: the ICO considered that each of the above individually constituted a contravention of Data Protection Principle 7 (security of information), but the assessment was made on the whole.
There’s commentary to the effect that the inadequacies “related to basic, commonplace measures needed for any such system”, which is hard to argue with given the list above. It also posits that cyber attacks are “likely to cause substantial damage and substantial distress” to individuals.
If I had to summarise learnings in a few headings, these would be the ones:
- Do what you say, say what you do — make sure your security policies and standards actually reflect your intent, your risk appetite, and what you’re willing to manage exceptions for. Even if that means “downgrading” how strong your policy statements are, policies far removed from reality only hurt our businesses even more
- Evidence risk assessments and review them regularly — for key parts of your system, keep evidence of the risk assessments performed and the rationale for delivering controls, accepting their absence, or otherwise dealing with identified threats, so that if something happens you can refer to existing information
- Consider baseline best practices and assess whether you need to apply them — I don’t believe the problem here was so much not “having things” as not being in a position to evidence that you’ve considered having them and put compensating measures in place to reduce the likelihood and/or impact of a loss event
As I mentioned, I’m slightly surprised by the technical recommendations made by the ICO: they are very prescriptive about what the ICO believes would be appropriate controls, relying mostly on references to other standards like PCI DSS. I would have preferred a bigger focus on the quality of, and process behind, the risk assessments rather than references to controls, as the latter may have the inadvertent effect of increasing focus on security compliance in and of itself, as opposed to holistically addressing the underlying security of our systems.