Existing methodologies to analyse security risks connected to personal data processing: an overview
By Paolo Roccetti, Head of Cybersecurity research unit, Engineering Ingegneria Informatica
Personal data has become a fundamental asset for the functioning of digital economies. The collection, processing and sharing of personal data enables and supports the delivery of a large variety of services to customers, end-users and citizens at large. To deliver their services and products, organizations continuously collect, receive and manage personal data, i.e. “any information relating to a natural person who can be identified by reference to his identification number or to information which is specific to him”.
Because of its importance, personal data is central to many organizational processes and, like other critical assets, it needs to be adequately protected. A compromise of personal data security, e.g. through a successful cyber attack or a leak by an insider, may severely impact the privacy of individuals and may have significant economic and reputational consequences for the organization involved. As for other assets, a risk-based approach to the governance of personal data security is the most natural choice. The first step in this approach is to carry out a Privacy Risk Assessment: a process aimed at identifying and analysing the threats connected to the collection, processing and other uses of personal data.
A number of methodologies have already been defined to carry out privacy risk assessments. In its Guidelines for SMEs on the security of personal data, ENISA proposed a methodology to assess security risks connected to personal data processing. The methodology is based on the 5 steps depicted in the picture.
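At the core of a step-wise assessment of this kind is the combination of an impact level and a threat occurrence probability into an overall risk level. The sketch below illustrates that idea with a toy risk matrix; the scales and the mapping are assumptions for illustration, not the actual tables from the ENISA guidelines.

```python
# Hypothetical impact x likelihood risk matrix, in the spirit of a
# step-wise privacy risk assessment (scales and mapping are assumed).

IMPACT_LEVELS = ["low", "medium", "high", "very high"]
PROBABILITY_LEVELS = ["low", "medium", "high"]

def risk_level(impact: str, probability: str) -> str:
    """Combine an impact level and a threat occurrence probability
    into an overall risk level (illustrative mapping only)."""
    i = IMPACT_LEVELS.index(impact)        # 0..3
    p = PROBABILITY_LEVELS.index(probability)  # 0..2
    score = i + p                          # 0..5
    if score <= 1:
        return "low"
    if score <= 3:
        return "medium"
    return "high"

print(risk_level("medium", "low"))      # -> low
print(risk_level("very high", "high"))  # -> high
```

In practice each methodology defines its own scales and combination rules; the point is that the final risk level is a deterministic function of the assessed impact and likelihood.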
While the ENISA methodology was conceived for SMEs, it is applicable to other organizations as well, including large ones. The methodology is linked to the publication of the GDPR and is meant to support data controllers and data processors in achieving GDPR compliance by helping them evaluate the relevant security risks (download the full report). The work focuses primarily on assessing security risks and is not meant to support a data protection impact assessment.
While ENISA focuses on security, many other methodologies and tools focus more specifically on assessing compliance with the GDPR. One example is the PIA guidelines published by the CNIL, which are accompanied by the PIA software tool. The aim of the methodology and the tool is to help data controllers build and demonstrate compliance with the GDPR by facilitating the execution of data protection impact assessments.
On the US side, the NIST Privacy Risk Assessment Methodology (PRAM) applies the risk model from NISTIR 8062 and helps organizations, in particular US federal systems, to analyse, assess, and prioritize privacy risks in order to determine how to respond and select appropriate solutions. It states that “Recognizing the boundaries and overlap between privacy and security is key to determining when existing security risk models and security-focused guidance may be applied to address privacy concerns—and where there are gaps that need to be filled in order to achieve an engineering approach to privacy”. Therefore, as depicted in the image, privacy and security concerns overlap to some extent, and there is a clear dependency between security aspects, essentially the confidentiality of information, and the privacy of PII (Personally Identifiable Information).
Also from the US, FAIR Privacy (FAIR-P) is a quantitative privacy risk framework based on the more general Factor Analysis of Information Risk (FAIR) methodology. The methodology adopts a taxonomy of privacy harms which classifies a total of 15 privacy harms into four categories: information collection, information processing, information dissemination, and invasions. Similarly to the NIST one, it is accompanied by a set of spreadsheets that support the adoption of the methodology. FAIR-P focuses on personal privacy risks (i.e. risks to individuals), not organizational risks.
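The core idea of a FAIR-style quantitative estimate is that annualized loss exposure is the product of loss event frequency and loss magnitude, both of which are uncertain and therefore sampled from estimated ranges. The following Monte Carlo sketch illustrates that principle under assumed uniform ranges; the real FAIR-P spreadsheets use calibrated estimates and more refined distributions.

```python
import random

def annual_loss_exposure(freq_min: float, freq_max: float,
                         loss_min: float, loss_max: float,
                         trials: int = 10_000, seed: int = 42) -> float:
    """Monte Carlo sketch of a FAIR-style estimate: annualized loss
    exposure = loss event frequency x loss magnitude per event, each
    sampled from an estimated range (illustrative only)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        frequency = rng.uniform(freq_min, freq_max)  # events per year
        magnitude = rng.uniform(loss_min, loss_max)  # harm per event
        total += frequency * magnitude
    return total / trials

# E.g. 0.1-2 incidents/year, each causing a harm valued at 1,000-50,000:
print(annual_loss_exposure(0.1, 2.0, 1_000, 50_000))
```

For privacy harms to individuals, the “magnitude” would be expressed in terms of harm to the data subject rather than monetary loss to the organization, which is precisely the shift in perspective that FAIR-P makes.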
One of the main challenges that ENCRYPT is addressing with respect to privacy risk assessment is the need to carry it out from two different, but intertwined, perspectives: the first deals with the analysis of threats and impacts for the actors (Data Subjects, Data Controllers, Data Processors, etc.) involved in the personal data processing, while the second deals with the IT systems used to perform that processing. This second aspect is important for understanding how cyber threats could affect the estimation of privacy risks. This combined approach requires the joint work of a pool of experts that includes the DPO and the CISO as well as the process owner. The scenario is made even more complex by the fact that the processing usually involves different entities, each responsible for a part of the personal data security.
Also, the methodology being elaborated in ENCRYPT takes into account that, especially for cybersecurity aspects, the assessment must be carried out incrementally: actors can add or change information at a later stage, which triggers a re-calculation of the privacy risks.
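An incremental, multi-actor assessment of this kind can be sketched as a shared set of inputs where every update immediately triggers a re-calculation. The class below is a toy illustration of that pattern; the input names and the scoring rule are assumptions, not the actual ENCRYPT methodology.

```python
# Hypothetical sketch of an incremental assessment: each update by an
# actor (DPO, CISO, process owner, ...) to the shared inputs triggers a
# re-calculation of the overall risk (names and scoring are assumed).

class IncrementalAssessment:
    def __init__(self):
        self.inputs = {}   # e.g. {"impact": 3, "exposure": 2}
        self.history = []  # (actor, changed key, resulting risk)

    def update(self, actor: str, key: str, value: int) -> int:
        """One actor adds or changes one input; the overall risk is
        recomputed immediately and recorded."""
        self.inputs[key] = value
        risk = self._recalculate()
        self.history.append((actor, key, risk))
        return risk

    def _recalculate(self) -> int:
        # Toy aggregation rule: sum of all contributed scores.
        return sum(self.inputs.values())

a = IncrementalAssessment()
a.update("DPO", "impact", 3)
print(a.update("CISO", "exposure", 2))  # -> 5
```

Keeping a history of re-calculations also documents how each actor's contribution changed the risk estimate over time, which supports accountability across the entities sharing responsibility for personal data security.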