EVALUATING THE EFFECTS OF NUDGING AND DETERRENCE ON USERS’ BEHAVIOR FOR PRIVACY-BY-DESIGN
Mastroianni M.
2024-01-01
Abstract
The definition of privacy-related specifications is crucial in the design process of any system subject to the GDPR. Privacy-related requirements can be seen as qualitative non-functional requirements which give rise to additional functional requirements during the specification process. As compliance with the GDPR is essentially assessed by means of risk analysis of data processing operations, a quantitative aspect of evaluation is needed in any case: consequently, defining a quantitative approach to privacy-related specifications is desirable, and suitable tools should be identified or provided. While there is some analogy with the fields of security and dependability, so that tools may to some extent be borrowed from those domains, the privacy domain also requires that human factors be modeled, together with the external influences acting upon them. In this sense, risk can be evaluated much as in the security field, and this is actually done in the privacy domain by approaches such as the DPIA, but nudging and deterrence play a different role and deserve some reflection. In this paper we discuss this perspective on privacy-related specification and the use of a tool, Pythia, which does not originate from the risk analysis domain but can, in our opinion, be profitably used to define privacy policies complementary to privacy-aware system design cycles and to assess their impact. We present an improved analysis of a model from our previous research to show that this point of view on privacy-aware system design is distinctive and should be considered in design processes.