Technological changes, particularly in the context of big data, have made surveillance by public and private parties easier than ever before: they have reduced the costs of gathering, storing and disseminating information. This has been coupled with a decentralization of internet content creation. Together, these changes modify the interactions involving privacy and exchanges of personal information and, to that extent, force us to reconsider the scope of protection granted to them. So far, this reconsideration has been carried out mainly from the perspective of human rights, while the economic incentives involved remain underexplored.

To address this gap, the thesis evaluates whether data protection law can be justified from an economic perspective. Given that people face privacy costs when disclosing personal information, the entitlements created by data protection affect the incentives for generating information in a context of decentralized content creation; hence, these entitlements can lead to greater information production in the long run. For this reason, privacy and access to information are often complementary rights. Determining the efficient level of protection for these entitlements, however, becomes complex, as both property rules and liability rules introduce additional problems. An intermediate level of protection is hence suggested.

Following this, the thesis explains why people sometimes disclose their personal information for little compensation despite the high value they attach to their privacy, based on the uncertain probability of privacy breaches. This explanation accounts for user behavior within a rational-choice framework in a way that fits intuitively with both consumers’ demands for transparency and contemporary policy debates on privacy. It also reverses the policy conclusions of the prevalent behavioral model and accounts for current trends in data protection law—particularly the right to be forgotten.

The right to be forgotten is then analyzed, focusing on its formulation in the General Data Protection Regulation proposal. The right creates large social costs, mainly by reducing freedom of expression and access to information. Due to its implementation difficulties, it could also introduce a risk-compensation mechanism under which people engage in riskier behavior than before. From this perspective, Google v. Spain does not rule on the right to be forgotten but on the liability of search engines; and in doing so, it fails to offer a consistent balance between privacy and freedom of expression.

The second major policy debate analyzed concerns the limitations placed on online tracking by the Electronic Communications Framework Directive, which changes the default system for tracking to an opt-in. A comparative study of the directive’s implementation across member states is presented, with special attention to the Netherlands and the United Kingdom. Drawing on the behavioral economics literature on default rules, policy changes that would avoid the incentive problems present in these regulations are suggested.

The thesis makes the dynamics of the tradeoffs involving privacy more visible, both theoretically and in two of the main policy debates in European data protection law. It offers an explanation of data protection law from an economic perspective and, in doing so, provides a new basis for the evaluation of further data protection measures.
