The latest Eurobarometer survey, published in December 2016, which reflects the perceptions of European citizens on privacy and security in telecommunications, shows that, although people are not always informed about privacy regulations or the implications of privacy breaches, they demand specific privacy protections. In particular, citizens want their data, their communications and the data that they give or outsource to online services to be well protected and not shared with unwanted parties. These demands can be partly met by applying the privacy-by-design principle and using Privacy Enhancing Techniques (PETs) in commercial applications.
The privacy-by-design principle requires application designers to gather only the personal data that are essential to the correct operation of their applications. That is, applications following this principle should only ask users to input the personal data that the specific application explicitly needs. Most applications on the market (and especially smartphone applications) clearly disregard this principle, as a glance at the permissions they request shows: they also collect contextual information, even when it is not needed. The privacy policies of major service providers state that all data may be used for commercial purposes.
Although major service providers collect more data than strictly necessary, the privacy-by-design principle demands that users be empowered with the decision about when to grant access to their data, when to modify them and when to delete them. While this is more or less being taken into account by service providers, sometimes the procedures to modify or delete personal data are not transparent enough or too cumbersome.
Privacy Enhancing Techniques (PETs) are cryptographic and non-cryptographic tools that, when used appropriately, minimize the amount of personal data handled by applications, and therefore help developers comply more easily with regulations on personal data processing. Research on privacy enhancing techniques and on the practical deployment of the privacy-by-design principle is thus backed by the demands of the general population.
In this work we aim to demonstrate that, if appropriate techniques are used, privacy does not necessarily work against security and/or utility. We focus on the three application cases described below:
- Group discounts are offered by vendors and public authorities to encourage a more sustainable (or profitable) way to access their services or use public resources. An example is high-occupancy vehicle (HOV) tolls on highways, which offer discounts to vehicles carrying more than a given number of passengers (2 or more, 3 or more, etc.). There are several ways to ascertain the number of members of a group: employees at access points who count them, cameras that take and analyze photos at toll booths, or registration procedures that require the names of all group members, among others.
We argue that automated mechanisms, such as cameras and registration procedures, take more information from the participants than is actually needed (thus violating the privacy-by-design principle), and that the only truly necessary information is the size of the group.
- Loyalty programs are marketing efforts implemented by vendors, especially retailers, that are aimed at establishing a lasting relationship with consumers. In a loyalty program, the vendor pursues two main goals: i) to encourage the consumer to make more purchases in the future (returning customer); ii) to allow the vendor to profile the consumer in view of conducting market research and segmentation (profiled customer). In order to lure consumers into a loyalty program, the vendor offers them rewards, typically loyalty points that consumers can later exchange for discounts, gifts or other benefits offered by the vendor.
Normally, enrollment in a loyalty program involves some kind of registration procedure, in which customers fill out a form with their personal information and are granted a loyalty card, be it a physical card (magnetic stripe or smartcard) or a smartphone application. Although loyalty programs have become widespread, they are experiencing a loss of active participants and have been criticized by business experts and consumer associations. Criticism is mainly due to privacy issues, because it is not always clear whether the benefits offered by vendors in their loyalty programs are worth the loss of consumer privacy caused by profiling.
- Implicit authentication refers to a software system authenticating individuals based on the way they interact with their device, i.e. their behavior. In this context, the user's behavior can be determined by collecting a variety of features, such as keystroke patterns, browser history and configuration, IP addresses, location, visible antennas, etc. Implicit authentication can be viewed as a complement of the usual explicit authentication based on identifiers and credentials.
Note that a common trait in these three application cases is that users need to prove something about themselves or their context without revealing more than what is strictly necessary. We believe these cases can be used as an example for other applications in which the goal is similar.
The main contributions of this thesis are: 1. A group size accreditation method that preserves the anonymity of group members. The anonymity provided by the scheme is configurable.
The method rests on two building blocks: (a) A new parameterized key management scheme for identity-based signatures that allows setting the anonymity level of users by providing them with multiple keys that are shared by many other users, but that are extracted from a unique identity.
(b) A novel IBDT signature scheme based on asymmetric bilinear pairings that combines the properties of identity-based and threshold signature schemes. Signatures produced with this scheme reveal only the public keys of the group members, which are called identities, and the size of the signing group. The signature scheme is efficient, and signatures have constant size.
2. A privacy-preserving loyalty program protocol suite, whereby vendors can issue and verify loyalty points, and customers can maintain their anonymity and configure the level of generalization of their purchase receipts before submitting them for additional loyalty points. This allows vendors to still carry out customer profiling in a privacy-aware way. This protocol suite combines the following techniques: (a) A new construction for anonymous (untransferable) tokens with controlled linkability, based on partially blind signatures and zero-knowledge proofs. The construction allows issuing and verifying tokens while the verifier can neither link tokens to a specific user nor link concrete executions of the issuance and verification procedures, unless such a linkage is authorized by the user. Moreover, if a hardware-based keystore is available, the tokens can be made untransferable, so that only the users who originally received them can submit them.
(b) Generalization techniques to select the level of anonymization of purchase receipts.
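The token construction of contribution 2 builds on partially blind signatures combined with zero-knowledge proofs. As a much simpler illustration of the blinding idea only, the sketch below implements classic RSA blind signatures (Chaum's scheme), not the construction used in this thesis: the signer authorizes a value without ever seeing the message it signs, so issued and redeemed tokens cannot be linked by the signer. All parameters are toy values chosen for illustration.

```python
import hashlib
import secrets
from math import gcd

# Toy RSA parameters (illustration only; real deployments need >= 2048-bit moduli)
p, q = 1000003, 1000033
n = p * q
phi = (p - 1) * (q - 1)
e = 65537
d = pow(e, -1, phi)              # private exponent (modular inverse, Python 3.8+)

def h(msg: bytes) -> int:
    """Hash the message into Z_n (simplified full-domain hash)."""
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

def blind(msg: bytes):
    """User: blind h(msg) with a random factor r before sending it to the signer."""
    while True:
        r = secrets.randbelow(n)
        if r > 1 and gcd(r, n) == 1:
            break
    return (h(msg) * pow(r, e, n)) % n, r

def sign_blinded(m_blind: int) -> int:
    """Signer: sign the blinded value without learning the underlying message."""
    return pow(m_blind, d, n)

def unblind(s_blind: int, r: int) -> int:
    """User: remove the blinding factor to obtain an ordinary RSA signature."""
    return (s_blind * pow(r, -1, n)) % n

def verify(msg: bytes, s: int) -> bool:
    return pow(s, e, n) == h(msg)
```

A partially blind signature additionally embeds public information agreed upon by both parties (e.g. a point value or expiry date), which the plain blind signature above cannot express.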
3. A mechanism to compute the distance between user profiles (expressed as feature sets of different data types) based on the size of the intersection of the feature sets.
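A standard way to turn the size of a set intersection into a distance is the Jaccard distance, sketched below. This is an illustrative instantiation only; the mechanism in this thesis may weight features or treat different data types differently.

```python
def profile_distance(a: set, b: set) -> float:
    """Jaccard-style distance between two feature sets:
    1 - |A intersect B| / |A union B|, in [0, 1]; 0 means identical profiles."""
    if not a and not b:
        return 0.0                      # two empty profiles are identical
    return 1.0 - len(a & b) / len(a | b)
```

For example, two profiles sharing two of four distinct features (say, the same IP address and browser but different locations) are at distance 1 - 2/4 = 0.5.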
4. A privacy-preserving implicit authentication mechanism based on the homomorphic properties of the Paillier cryptosystem, which protects the sensitive data in the user's profile and ensures that the server does not learn anything about the user's behavior.
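The property exploited here is that Paillier ciphertexts can be combined without decryption: multiplying two ciphertexts yields an encryption of the sum of their plaintexts, so a server can aggregate encrypted profile values without learning them. The sketch below shows only this additive homomorphism with toy parameters; it is not the full authentication protocol of the thesis.

```python
import secrets
from math import gcd

# Toy Paillier key pair (tiny primes, illustration only; real use needs >= 1024-bit primes)
p, q = 293, 433
n = p * q
n2 = n * n
g = n + 1                        # standard choice of generator
lam = (p - 1) * (q - 1)          # with g = n + 1, phi(n) serves as the private exponent
mu = pow(lam, -1, n)

def encrypt(m: int) -> int:
    """Randomized encryption: c = g^m * r^n mod n^2."""
    while True:
        r = secrets.randbelow(n)
        if r > 0 and gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    """Apply L(x) = (x - 1) // n to c^lam mod n^2, then scale by mu."""
    return (((pow(c, lam, n2) - 1) // n) * mu) % n

# Additive homomorphism: multiplying ciphertexts adds the plaintexts (mod n),
# and raising a ciphertext to an integer power multiplies its plaintext.
```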
5. A second privacy-preserving implicit authentication mechanism with similar functionality and higher speed than the previous one, based on the intersection of Bloom filters. While this mechanism provides slightly less protection than the previous one, its substantially better performance makes it ideal for integration into existing authentication suites.
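The speed-up comes from the fact that two Bloom filters can be intersected with a single bitwise AND, and the number of set bits in the result estimates the number of common elements. The following minimal sketch (class name and parameters are illustrative, not the thesis's implementation) shows the idea; note that a plain Bloom filter like this one offers no cryptographic protection by itself.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter with bits packed into a Python int."""

    def __init__(self, m: int = 256, k: int = 4):
        self.m = m          # number of bits
        self.k = k          # number of hash functions
        self.bits = 0

    def _positions(self, item: str):
        # Derive k bit positions from salted SHA-256 digests
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest, "big") % self.m

    def add(self, item: str) -> None:
        for pos in self._positions(item):
            self.bits |= 1 << pos

    def intersection_count(self, other: "BloomFilter") -> int:
        """Popcount of the AND of both filters: roughly k times the
        number of common items, up to hash collisions."""
        return bin(self.bits & other.bits).count("1")
```

Estimating the size of the intersection from this popcount (roughly popcount / k, corrected for collisions) is what makes the filter-based mechanism faster than the homomorphic one, at the cost of some accuracy and weaker privacy guarantees.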