Privacy Enhancing Technologies
Use Case Metadata
Title
Privacy enhancing technology generic use case. In some ways this is similar to the functional models produced in other working groups. It definitely shows the relationships among the different levels of abstraction in a developing ecosystem.
Status
Use Case Lifecycle Status
Contributed | Working Draft | Committee Review | Compilation | Approval | Publication
This use case is available for review by the Use Case AHG with the goal of refining and completing the use case; see the Use Case Catalog for the review schedule. When the use case meets the completeness and NSTIC guiding principles criteria and is approved by the use case AHG, it will enter the Committee Review phase.
Use Case AHG Review Status
Initial posting.
Use Case Category
Privacy, Trust/Assurance, Interoperability
Contributor
Tom Jones
Use Case Content
Use Case Description
Privacy is considered one of the core requirements of the IDESG and yet it has proven difficult to accommodate in the trust frameworks. This use case is designed to provide a framework for thinking about privacy and the various places where privacy-enhancing technologies (PETs) might fit within a trust framework. Privacy is known to be a burden to those enabling services on the internet. Nearly everyone claims to provide it, but no one seems to know the best practices necessary to assure compliance with privacy concerns. This use case considers a taxonomy of the various proposed ways to implement privacy technologies. By itself technology cannot ensure privacy. At best technology can provide the means for actors of good will to be in compliance with the expectations of users and regulators.
As used here, a Privacy Enhancing Technology is one that takes claims that link the identifier used in an online transaction to real-world entities and produces claims that do not have such linkages. To be successful, a PET provider needs to supply claims that satisfy a relying party's needs while containing only attributes that are released by the identified user. The process of taking attribute claims in one syntax and releasing them in a privacy-preserving syntax is also known as claims translation. Claims translation can provide other services to a relying party, such as accepting claims in one protocol and producing claims in another protocol that is in use by the relying party.
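As an illustration of the claims-translation step just described, here is a minimal sketch in Python, assuming a plain dictionary representation and hypothetical attribute names; a real PET would consume and emit signed tokens rather than dictionaries.

import hashlib

# Sketch of claims translation for data minimization (hypothetical names).
def translate_claims(iap_claims, released_attributes, rp_id, salt):
    # Derive a pairwise pseudonymous subject identifier that is stable for
    # this RP but cannot be correlated across relying parties.
    pairwise_subject = hashlib.sha256(
        (iap_claims["subject_id"] + rp_id + salt).encode()
    ).hexdigest()
    # Copy only the attributes the user agreed to release to this RP.
    minimized = {name: iap_claims[name]
                 for name in released_attributes if name in iap_claims}
    minimized["subject"] = pairwise_subject
    return minimized

# Example: the RP learns that the user is over 18, but not the name or birth date.
iap_claims = {"subject_id": "user-123", "name": "Alice",
              "birth_date": "1990-01-01", "over_18": True}
print(translate_claims(iap_claims, ["over_18"], "rp.example.com", "per-user-salt"))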
Actors
- User: In this case a human being who wants to access services of a relying party and still retain privacy for details that are not needed by the RP.
- Identity or Attribute Provider (IAP): contains identities and attributes of users. For this use case it subsumes both Attribute Providers and Identity Providers, which exist to supply claims to the PET provider.
- Relying Party (RP): A service provider that needs a collection of claims to provide that service. The claims may relate to financial responsibility or other user attributes that are required by regulation to meet legal responsibilities. It is beyond the scope of this use case to determine whether the RP actually has any justification in requesting any user attribute at all.
- Privacy Enhancing Technology (PET) is a function that can be implemented by a variety of actors listed above.
- Identity Ecosystem: a set of services that implement other trust services as required by the rules of that ecosystem. Note that all of the other actors are almost certainly required to function with multiple identity ecosystems; some, but not all, of these ecosystems are expected to be compliant with IDESG trust frameworks.
Goals / Actor Stories
- Compliance with regulations for RPs and IAPs (Identity or Attribute Providers).
- Common method for reliably describing attribute data requested and the user's intent to release those attributes to one particular relying party.
- High comfort level for users that their personal data is only shared when and with whom they want it shared.
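One conceivable shape for the common request-and-consent description mentioned in the goals above, written in Python with entirely hypothetical field names; nothing here is prescribed by NSTIC or the IDESG.

# Hypothetical descriptor pairing what the RP requests with what the user
# consents to release to that one relying party (illustrative only).
attribute_request = {
    "relying_party": "https://rp.example.com",
    "purpose": "age verification",
    "requested": ["over_18", "country"],
}
user_consent = {
    "relying_party": "https://rp.example.com",
    "released": ["over_18"],      # the user declined to release "country"
    "remember_choice": True,
}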
Benefits to each actor:
- The User is the party most concerned about privacy, but with a history of doing the least about it.
- The Identity Provider can use the additional features of selective disclosure of attributes as a means to increase its attractiveness to users, even though that might reduce its value to advertisers.
- The relying party generally wants to know all possible user attributes as a means to increase its ability to monetize the data it collects about users. But the RP is the most likely actor to suffer from a data breach and so is incentivized to show "due diligence" in compliance with user privacy best practices.
Assumptions
- The IDESG principles will enable an identity ecosystem consisting of multiple trust frameworks that satisfy the needs of specific affinity groups. Since users need to communicate with different affinity groups from time to time, they will typically need to accommodate different trust frameworks during the normal course of daily computer use.
- The RP has a relatively clear set of privacy compliance regulations to follow that can be satisfied by one or more IDESG trust frameworks.
- The RP will support one or more IDESG trust frameworks and/or known providers for the user to choose from.
- The available NSTIC approved ID trust frameworks can be enumerated in such a way that the RP can specify one (or more) that it accepts.
- It is expected that at least one NSTIC approved ID trust framework with a specific level of assurance will be offered to the user by the RP.
- For the foreseeable future user choices will surely include legacy identity providers that are listed by name.
- Privacy enhancing technology will be available to meet the privacy stipulation of the IDESG. The specific location of the privacy enhancing technology is not specified in this general use case. Wherever the PET provider is located, the role it provides in protecting privacy should have the same effect from the user's perspective.
- The Privacy Enhancing Technology Provider must be trusted by the relying party to deliver verified claims about the user irrespective of whether the real world identity of the user is provided to the relying party.
- Privacy enhancing technologies may exist associated with multiple entities in any given transaction and must not interfere with each other or degrade the experience for the user.
- It is expected that each trust framework will come with a set of rules and approved independent labs that can attest to the PET provider based on the trust frameworks that are supported by the provider.
- The privacy enhancing technology will obtain policy settings from the user to determine what specific types of information may be released to the relying party. Some implementations may preserve different settings for each relying party the user visits.
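A minimal sketch of those per-relying-party policy settings, assuming a hypothetical in-memory store keyed by RP; real implementations may persist and structure this very differently.

# Hypothetical per-RP release policy kept by the PET (illustrative only).
release_policy = {
    "rp.example.com":   {"allowed": ["over_18", "country"], "remember": True},
    "shop.example.org": {"allowed": ["shipping_address"],   "remember": False},
}

def attributes_to_release(rp_id, requested):
    # Release only the intersection of what the RP asks for and what the
    # user has already authorized for that particular RP.
    allowed = release_policy.get(rp_id, {}).get("allowed", [])
    return [attr for attr in requested if attr in allowed]

print(attributes_to_release("rp.example.com", ["over_18", "email"]))   # ['over_18']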
Process Flow
- The user establishes an account with one or more IAPs, each of which assigns a Subject ID to the user. In this case there is no need to distinguish between identity providers and other attribute providers.
- The user accesses a web site which requires identity attributes of some sort to continue to process the user request. That web site then becomes a relying party.
- The RP gives the user a choice of which system or provider will provide identity.
- This request for information is intercepted by privacy enhancing technology.
- Determine if the information is available
- Determine if the user has already authorized release to this RP
- Display any remaining choices to the user to acquire more attributes or release those already available.
- Format the set of requested claims into a response that the RP can evaluate.
- Send the response to the RP, which has sole responsibility for determining whether sufficient identity has been proven to provide the requested access.
- Repeat these steps until the RP is satisfied or one side gives up.
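The interception steps above might be condensed into code along the following lines; every callable and name below is a stand-in for whatever the PET, user agent and RP actually implement.

# Illustrative single round of the flow; the callables are hypothetical stand-ins.
def pet_round(requested, available, authorized, prompt_user, format_claims):
    # Ask the user about any requested attributes not yet authorized for this RP.
    missing_consent = [a for a in requested if a not in authorized]
    if missing_consent:
        authorized = authorized | prompt_user(missing_consent)
    # Release only attributes that are both available and authorized.
    claims = {a: available[a] for a in requested
              if a in available and a in authorized}
    return format_claims(claims)

# Example with trivial stand-ins for the user prompt and the claim formatter.
response = pet_round(
    requested=["over_18", "email"],
    available={"over_18": True, "email": "a@example.com"},
    authorized={"over_18"},
    prompt_user=lambda attrs: set(),        # the user declines further release
    format_claims=lambda claims: claims,    # a real PET would issue a signed token
)
print(response)   # {'over_18': True}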
Success Scenario
- The PET is located in a user agent under the control of the user. Whether the user agent runs on the user's device or in the cloud should not matter.
- The PET is located in a Relying Party (RP) agent under contract to the RP, but following the privacy standards of the IDESG to prevent release of unauthorized user data or of linkage between the user and the Identity or Attribute Providers.
- The PET is located in an Identity or Attribute Provider that is following the privacy standards of the IDESG to create tokens that contain only attributes released by the user to this particular RP. This has also been called a Meta-Identity System or an Identity Oracle [[1]]
- A PET is located in more than one location and still protects the information released by the user while providing claims that satisfy the relying party.
- Each user will be able to find a PET provider that suits their own circumstances, as each person will have a different tolerance for intrusion of a PET into their own identity experience on the internet.
Error Conditions
- User does not have credentials acceptable to the relying party.
- Mitigation: The PET redirects the user to one or more sources of appropriate credentials that do meet the criteria.
- Mitigation: The relying party redirects the user to one or more Identity Providers or trust frameworks that are acceptable. If a new framework is chosen, that may change the PET to meet those particular requirements.
Relationships
Device integrity is defined in the "Device Integrity supporting User Authentication Use Case" at [[2]]
References and Citations
- COPPA is the Children's Online Privacy Protection Act that is well described in the following site: [[3]]
- Privacy Enhancing Technologies [[4]] are mainly based on either:
- obfuscation by using shared bogus accounts or pseudonymous identifiers
- data minimization by either limiting the flow of data or by creating unlinkable claims, which include means to hide the user's identity with a cryptographic technology. The following reference describes the U-Prove and Idemix cryptographic protocols that achieve this goal: [[5]]
- The simplest technology to block linking between the relying party and the identity and attribute providers is token translation which has the additional benefit of hiding differing authentication technologies that may be used by the relying party and the various providers. Security Token Services typically provide token translation services. For example see the following site: [[6]]
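As a rough illustration of token translation, the sketch below maps a couple of SAML attribute names onto JWT-style claim names; the mapping and helper are assumptions for illustration only, and a real Security Token Service would also validate the inbound assertion and sign the outbound token.

import base64
import json

# Hypothetical mapping from SAML attribute names (OIDs) to JWT claim names.
ATTRIBUTE_MAP = {
    "urn:oid:2.5.4.42": "given_name",
    "urn:oid:0.9.2342.19200300.100.1.3": "email",
}

def translate_to_jwt_payload(saml_attributes, released):
    # Keep only mapped attributes that the user has released.
    claims = {ATTRIBUTE_MAP[k]: v for k, v in saml_attributes.items()
              if k in ATTRIBUTE_MAP and ATTRIBUTE_MAP[k] in released}
    # Unsigned, illustrative payload only; a real STS adds a header, issuer,
    # audience, expiry and a signature.
    return base64.urlsafe_b64encode(json.dumps(claims).encode()).decode()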
NSTIC Guiding Principles Considerations
Privacy Considerations
Privacy enhancement is the core of the purpose of this use case. One particularly challenging problem is the case of minors under the age of 13 that are covered by COPPA. Those challenges are left for another use case.
It is known that search terms alone are sufficient in many cases to allow identification of the user. In any service that collects attributes or behaviors of the user over time, only policy enforcement will offer any hope of blocking discovery of the user's identity.
Security Considerations
In general, security is not considered in this use case, as security will be provided by the same types of credentials, tokens and claims used in any secure implementation. One additional wrinkle introduced by a PET provider is that the PET provider must be sufficiently trusted by the user and the relying party to perform the desired function.
- "PETs may not always be our Friends", Elizabeth Renieris, 2021-04-29
User Experience/Usability Considerations
One important part of any use case is its intelligibility to the user. Here it is very important that the user be given only as many decisions as can easily and comprehensibly be displayed on the device in use. In particular, it is important that the RP have a taxonomy of requested fields that can be presented to the user within the scope of a single device page. That implies that the taxonomy of requested fields needs to be limited to those items that the user can sensibly be expected to comprehend.
A good review of the complexity that users can face when trying to control the release of data is shown in the following article: "On Facebook, Deciding Who Knows You’re a Dog" http://www.nytimes.com/2014/01/30/technology/personaltech/on-facebook-deciding-who-knows-youre-a-dog.html?ref=technology&_r=0
A deeper understanding of PET is available in chapters 10 and 11 of the book "Privacy and Identity Management for Life", Springer 2011, ISBN 978-3-642-20316-9. One of the more interesting findings is that the term "Privacy Enhancing" itself was the worst understood term by users in the UX study, while "Privacy Protection" was the best understood.
Read the report of the IDESG experience committee on use case usability at UXC Use Case Mapping
Interoperability Considerations
This process is designed to interoperate with existing SAML, JWT and other token types. Token composition is not well defined in any extant standard and needs to be addressed by the ecosystem.
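For illustration, a minimized claims set as it might appear after translation, written here as a Python dictionary with assumed claim names; how such claims are composed into a single interoperable token is exactly the gap noted above.

# Assumed claim names; the pairwise "sub" value prevents correlation across
# relying parties, and only attributes released by the user appear at all.
minimized_claims = {
    "iss": "https://pet.example.net",   # hypothetical PET provider
    "aud": "https://rp.example.com",    # the relying party
    "sub": "8c1f2a...d904",             # pairwise pseudonymous identifier
    "over_18": True,
}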