Privacy Enhanced by User Agent

From IDESG Wiki


Use Case Metadata

Title

Privacy enhancing technology provided by an agent under the user's control.

Status

Use Case Lifecycle Status

Contributed → Working Draft → Committee Review → Compilation → Approval → Publication

This use case has been approved in version 1.2. This page may have been updated since the 1.2 document was approved.

Use Case Category

Privacy, Trust/Assurance, Interoperability

Contributor

Tom Jones

Use Case Content

Use Case Description

Provide sufficient claims to a relying party to allow an online transaction to commence while limiting disclosures to those attributes that the user is willing to share with that party. A user agent is present in all digital transactions to represent a legal entity, the user, to the digital world. Enabling privacy in the digital world requires the existence of a Privacy Enhancing Technology Provider, which can exist either as part of the user agent or as a cloud service; this use case considers the former implementation. In either implementation there will be an actor that accepts claims from a variety of sources, together with a set of privacy policy directives from the user, and crafts a set of claims for the relying party designed specifically to meet both the requirements of the relying party and the user's privacy directives. It is important that both the user and the relying party trust the user agent; in this case a registration authority is described as the means for either to trust the user agent. As always, the relying party has the final say on whether the proffered claims are adequate to allow the transaction to continue.

Actors

  • User: In this case a human being that wants to access services of a relying party and still retain privacy for details that are not needed by the RP.
  • Device Owner: An entity that can set privacy policy on the user agent residing in the user's device. Note that the user will be the owner in the case of consumer devices. For enterprise-owned devices the owner may place restrictions on enterprise-owned data over and above user privacy concerns.
  • User Agent (UA) is a process that assembles a collection of user identities and attributes to be transmitted to an RP in accordance with user or device owner intent.
  • Identity Provider (IdP) contains identities and attributes of users.
  • Relying Party (RP): A service provider that needs a collection of claims to provide that service. The claims may relate to financial responsibility or other user attributes that are required by regulation to meet legal responsibilities. It is beyond the scope of this use case to determine whether the RP actually has any justification in requesting any user attribute at all.
  • Registration Authority (RA) is a service that can register other actors; in this case the RA needs to attest to the trustworthiness of the UA.
  • Identity Ecosystem: a set of conventions for actors to exchange trusted claims. In this case the ecosystem needs to provide a taxonomy of claims requests to be sent from the RP to the UA for user decisions on which attributes to share with the RP.

Goals / User Stories

  1. Compliance with regulations for RPs and IdPs.
  2. Common method for reliably describing and reporting an individual user’s intent.
  3. High comfort level for users that they can selectively share information.

Assumptions

  1. The RP has a relatively clear set of privacy compliance regulations to follow.
  2. Standards will exist that permit the composition of claims by the UA in a format acceptable to the RP.
  3. It is possible for an RA to reliably report to an RP that a UA is trusted to reliably convey user identities and attributes only in accordance with user intent. In the case of a privacy enhancing technology provider in the cloud, the RP may be able to trust it directly.
  4. Individual users have access to a digital device upon which they can depend to host a user agent that can represent their intent in a common digital format.
  5. Registration Authorities exist and have a common protocol and taxonomy to report on UAs to RPs.
  6. Public auditability of the open standards and code of UA systems, in order to check the sharing of data and identity.

Process Flow

  1. The user establishes an account with one or more IdPs. In this case there is no need to distinguish between identity providers and other attribute providers.
  2. The user accesses a web site which requires identity attributes of some sort to continue processing the user's request. The web site thereby becomes a relying party.
  3. The RP uses a standard protocol and taxonomy to request the information needed from the user.
  4. This request for information is intercepted by an agent for the user that can:
    1. Determine if the information is available
    2. Determine if the user has already authorized release to this RP
    3. Display any remaining choices to the user to acquire more attributes or release those already available.
    4. Format the set of requested claims into a response that the RP can evaluate.
    5. Send the response to the RP, which has sole responsibility to determine whether sufficient identity has been proved to grant the requested access.
    6. Repeat these steps until the RP is satisfied or one side gives up.
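The interception-and-filtering step (4.1 through 4.3) can be sketched as follows. This is a minimal illustration, not part of any standard; all names (`build_response`, the dict-based claims request) are assumptions made for the example.

```python
# Sketch of process-flow steps 4.1-4.3: the user agent partitions an RP's
# claims request against the attributes it holds and the user's prior consent.
# Names and data shapes here are illustrative assumptions, not a standard.

def build_response(requested, attributes, consented):
    """Split a claims request into releasable, needs-approval, and unavailable."""
    releasable, needs_approval, unavailable = {}, [], []
    for claim in requested:
        if claim not in attributes:
            unavailable.append(claim)               # 4.1: information not available
        elif claim in consented:
            releasable[claim] = attributes[claim]   # 4.2: release already authorized
        else:
            needs_approval.append(claim)            # 4.3: must ask the user
    return releasable, needs_approval, unavailable

# Example: the RP asks for three claims; the user previously consented to one.
released, ask, missing = build_response(
    requested=["email", "over_21", "ssn"],
    attributes={"email": "user@example.org", "over_21": True},
    consented={"email"},
)
```

Only the `releasable` set is formatted into the response (step 4.4); the claims in `needs_approval` drive the single-screen consent display described under the success scenario.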

File:Jws-Papers-Privacy.png

Success Scenario

  1. Modern devices in common use for connecting users to the internet now come with a root of trust that can be used to report on the health of the device.
  2. User agents are created on a user’s device or in the cloud that can be audited to assure that they report only identity and attribute information the user wishes to release.
  3. A small common taxonomy of user private data is established so that RPs can request information, and users can understand what information has been requested. This model works now for smart phones releasing user data to the internet because a small taxonomy of user information is reported. If the list grows long, the user experience is known to suffer as the display becomes too long for users to quickly scan before they assent. In no case should a user ever be asked for more types of information than can be displayed on a single screen with the acceptance button.
  4. The success metric should be that users are shown to be able to make intelligent choices given the displayed list of fields requested by the RP. Note that in some cases the datum displayed to the user (e.g. date of birth) will not be the same as the claim provided to the RP (e.g. over 21). These cases are especially challenging for the user interface designer.
  5. User choices are collected by the user agent so that if the same information has been requested by the same RP in the past, the user is not continually bothered with the same questions.

Error Conditions

  1. User does not have the credentials required by the relying party. Mitigation: the relying party redirects the user to one or more sources of appropriate credentials.
  2. The user agent loses the trust of the RA and hence of the RP. Mitigation: the user must be given actionable steps to get their agent back into compliance. An “unauthorized” message should never be passed to the user with no remediation action indicated. Recall that in this case the user agent is under user control; in cases where the privacy enhancing technology provider is in the cloud, the user is not part of the remediation process.

Relationships

References and Citations

NSTIC Guiding Principles Considerations

Privacy Considerations

Privacy enhancement is the core of the purpose of this use case. One particularly challenging problem is the case of minors under the age of 13 who are covered by COPPA. Those challenges are left for another use case.

In the following comments PII (personally identifiable information) is used in the broad sense of information that could allow linkage of an online identity to one specific carbon-based life form.


The following points address the concerns of the privacy committee as described on the discussion page:

  1. Several actors get access to the user's private information as a part of regular business operations. Besides the general duty of care described in any identity ecosystem agreement between the parties, the following comments might help in an implementation of this use case:
    1. The Registration Authority (RA) that attests to the trustworthiness of the user agent (UA) will receive information about a piece of code that could be linked to an individual user. That makes the identity of the user agent instance PII that needs the normal protection of PII. Implementers should consider implementations that do not require the RA to have knowledge of all possible relying parties.
    2. The Identity Provider (IdP) must have sufficient information to accept credentials from the user and to authenticate that the user has the right to that particular identity. In a fully protected exchange the IdP should not be able to ascertain which other identity or attribute providers are accessed by the user or which RP is the source of the inquiry.
  2. The user is given the option to select that the user agent (UA) will track their connections to relying parties to reduce the number of times that they are asked to approve release of the same information to the same party. The working assumption is that RPs are reliably identified and trusted to receive the user information. As a result the UA will contain a large amount of information about where the user navigates and what information they have provided to which RPs, not unlike the current situation with cookies in the user's browser. Clearly the UA needs to be trustworthy with respect to this burden. In addition, implementers should consider where the user agent is located and provide adequate controls to protect user privacy.
  3. Claims persist on the UA in the same way that cookies persist on current UAs known as browsers. It is expected that by identifying the responsibility of the UA to the user it will be possible to create compliance criteria for UAs that will allow them to be both useful to the user and respectful of the user's wishes. It is recognized that this is a tough requirement that will require years to get right.
  4. The RP can request any claim that it wishes. As described in the usability section, it is critical that the user be given sufficient information to evaluate the reason for the request, within the stated constraint that all such UX must fit on a single page if we are to expect the user to tolerate the intrusion into their goal, which is to get access to the resources of the RP.
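The consent-tracking behavior in point 2 above can be sketched as a per-RP cache of approved claims, so that repeat requests from a reliably identified RP need no new prompt. The class and method names are assumptions for illustration; a real UA would also have to protect this store, since it records where the user has navigated.

```python
# Illustrative consent cache: the UA remembers which claims the user has
# approved for which RP (keyed by the RP's verified identifier), so the
# same question is not asked twice. Names and structure are assumptions.

class ConsentCache:
    def __init__(self):
        self._grants = {}  # rp_id -> set of approved claim names

    def grant(self, rp_id, claims):
        """Record that the user approved these claims for this RP."""
        self._grants.setdefault(rp_id, set()).update(claims)

    def needs_prompt(self, rp_id, requested):
        """Return only the requested claims not yet approved for this RP."""
        return sorted(set(requested) - self._grants.get(rp_id, set()))

cache = ConsentCache()
cache.grant("https://shop.example", ["email"])
```

Note that the cache is keyed by RP identity, so consent given to one RP never carries over to another, which matches the working assumption that RPs are reliably identified.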

Security Considerations

In general, security is not considered in this use case, as security will be provided by the same types of credentials, tokens, and claims as used in any secure implementation.

User Experience/Usability Considerations

One important part of any use case is the intelligibility of the choices presented to the user. Here it is very important that the user be given only as many decisions as can easily and comprehensibly be displayed on the device in use. In particular it is important that the RP have a taxonomy of requested attributes, or groups of attributes, that can be presented to the user within the scope of a single device page. That implies that the taxonomy of requested fields needs to be limited to those items that the user can sensibly be expected to comprehend.

Interoperability Considerations

This process is designed to interoperate with existing SAML, JWT and other token types. Token composition is not well defined in any extant standard and needs to be addressed by the ecosystem.
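As a rough illustration of the JWT case, the claims the UA assembles can be carried in a compact JWS token per RFC 7519. The sketch below uses only the Python standard library and a symmetric HS256 signature for brevity; a deployed UA would use asymmetric keys and a vetted JOSE library, and, as the text notes, the composition of claims from multiple sources remains unstandardized.

```python
# Minimal JWT (RFC 7519) composition sketch: header.payload.signature,
# each base64url-encoded without padding, signed with HMAC-SHA256.
# For illustration only; real deployments should use a vetted library.
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    """Base64url-encode without the '=' padding, per RFC 7515."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_jwt(claims: dict, key: bytes) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = b64url(hmac.new(key, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"

# Example: the UA releases only a derived claim to the RP.
token = make_jwt({"over_21": True, "aud": "https://rp.example"}, b"shared-secret")
```
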

Domain Expert Working Group Considerations

Financial

Health Care

Derived Requirements