Privacy Profile

From IDESG Wiki



Revision as of 04:52, 22 April 2020

Full Title or Meme

A profile of a possible Privacy configuration as communicated from a Relying Party to a User.

Context

The OECD privacy guidelines have some good definitions and principles.

  • OECD definitions:
  1. "data controller" means a party who, according to domestic law, is competent to decide about the contents and use of personal data regardless of whether or not such data are collected, stored, processed or disseminated by that party or by an agent on its behalf;
  2. "personal data" means any information relating to an identified or identifiable individual (data subject);
  3. "transborder flows of personal data" means movements of personal data across [jurisdictional] borders.
  • PART TWO. BASIC PRINCIPLES OF NATIONAL APPLICATION
  1. Collection Limitation Principle = There should be limits to the collection of personal data and any such data should be obtained by lawful and fair means and, where appropriate, with the knowledge or consent of the data subject.
  2. Data Quality Principle = Personal data should be relevant to the purposes for which they are to be used, and, to the extent necessary for those purposes, should be accurate, complete and kept up-to-date.
  3. Purpose Specification Principle = The purposes for which personal data are collected should be specified not later than at the time of data collection and the subsequent use limited to the fulfilment of those purposes or such others as are not incompatible with those purposes and as are specified on each occasion of change of purpose.
  4. Use Limitation Principle = Personal data should not be disclosed, made available or otherwise used for purposes other than those specified in accordance with [the purpose] except:
    1. with the consent of the data subject; or
    2. by the authority of law.
  • The NIST Privacy Framework includes several resources that define the following functions (applied within the data controller):
  1. Identify = Develop the organizational understanding to manage privacy risk for individuals arising from data processing.
  2. Govern = Develop and implement the organizational governance structure to enable an ongoing understanding of the organization’s risk management priorities that are informed by privacy risk.
  3. Control = Develop and implement appropriate activities to enable organizations or individuals to manage data with sufficient granularity to manage privacy risks.
  4. Communicate = (Here called User Experience and Notification) Develop and implement appropriate activities to enable organizations and individuals to have a reliable understanding and engage in a dialogue about how data are processed and associated privacy risks.
  5. Protect = (Here assumed to be part of transparency.) Develop and implement appropriate data processing safeguards.
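The OECD principles and NIST functions above suggest what a machine-readable Privacy Profile record, as communicated from a Relying Party to a User, might need to carry. A minimal sketch follows; all field names are illustrative assumptions, not part of any published standard:

```python
from dataclasses import dataclass

@dataclass
class PrivacyProfile:
    """Hypothetical record a Relying Party could send to a User.

    Field names are illustrative only; each maps to an OECD principle
    or NIST function described above.
    """
    controller: str                  # OECD "data controller"
    purposes: list                   # Purpose Specification Principle
    attributes_requested: list       # Collection Limitation Principle
    retention_days: int              # Data Quality Principle (kept only as needed)
    transborder_flows: bool          # OECD transborder flow disclosure
    redress_channel: str             # NIST Communicate function

    def permits_use(self, purpose: str) -> bool:
        # Use Limitation Principle: only purposes specified at
        # collection time (or later consented to) are permitted.
        return purpose in self.purposes

profile = PrivacyProfile(
    controller="example-rp.org",
    purposes=["account-recovery", "fraud-prevention"],
    attributes_requested=["email"],
    retention_days=365,
    transborder_flows=False,
    redress_channel="mailto:privacy@example-rp.org",
)
print(profile.permits_use("marketing"))  # False: not a specified purpose
```

The point of the sketch is that purpose specification and use limitation become checkable predicates rather than prose in a notice no one reads.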

Preconditions

  • The primary use case for this profile is a user navigating to the web site of a Relying Party and deciding whether to share personal information with that website.
  • See the wiki page Patient Choice for more information about the different user access methods where privacy consent is established.

Problems

  • Privacy started and remains a realm created, defined, and adjudicated by lawyers, not users. Notably, even the US FTC understands the problem and has posted The Failure of Fair Information Practice Principles (https://www.ftc.gov/system/files/documents/public_comments/2018/12/ftc-2018-0098-d-0036-163372.pdf) on its website. Some quotes:

    Modern data protection law is built on “fair information practice principles” (FIPPS). At their inception in the 1970s and early 1980s, FIPPS were broad, aspirational, and included a blend of substantive (e.g., data quality, use limitation) and procedural (e.g., consent, access) principles. They reflected a wide consensus about the need for broad standards to facilitate both individual privacy and the promise of information flows in an increasingly technology-dependent, global society. As translated into national law in the United States, Europe, and elsewhere during the 1990s and 2000s, however, FIPPS have increasingly been reduced to narrow, legalistic principles (e.g., notice, choice, access, security, and enforcement). These principles reflect a procedural approach to maximizing individual control over data rather than individual or societal welfare. As theoretically appealing as this approach may be, it has proven unsuccessful in practice. Businesses and other data users are burdened with legal obligations while individuals endure an onslaught of notices and opportunities for often limited choice. Notices are frequently meaningless because individuals do not see them or choose to ignore them, they are written in either vague or overly technical language, or they present no meaningful opportunity for individual choice. Trying to enforce notices no one reads has led in the United States to the Federal Trade Commission’s tortured legal logic that such notices create enforceable legal obligations, even if they were not read or relied upon as part of the deal.

    Moreover, choice is often an annoyance or even a disservice to individuals. For example, the average credit report is updated four times a day in the United States. How many people want to be asked to consent each time? Yet how meaningful is consent if it must be given or withheld for all updates as a group? How meaningful is a credit reporting system if individuals can selectively choose …

Solutions

  1. Users can Authenticate in a manner that gives a Relying Party a consistent Identifier that can be used from session to session without the need to share any User Private Information.
  2. To be fully compliant with privacy legislation such as the GDPR or the California Consumer Privacy Act, the Relying Party may first require that the User establish a channel by which the User can be reached for the required Redress and Recovery operations.
  3. Only then should the Relying Party be in a position to request additional Attributes from the User.
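Step 1 above resembles the pairwise pseudonymous identifiers used in OpenID Connect: the Identity Provider derives a distinct, stable identifier per Relying Party, so the RP can recognize a returning User from session to session without receiving anything that links the User across sites. A minimal sketch, where the HMAC construction and all names are assumptions for illustration rather than any provider's actual scheme:

```python
import hmac
import hashlib

def pairwise_identifier(idp_secret: bytes, user_id: str, rp_origin: str) -> str:
    """Derive a stable, Relying-Party-specific identifier for a user.

    The same (user, RP) pair always yields the same value, while
    different RPs receive unlinkable values, so no User Private
    Information is shared just to recognize a returning user.
    """
    message = f"{user_id}|{rp_origin}".encode()
    return hmac.new(idp_secret, message, hashlib.sha256).hexdigest()

secret = b"idp-signing-secret"  # held only by the Identity Provider
a = pairwise_identifier(secret, "alice", "https://rp-one.example")
b = pairwise_identifier(secret, "alice", "https://rp-two.example")
print(a == pairwise_identifier(secret, "alice", "https://rp-one.example"))  # True
print(a == b)  # False: the two RPs cannot correlate Alice by identifier
```

Because the identifier is deterministic for a given (user, RP) pair, the Relying Party gets the consistency it needs before any additional Attributes are requested in step 3.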

User Experience

Notification

Transparency

References