User Experience Trust Metrics
Introduction
UX Metrics should enable measurement of the evolving baseline for participation in the Identity Ecosystem.
The requirements on this page are based on a TFTM presentation from October 2014 and are intended as input to the TFTM process. The following questions are taken from that presentation.
What is the baseline?
Improving the security, privacy, usability, and interoperability of everyday online transactions
What benefits could the everyday consumer see if this baseline was established?
e.g., reduced account compromise through increased use of multifactor authentication; greater user control through notice and consent requirements; etc.
Content
The following are the points where user experience can be measured to determine whether the baseline requirements are met. Note that the first metrics address overall usability, then move into the trust measurements that indicate whether a particular implementation complies with the terms of the IDESG or of the Framework.
How are UX metrics obtained?
Metrics are quantitative measures that can be tracked over time. They come from questions presented to users and from the evaluation of direct observations of users on particular sites. The Wikipedia entry on User Experience Evaluation defines these terms: "User experience (UX) evaluation or User experience assessment (UXA) refers to a collection of methods, skills and tools utilized to uncover how a person perceives a system (product, service, non-commercial item, or a combination of them) before, during and after interacting with it. It is non-trivial to assess user experience since user experience is subjective, context-dependent and dynamic over time."
Since the user's evaluation depends on the specific UX presented on a specific site, a broad measure of the success of the whole ecosystem will only be possible when multiple sites supporting the IDESG ecosystem are widely available and questions about the overall experience can be asked. In the meantime, specific implementations of web sites supporting the IDESG are encouraged. The metrics provided here should be able to act as a base set that allows comparison between different researchers' results. A common base set of metrics would allow the UXC to use independently generated research reports in the compilation of an ecosystem-wide report.
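As one illustration of how a common base set of metrics would let the UXC combine independently generated reports, the sketch below pools per-site results for a single shared metric into an ecosystem-wide figure. The record fields, site names, numbers, and sample-size weighting are assumptions made for the example, not an IDESG specification.

 # Illustrative sketch: pooling independently collected results for one
 # shared metric (e.g., task completion rate) into an ecosystem-wide figure.
 # Site names, values, and sample sizes are made up for the example.
 from dataclasses import dataclass

 @dataclass
 class SiteResult:
     site: str
     metric: str        # name from the common base set, e.g. "task_completion"
     value: float       # measured percentage for this site
     sample_size: int   # number of participants behind the measurement

 def pooled_value(results: list[SiteResult]) -> float:
     """Sample-size-weighted average of one metric across sites."""
     total_n = sum(r.sample_size for r in results)
     return sum(r.value * r.sample_size for r in results) / total_n

 reports = [
     SiteResult("site-a.example", "task_completion", 92.0, 120),
     SiteResult("site-b.example", "task_completion", 78.0, 45),
 ]
 print(round(pooled_value(reports), 1))  # -> 88.2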
(INSERT to flesh out questions)
- Human-centered
- Possibility-driven
- Options-focused
- Expect to be wrong (portfolio approach)
- Iterative
Measurements (Quantitative)
- Can the user accomplish the task they set out to accomplish? (A goal might be 90%; a minimum acceptable level might be 60%. See the threshold sketch after this list.)
- What is the user's System Usability Scale score (John Brooke's SUS)?
- Is the Trustmark discoverable and self-describing? (90%, 70%)
- Does the user feel safer as a result of the appearance of the Trustmark? (99%, 80% of those answering in the affirmative above.)
- Does the user feel that the site is safe overall? (The metric is a comparison of the positives to the negatives.)
- Does the user understand the necessity for a strong identity for their providers?
- Does the user know whether the identity of the provider is strongly bound to a real-world entity?
- Verbatim responses are collected and used for site improvement.
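The goal and minimum-acceptable figures above can be checked mechanically once response rates are tallied. The sketch below is illustrative only; the thresholds are the example figures from the list, and the function is an assumption, not an IDESG-defined procedure.

 # Illustrative sketch: checking a measured percentage against a goal and
 # a minimum acceptable level, using the example thresholds from the list.
 def rate_metric(measured: float, goal: float, minimum: float) -> str:
     """Classify a measured percentage against a goal and a minimum level."""
     if measured >= goal:
         return "meets goal"
     if measured >= minimum:
         return "acceptable"
     return "below minimum"

 # Example: task completion measured at 72% against a 90% goal / 60% minimum.
 print(rate_metric(72.0, goal=90.0, minimum=60.0))  # -> "acceptable"
 print(rate_metric(55.0, goal=90.0, minimum=60.0))  # -> "below minimum"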
The System Usability Scale (Survey)
The SUS is a 10-item questionnaire with 5 response options per item, ranging from Strongly disagree (1) to Strongly agree (5); a scoring sketch follows the list of items.
- I think that I would like to use this system frequently.
- I found the system unnecessarily complex.
- I thought the system was easy to use.
- I think that I would need the support of a technical person to be able to use this system.
- I found the various functions in this system were well integrated.
- I thought there was too much inconsistency in this system.
- I would imagine that most people would learn to use this system very quickly.
- I found the system very cumbersome to use.
- I felt very confident using the system.
- I needed to learn a lot of things before I could get going with this system.
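Since SUS is proposed above as a quantitative measure, the following sketch applies the standard SUS scoring rule: odd-numbered items contribute (response - 1), even-numbered items contribute (5 - response), and the sum is multiplied by 2.5 to give a 0-100 score. The function name and example responses are illustrative.

 # Standard SUS scoring: each of the ten responses is on a 1-5 scale
 # (Strongly disagree = 1, Strongly agree = 5).
 def sus_score(responses: list[int]) -> float:
     if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
         raise ValueError("SUS expects ten responses, each between 1 and 5")
     total = 0
     for i, r in enumerate(responses, start=1):
         # Odd (positively worded) items: response - 1; even items: 5 - response.
         total += (r - 1) if i % 2 == 1 else (5 - r)
     return total * 2.5

 # Example: a fairly positive set of responses yields a score of 80.0.
 print(sus_score([4, 2, 4, 2, 4, 1, 5, 2, 4, 2]))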
Guidelines of Behavior for Designing Surveys and Evaluating Results
Usability researchers should adhere to IRB requirements for the treatment of users and of the information collected from them.
User Interviews (Qualitative Research and Ethnographic Studies)
Guidelines of Behavior for Designing Qualitative Research
Placeholder links
http://www.access-board.gov/guidelines-and-standards/communications-and-it/about-the-section-508-standards/other-resources-links
http://abledata.com/abledata.cfm?CFID=83403419&CFTOKEN=2eec2aeffb602fc3-C88E7ADB-9730-C534-AC45C274BDDACBCC
http://abledata.com/abledata.cfm?pageid=113709&ksectionid=19327
http://www.section508.gov/best-practices