Qualifications
User Testing &
Design Research

January 24, 2013
Who we are

Analytic Design Group Inc (ADGi) is a visionary user experience strategy and design firm that specializes in innovating in digital environments, leveraging in-depth primary research to expose unexamined assumptions. Our work withstands not only the complexity of multiple agendas and intricate implementation but also the scrutiny of the public.

Founded in 2005 on the principle that evidence-based design will always be more powerful than design driven by best practices, we have grown from a single practitioner to a vibrant, collaborative team.

Some of our clients include:
Samsung, Sony, AT&T, Adobe, Nokia, LG, Motorola
Our Services
Key service areas include: design research, user
experience strategy development, interaction design,
communication design, and usability testing. Lately our work
has also included service design considerations. Our projects
can include the full sweep of user experience services (i.e.
user research through strategy and design) or just one
element. Our aim is always to fit the work required to the
need, and we'll work with you to ensure you are getting the
best value from our efforts.

This presentation focuses on our design research and user
testing services.




/ Design Research / / Usability Testing / / Communication Design / / Interaction Design / / User Experience Strategy Development /
Design Research
We use a diverse set of design research methodologies:

Surveys – we have used surveys to establish baseline data
(largely attitudinal), help segment audiences, and, in
some cases, help identify core issues that can be further
explored by other research.

Context-rich group interviews (like marketing focus groups,
but much richer) – the focus groups we run are typically
very rich and usually draw out a great deal of contextual as
well as attitudinal data. We usually ask participants to
complete homework prior to the session (this aids in grounding
the user and supports contextual data gathering), and we
include some form of participatory design exercise that lets
participants tap into their feelings and attitudes quickly.
Design Research
On-site observation (with or without interviews) – this is useful
when we are looking for issues that are process related.

Task analysis – this is usually both an expert review and then a
walkthrough with participants to identify particular pain points with
certain tasks. It often involves both offline and online elements.
[Chart: TASK 1 completion rates]

Expert review/heuristic analysis – this can be a quick and cost-
effective means of identifying user experience and usability issues.
We typically rank the severity of issues identified and can include
an accessibility review in this process.

Card sorting – we have done card sorting exercises in both
one-on-one and group sessions. We've used both open and closed
card sorts and typically use the findings to develop information
architectures.

Diary studies – these are useful when we are looking at processes
that occur over a longer period of time, or at the impact of
certain things over time.
Design Research
In-Situ & Ethnographic

We have conducted numerous ethnographic or in-situ studies on
a wide range of physical and digital products. These are typically
very data rich and result in in-depth, tactical, near-term findings
as well as robust, strategic, longer-term insights. Our clients
report strong ROI on these studies: along with finding solutions
to nagging problems, they can help focus product management
for a year or more.

For example, last year ADGi conducted an ethnographic study for
a mobile carrier on a device experiencing high returns. We were
able to identify key usability issues and service design issues, and
deliver insights about how their customers currently perceived
these devices and were likely to perceive them for the
foreseeable future.
Design Research
Sample Report
User Testing
The range of user testing methods we use includes:

Metrics-based usability studies – the usability studies we do
are quite rich with quantitative (metrics) data as well as
qualitative data. We typically collect task time, performance,
SUS, satisfaction, and hedonic scores.

Remote-moderated usability studies – using tools such as
WebEx (or other screen-sharing tools), we have successfully
conducted remote moderated testing, collecting similar (or the
same) metrics as we do for in-person tests. This is particularly
useful when testing with participants who are geographically
dispersed, or where the user's context heavily influences their
interaction and on-site observation is not possible or feasible.
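The SUS scores mentioned above follow a standard published scoring rule, which can be sketched in a few lines. The example responses here are invented for illustration, not client data:

```python
# Minimal sketch: computing a System Usability Scale (SUS) score from one
# participant's ten 1-5 Likert responses. The standard rule: odd-numbered
# items (positively worded) contribute (response - 1), even-numbered items
# (negatively worded) contribute (5 - response); the sum is scaled by 2.5
# to give a score out of 100.

def sus_score(responses):
    """responses: list of ten integers, 1-5, in questionnaire order."""
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("expected ten Likert responses between 1 and 5")
    total = 0
    for i, r in enumerate(responses):
        if i % 2 == 0:          # items 1, 3, 5, 7, 9
            total += r - 1
        else:                   # items 2, 4, 6, 8, 10
            total += 5 - r
    return total * 2.5

print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # best possible: 100.0
```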
User Testing
Listening-lab-style user testing – this is essentially user testing
without a set task list. There is some hard data we draw out of
these sessions, but mainly they are focused on qualitative data.

Un-moderated usability testing – this is user testing where the
user is in the lab, observed and recorded, but completing the
tasks on their own.

ADGi Field Test – this is a web-based tool we developed in house
that automates a field test: participants are asked via email
whether they wish to participate. If they indicate yes, they are sent
a set of instructions or tasks to complete, along with an NDA
reminder. After a set period of days, participants are sent a
survey to fill out. From a test administration point of view, we can
track all the participants and where they are in the study, get a
graphical view of how they responded to each question, and
download a CSV of the results for additional manipulation.
We've used this tool to test devices and apps.
User Testing
Navigation testing – this is another tool we developed in
house to test navigation structures. Users are asked a series of
questions about the categories and labels under which they would
expect to find certain pieces of information. They are shown
the tree structure for the site and can navigate through it to the
spot where they would expect to find the content. This testing
has been very effective for us in establishing how findable
content on very large sites will be, and in determining the
effectiveness of categorization and labeling schemes.

Concept acceptance testing – this is useful for trying out a
new concept, typically while comparing it to other, more familiar
ones. We've used this on devices when a client wants to
evaluate a new way of navigating or a different form factor.
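The navigation (tree) testing described above yields, per question, the share of participants who land where the content actually lives. A minimal sketch of that scoring, with invented data:

```python
# Sketch: scoring a tree test. For each question we know the "correct"
# node in the navigation tree; a participant's answer is the node where
# they said they would expect to find the content. The question, paths,
# and responses below are made up for illustration.

def tree_test_results(correct, answers):
    """correct: {question: node_path}; answers: one {question: node_path}
    dict per participant. Returns the success rate per question."""
    scores = {}
    for q, target in correct.items():
        hits = sum(1 for a in answers if a.get(q) == target)
        scores[q] = hits / len(answers)
    return scores

correct = {"find return policy": "Support/Policies/Returns"}
answers = [
    {"find return policy": "Support/Policies/Returns"},
    {"find return policy": "Shop/Help"},
    {"find return policy": "Support/Policies/Returns"},
    {"find return policy": "Support/Policies/Returns"},
]
print(tree_test_results(correct, answers))  # {'find return policy': 0.75}
```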
User Testing
Competitive benchmark testing – this is useful when
comparing a product (interface, device, site) against one or
more others. We have used this to set benchmarks for future
comparison as well as for one-off comparisons.

Iterative testing – this is where we test one or at most two
discrete elements with a very small set of users (2 or 3),
make recommendations based on that testing, the development
team makes those changes, and we test again until we see no
need for further changes. We use this method primarily
for games research looking at a particular interaction. While
other clients have asked about this, after discussing it we
have so far determined that the value of this approach does
not warrant the effort and cost for the project at hand.
User Testing
Remote user testing – we have the ability and experience to
execute remote usability testing, inclusive of screen sharing
and audio and video recording.

We have experience conducting remote-moderated usability
sessions as well as focus group sessions. We screen share and
capture (audio and video record) the sessions. We have
found that this type of research can be very cost effective,
and it is especially useful when we are asking participants to
log in to their own accounts, or when they are geographically
dispersed. On occasion we've also found that, with the user
located in their own environment, we are able to glean more
contextual information than we typically can in the lab.
User Testing
Sample Report
Mobile Test Lab

Mobile test lab – we conduct a great deal of testing
on mobile devices, and our lab setup is both flexible
and powerful.

Our testing equipment is deliberately flexible so that
we can set up in a lab environment, a coffee shop, or a
person's home or office. We have designed a very
stable yet flexible camera mount that allows us to
capture a variety of interactions. Assuming we can
connect to stable WiFi, we can also live stream (to
allow remote viewing) outside of a lab environment.

For more information view this presentation:
Mobile Usability: What's Your Strategy
Karyn Zuidinga
Principal & Director of User Experience

604.669.7655
karyn@analyticdesigngroup.com

www.analyticdesigngroup.com
@analytic_design
