The document provides 8 better practices for information architecture (IA) to improve findability:
1. Diagnose the important problems by analyzing usage patterns and prioritizing what matters most.
2. Balance evidence from both quantitative and qualitative research methods.
3. Advocate for the long term by developing project visions and measurable goals.
4. Measure engagement beyond conversions by tracking tasks, trust, and connections between content.
5. Support contextual navigation through content modeling and understanding user desire lines.
6. Improve search across silos by recognizing query patterns and specialized search needs.
7. Combine manual and algorithmic design approaches effectively, treating content like a layered onion.
8. Tune your design over time by moving from time-boxed projects to ongoing processes.
5. Some questions that you probably can’t answer
• Who are your content’s primary audiences?
• What are the five major tasks and needs each has?
• Are you satisfying those tasks and needs?
• What data support your thinking?
• How do you measure success?
12. Why can’t we get
findability right?
• We don’t know how to diagnose
• We don’t know how to measure
• Siloed organizations
• Ill-equipped decision-makers
• Short-term thinking
• Semantic illiteracy
14. Information architecture:
8 better practices for findability
1. Diagnose the important problems
2. Balance your evidence
3. Advocate for the long term
4. Measure engagement
5. Support contextual navigation
6. Improve search across silos
7. Combine design approaches effectively
8. Tune your design over time
32. Balanced research
leads to true insight,
new opportunities
from Christian Rohrer: http://is.gd/95HSQ2
33. Lou’s TABLE OF OVERGENERALIZED DICHOTOMIES: Web Analytics vs. User Experience
• What they analyze: users’ behaviors (what’s happening) vs. users’ intentions and motives (why those things happen)
• What methods they employ: quantitative methods to determine what’s happening vs. qualitative methods for explaining why things happen
• What they’re trying to achieve: helping the organization meet goals (expressed as KPIs) vs. helping users achieve goals (expressed as tasks or topics of interest)
• How they use data: measuring performance (goal-driven analysis) vs. uncovering patterns and surprises (emergent analysis)
• What kind of data they use: statistical data (“real” data in large volumes, full of errors) vs. descriptive data (in small volumes, generated in a lab environment, full of errors)
35. To do: develop a research regimen balanced by time and quadrant
Each week, for example...
• Analyze analytics for trends (Behavioral + Quantitative)
• Task analysis of common needs (Behavioral + Qualitative)
Each month...
• User survey (Attitudinal + Quantitative)
• Exploratory analysis of analytics data (Behavioral + Qualitative)
Each quarter...
• Field study (Behavioral/Attitudinal + Qualitative)
• Card sorting (Attitudinal + Qualitative/Quantitative)
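The regimen above can be sketched as data so you can audit whether all four research quadrants stay covered over a cycle. This is a minimal illustration; the activity names and axis labels mirror the slide, but the structure itself is an assumption, not part of the deck.

```python
# Sketch of the balanced research regimen as data (structure is illustrative).
REGIMEN = {
    "weekly": [
        ("analytics trend review", {"behavioral", "quantitative"}),
        ("task analysis of common needs", {"behavioral", "qualitative"}),
    ],
    "monthly": [
        ("user survey", {"attitudinal", "quantitative"}),
        ("exploratory analytics analysis", {"behavioral", "qualitative"}),
    ],
    "quarterly": [
        ("field study", {"behavioral", "attitudinal", "qualitative"}),
        ("card sorting", {"attitudinal", "qualitative", "quantitative"}),
    ],
}

def covered_quadrants(regimen):
    """Return the set of (how, what) quadrants the regimen touches."""
    quadrants = set()
    for activities in regimen.values():
        for _name, axes in activities:
            for how in axes & {"behavioral", "attitudinal"}:
                for what in axes & {"qualitative", "quantitative"}:
                    quadrants.add((how, what))
    return quadrants

print(sorted(covered_quadrants(REGIMEN)))
```

A gap in the printed set would flag a quadrant your team is neglecting.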
42. The missing metrics
of in-betweenness
• Orientation (“What can I do here?”)
• Engagement (“I like this; do you?”)
• Connection/cross-promotion (“What goes
with this?”)
• Authority (“I trust this”)
• and many more...
43. To do: use a gradual engagement model to isolate and measure tasks
Example: adoption of features; can you measure movement between layers?
Layer 0: User visits the site (unauthenticated; no cookies, no nothing)
Layer 1: User asks the site a question (for example, a search query)
Layer 2: Site asks the user a question (would you like to save this product to a wish list?)
Layer 3: Site suggests something to the user (you might enjoy these products ordered by people like you)
Layer 4: Site acts on the user’s behalf (we’ve gone ahead and saved these products to your account’s list of frequently-ordered items)
More on gradual engagement: http://bit.ly/9hPqyx
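Movement between layers could be measured roughly like this: map observable events to layers, then count each time a user reaches a deeper layer than before. The event names and log format here are assumptions for illustration, not from the deck.

```python
# Sketch: measuring movement between gradual-engagement layers
# from an ordered per-user event log. Event names are hypothetical.
from collections import Counter

EVENT_LAYER = {
    "visit": 0,                   # layer 0: plain, unauthenticated visit
    "search_query": 1,            # layer 1: user asks the site a question
    "wishlist_prompt_accept": 2,  # layer 2: site asks the user a question
    "recommendation_click": 3,    # layer 3: site suggests something
    "auto_reorder_kept": 4,       # layer 4: site acts on the user's behalf
}

def layer_transitions(events):
    """Count how often a user moves from one layer to a deeper one."""
    transitions = Counter()
    deepest = 0
    for event in events:
        layer = EVENT_LAYER.get(event, 0)
        if layer > deepest:
            transitions[(deepest, layer)] += 1
            deepest = layer
    return transitions

session = ["visit", "search_query", "recommendation_click"]
print(layer_transitions(session))  # records the moves 0 to 1, then 1 to 3
```

Aggregating these transition counts across users gives the adoption funnel the slide asks about.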
45. Contextual navigation: your site’s desire lines
Determine through content modeling and site search analytics.
Deep navigation requires content modeling: a better approach to deep IA and content structuring.
46. Important content objects emerge from content modeling (example: BBC)
[Diagram: content objects such as concert calendar, album pages, artist descriptions, TV listings, album reviews, discography, and artist bios, with the content that matters most highlighted]
49. To do: make content modeling a participatory design exercise
• Provide subjects with “de-oriented” samples of content types... and common tasks
• Have them draw “desire lines” and starting points, and identify gaps in content types
• Learn from “think out loud” and by identifying common patterns
• More info: Atherton et al.’s “domain modeling” presentation: http://slidesha.re/fzChQB
57. ...by contextualizing “advanced” features, focusing on revision
Search session patterns (first query, then the revision):
• solar energy → how solar energy works
• solar energy → energy
• solar energy → solar energy charts
• solar energy → explain solar energy
• solar energy → solar energy news
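One way to act on these session patterns is to classify the revision type before deciding which "advanced" feature to surface contextually. The heuristics below (term-set comparison) are an illustrative sketch, not the deck's method.

```python
# Sketch: classifying a search-session revision by comparing term sets.
# The labels and heuristics are illustrative assumptions.
def classify_revision(first, second):
    """Label the second query of a session relative to the first."""
    a, b = set(first.lower().split()), set(second.lower().split())
    if a < b:
        return "specialization"   # terms added: "solar energy" -> "solar energy charts"
    if b < a:
        return "generalization"   # terms dropped: "solar energy" -> "energy"
    if a & b:
        return "reformulation"    # overlapping terms, some swapped
    return "new_topic"            # no shared terms; likely a fresh task

print(classify_revision("solar energy", "solar energy charts"))  # specialization
print(classify_revision("solar energy", "energy"))               # generalization
```

A specialization might trigger faceted filters; a generalization might suggest broader category pages.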
76. To do: treat your content like an onion
Each layer is cumulative; the most important content is at the core.

Layer | Information architecture | Usability | Content strategy
0 | indexed by search engine | leave it alone | leave it alone
1 | tagged by users | squeaky wheel issues addressed | refresh annually
2 | tagged by experts (non-topical tags) | test with a service (e.g., UserTesting.com) | refresh monthly
3 | tagged by experts (topical tags) | “traditional” lab-based user testing | titled according to guidelines
4 | content models for contextual navigation | A/B testing | structured according to schema
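The onion model is essentially a lookup table, and encoding it that way makes the cumulative rule explicit: content at a deeper layer also receives every outer layer's treatment. The field names below are assumptions for illustration.

```python
# Sketch of the content onion as a lookup table (field names assumed).
# Layer 0 is the outer skin; higher layers are closer to the core.
ONION = {
    0: {"ia": "indexed by search engine", "usability": "leave it alone",
        "content": "leave it alone"},
    1: {"ia": "tagged by users", "usability": "squeaky wheel issues addressed",
        "content": "refresh annually"},
    2: {"ia": "tagged by experts (non-topical tags)",
        "usability": "test with a service (e.g., UserTesting.com)",
        "content": "refresh monthly"},
    3: {"ia": "tagged by experts (topical tags)",
        "usability": "traditional lab-based user testing",
        "content": "titled according to guidelines"},
    4: {"ia": "content models for contextual navigation",
        "usability": "A/B testing",
        "content": "structured according to schema"},
}

def treatments(layer):
    """Layers are cumulative: core content gets all outer treatments too."""
    return [ONION[i] for i in range(layer + 1)]

print(len(treatments(2)))  # layer-2 content receives treatments 0 through 2
```

Assigning each content item a layer number then tells every discipline what to do with it.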
86. To do: move from time-boxed projects to ongoing processes
Example: the rolling content inventory
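A rolling content inventory can be as simple as cycling through fixed slices of the inventory on a schedule, so every page is re-audited once per cycle instead of in one giant time-boxed effort. The slicing scheme below is an assumption for illustration.

```python
# Sketch: a rolling content inventory that reviews one slice per week
# and cycles through the whole inventory continuously.
def weekly_slice(urls, week, slices_per_cycle=52):
    """Return the subset of URLs to re-inventory in a given week."""
    return [u for i, u in enumerate(urls)
            if i % slices_per_cycle == week % slices_per_cycle]

inventory = [f"/page-{n}" for n in range(6)]
print(weekly_slice(inventory, week=0, slices_per_cycle=3))  # ['/page-0', '/page-3']
```

After `slices_per_cycle` weeks the cycle wraps around, so the inventory is never more than one cycle stale.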
87. Summary:
8 IA better practices
1. Diagnose the important problems
2. Balance your evidence
3. Advocate for the long term
4. Measure engagement
5. Support contextual navigation
6. Improve search across silos
7. Combine design approaches effectively
8. Tune your design over time