On Monday, November 7, 2016, Smart Chicago Collaborative held the first CUTGroup Collective community call. The goal of the CUTGroup Collective is to convene organizations and institutions in cities to help others establish new CUTGroups, create a new community, and share and learn from one another. For our first community call, we wanted to highlight CUTGroup Detroit's story. Over the last few months, a collaboration across multiple entities invested in Detroit (the City of Detroit, Data Driven Detroit, and Microsoft) recruited for and conducted their first CUTGroup test. On our first call, the team involved talked about their successes and challenges in building CUTGroup Detroit.
Slides were created by the CUTGroup Detroit team, which includes the City of Detroit, Data Driven Detroit, and Microsoft.
3. Meet the Team!
• Smart Chicago: Sonja Marziano, CUTGroup Director and Project Coordinator
• Microsoft Chicago: Shelley Stern Grach, Director of Civic Engagement, and Ivoire Morrell, Civic Tech Fellow
• City of Detroit: Garlin Gilchrist, Detroit Director of Innovation & Emerging Technology, and Joel Howrani Heeres, Director of Open Data and Analysis, Department of Innovation and Technology
• Data Driven Detroit (D3): Erica Raleigh, Executive Director; Noah Urban, Project Lead & Senior Analyst; Kibichii Chelilim, Data Manager & Programmer; Boitshoko Molefhi, MSU InnovateGov Summer Intern; Meghin Mather, Operations Coordinator; and Ayana Rubio, Data Analyst
5. Establishing CUTGroup Detroit: Website Build
• Targets users of all technical backgrounds
• Simple form allows Detroit residents to sign up for CUTGroup Detroit
• Captures addresses for gift card mailings
• http://www.datadrivendetroit.org/cutgroup/
7. Dual Recruitment Strategy: Street & Web
Street Team: Traveled to three different Detroit communities and distributed fliers to residents in those communities. Mapped out organizations to target for recruitment, formed a team, and assigned different roles to the team (social media manager, photographer).
Social media: Provided an intimate experience of the team's recruitment efforts through live tweeting, Facebook posting, and blogging. Partnered with @D3detroit's social media team to promote the initiative on social media.
12. Recruitment Results
• 110 testers after one month of recruiting
• 4 blogs on adventures in recruiting
• Prominent social media presence: 135 Twitter followers
• Location secured for first test: Ford Research & Engagement Center
• Currently: 233 CUTGroup testers
16. Brief Recap
13 testers participated throughout the evening in 1-on-1 proctored user tests. For all testers & proctors, this was their very first CUTGroup test!
85% of testers most often connect to the Internet using a smartphone.
38% of testers had never searched for information about a commercial property or business.
17. Brief Recap
Things that went really well
• Diverse group
• Most testers showed up in their confirmed time slot
• All testers were excited about the property tool
• Useful feedback on both mobile and desktop devices
Things that happened that we weren’t expecting
• Not a single “no-show”, which caused it to be busy at times
• One tester’s concerns with privacy led to no recording
• Proctoring on a mobile device was difficult
18. Brief Recap
• City of Detroit Non-Residential Property Tool
• Developed by Joel Howrani Heeres
Gives Detroit residents information about the current status of a non-residential property, including city department ticketing, ownership, demolition, and other essential information
19. Why use CUTGroup?
“We would like to use the CUTGroup to help us determine which aspects of the Commercial Property tool users like and don't like and which features it doesn't have. The objectives for the application would be that people who like to know the current status of a given building as regards city department ticketing, ownership, demolition, and other processes can find out what they need to find out.”
-- Joel Howrani Heeres, Director of Open Data and Analysis, Detroit's Department of Innovation and Technology
20. Capturing User Feedback
To capture feedback, a Wufoo questionnaire was formulated based on the following areas:
• Background Questions: Questions targeted to gain a better understanding of how the tester would use the developing application
• Usability Testing: Questions that are based around the application's usability (how it works, what should be changed with the user interface, features to add/remove)
• Card Sorting: Questions to gain insight on the features and content desired most by testers
22. First Impressions - Review
Laptop Testers – 9 total
• Several testers did not recognize Detroit City Council District boundaries, and were confused by them.
• 4 testers said the spatial navigation was self-explanatory and felt familiar.
Mobile Testers – 4 total
• 2 testers weren't able to load the page on the first attempt; a third switched from a phone to a tablet because the tool was loading so slowly.
• 2 testers noted the familiarity of maps as a tool, and felt that the spatial navigation functions were intuitive.
23. Ease of Use in Tasks
How easy was it to find a property with multiple
inspections?
5 – Very Easy 46% (6)
4 – Easy 15% (2)
3 – Neutral 15% (2)
2 – Difficult 23% (3)
1 – Very Difficult 0% (0)
How easy was it to find the closest property to Pinegrove Park that is scheduled for demolition?
5 – Very Easy 38% (5)
4 – Easy 15% (2)
3 – Neutral 8% (1)
2 – Difficult 23% (3)
1 – Very Difficult 15% (2)
24. Overall Review – Ease & Fit
Do you think this type of property tool is targeted to you?
• Yes – 46%
• No – 54%
Q: Why not? Typical A: Seems better for planners, developers, people who work in office environments.
Overall, how easy do you think it is to use this property tool?
• Very Easy – 39%
• Easy – 15%
• Neutral – 8%
• Difficult – 23%
• Very Difficult – 15%
26. Landing Page
“It's a little disorienting because I'm looking at the borders of what I'm guessing are the neighborhoods and they're easy to get confused with the streets without looking at the names…Things are kind of blending in the background and there's just a lot of lines.”
Unfamiliarity with Detroit City Council Districts was a hurdle for several testers. Others appreciated the familiarity of the map format and the ease with which they could spatially navigate by using the mouse.
27. Search for a property…
Search results were inconsistently generated when…
- Testers attempted to search by keyword or business name
- Testers did not enter “Detroit, MI” at the end of a query
Testers were confused by…
- Point data: several expected to see polygons
- The frame on the right: it wasn't clear to them that they could click inside the frame to discover additional information
Testers appreciated…
- The amount of data available on a given property
- Street view confirmation of what they were looking at
28. Design Elements
Testers were confused by:
• Data point icons that decreased in size when zooming in
• The color coding of various data points
• Internal navigation within the records pane
• Expectations around polygons vs. points
30. Proctor Feedback
This was a new experience for proctors, too!
A questionnaire can be developed for the next test, to be completed at the end of every user test session.
32. User-recommended updates
Make navigation more user-friendly
- A “how-to” slideshow with screenshots
- Instructions/FAQ for searching
- Index of property-related terms
- Key or legend
Increase visual distinctions
- Delineate neighborhoods/zip codes
- Include street views for all properties
- Visually mark buildings, not data entries
- Highlight property boundaries when selected
33. Card Sorting: Data Sets of Interest
• Ownership – 19%
• Building Characteristics – 14%
• Blight Violations – 11%
• Building Inspections – 11%
• Environmental Testing Results – 8%
• Tax Payment Status – 8%
• Demolitions – 6%
• DFC Future Land Use – 5%
• Insurance – 5%
• Priority Neighborhoods & Commercial Corridors – 5%
• Zoning – 5%
• Building Licenses – 3%
• Building Permits – 0%
34. Final Report
Synthesis and analysis of test results. Assembling the report has informed our perspectives on test protocol.
35. Next Steps
❏ Publish final report
❏ Continue building the tester pool
❏ Evaluate and plan for future tests
NOAH [5min] Ivoire: 1min intro. City of Detroit: 3min intro. D3: 1min intro “Data Driven Detroit drives informed decision-making by providing accessible, high-quality information and analysis. At D3, we bring data together in one place, organize them, keep them updated, and make public data available to all stakeholders. The data and information we provide enable better understanding of how programs can have the greatest impact, and allows them to monitor progress and understand the results of their work”. CUTGroup Detroit follows in the same vein as D3’s Summer Data Workshops, which sought to engage residents and community organizations to expand data literacy and improve the accessibility of information tools.
NOAH [45 s] Website creation and the importance of establishing it before anything else
IVOIRE [1min] Grassroots approach of recruitment, discuss how we wanted to create a positive narrative of Detroit through our social media campaign
NOAH [1-2min] First email was sent to the entire pool. Second email was sent to testers who responded to the first email.
AYANA [2min] It would be interesting to quantify ease of use. For example, AR had a tester who replied “Very Easy” despite taking a rather long time and circuitous path to find the data. Another responded with “Difficult” when they were unable to solve the task.
NOAH [5min] Live feedback from proctors during the call
NOAH [2min] Some reflections: How and when do proctors prompt testers? Do we train proctors to let testers fail a task? Training on proctor transcription rules would help consistency (set norms). Greater use of multiple choice questions will speed up report writing.