This document outlines how Phil Melita, coordinator of marketing and communications at the University of Richmond SPCS, took on data mining, reporting, and analytics independently with limited resources. It describes how he determined the scope of data available from the school's customer relationship management system and Google Analytics, assessed goals around enrollment, attendance, and growth, and selected key performance metrics around inquiries, applications, and attendance. Data was pulled from these systems and presented visually in dashboards, graphs, and regular reports to track progress toward goals and understand trends. The document emphasizes asserting one's influence within an organization to identify which data adds value and creating impactful reports on a regular schedule.
1. OK, I’ll Do It Myself!
Data Mining, Reporting, and Analytics on a Shoestring
Phil Melita
Coordinator, Marketing & Communications
University of Richmond SPCS
July 28, 2015
2. Agenda
• Setting the Scene
• Determining the Scale
• Finding your Space
• Assessing the Challenge
• Picking the Criteria
• Pulling the Information
• Presenting the Data
3. Setting the Scene
Who/What/Where is SPCS
• Private, Liberal-Arts University in Virginia
• One of 5 autonomous Schools with distinct Dean,
tuition, admissions, marketing, etc.
• Degree, Non-Credit, OLLI, Summer
• Almost exclusively classroom-based
• Started with Intelliworks/Radius in 2008
4. Setting the Scene
Knee-deep in data
• Facebook, Fitbit, Apple WATCH, Statcast
• Google Analytics
• How are we doing?
• How is what you’re doing doing?
5. Determining the Scale
SPCS parameters
• 130 inquiries per month
• 70 applications per month
• 47,000 contacts in Radius
• 200 campaigns per year (+800 from comm plans)
• 230 info session attendees per year
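These volumes can be sanity-checked with some quick funnel arithmetic. The monthly figures below come straight from the slide; the helper function and the naive ratio are illustrative, not part of the deck:

```python
# Rough funnel math from the SPCS parameters above.
# The monthly figures are from the slide; the function is illustrative.
def annualize(per_month: float) -> int:
    """Project a monthly volume to a yearly total."""
    return round(per_month * 12)

inquiries_per_month = 130
applications_per_month = 70

annual_inquiries = annualize(inquiries_per_month)        # 1560 per year
annual_applications = annualize(applications_per_month)  # 840 per year

# Naive inquiry-to-application ratio (not a true cohort conversion,
# since a month's applicants need not come from that month's inquiries).
ratio = applications_per_month / inquiries_per_month
print(f"{annual_inquiries} inquiries/yr, {annual_applications} apps/yr, "
      f"ratio = {ratio:.0%}")
```

Even this back-of-the-envelope view shows why inquiries and applications became the anchor metrics later in the deck.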
6. Determining the Scale
SPCS history
• Rollout September 2008
• Initially 5 users, now 10
• Began with degree-program inquiry capture
• Me, Myself, and I
13. Picking the Criteria
Progress toward The Goal
• Measuring interest/responsiveness
• Filling seats/Counting noses
(attendees, registrants, etc.)
• Conversion from stage to stage
• Determining trends
• Key Performance Metrics (KPMs)
• Measurable
• Actionable
• Predictive
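Stage-to-stage conversion, one of the criteria above, can be sketched in a few lines. The stage names and counts here are hypothetical stand-ins, not SPCS data:

```python
# A minimal sketch of stage-to-stage conversion tracking.
# Stage names and counts are hypothetical, not from the deck.
funnel = [
    ("inquiry", 400),
    ("application_started", 180),
    ("application_submitted", 120),
    ("enrolled", 75),
]

def stage_conversions(stages):
    """Return the conversion rate from each stage to the next."""
    return {
        f"{a}->{b}": round(nb / na, 3)
        for (a, na), (b, nb) in zip(stages, stages[1:])
    }

conversions = stage_conversions(funnel)
print(conversions)
```

A metric like this is measurable (it comes straight from counts), actionable (a weak stage points to a specific fix), and, tracked over time, predictive of enrollment.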
14. Picking the Criteria
What does Radius let us see?
• Inquiries
• Applications
• Interactions
• Reservations
• Interest (open rates, click-throughs)
• Cumulative data or date-range analysis
19. Pulling the Information
Create data interactions
• Attendee throughput
• Started-to-Submitted window
• Comm Plan success
• Application time analysis
• Applicant analysis by term (Fall/Spring/Summer)
• Contact creation date and Campaign opens
• Conversion (inquiry-to-applicant)
• Correlations: e.g. Inquiries to Applications
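The last bullet, correlating inquiries with applications, amounts to computing a Pearson correlation over monthly counts. The numbers below are hypothetical, purely to show the mechanics:

```python
import statistics as stats

# Hypothetical monthly counts -- illustrative only, not SPCS data.
inquiries    = [150, 120, 140, 110, 90, 80, 70, 95, 160, 145, 130, 125]
applications = [ 80,  65,  75,  60, 50, 45, 40, 55,  85,  78,  70,  66]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    mx, my = stats.mean(xs), stats.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(inquiries, applications)
print(f"inquiry-to-application correlation: r = {r:.2f}")
```

A strong positive r suggests inquiry volume is a usable leading indicator for application volume, which is exactly the kind of predictive KPM the earlier slide calls for.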
20. Presenting the Data
Getting your point across
• Dashboards
• Infographics
• Graphs
• Regularly-scheduled programming
27. Presenting the Data
Applications by Month
[Chart: applications started per month, Jan–Dec, y-axis 0–300, with separate series for Fall, Spring, and Summer terms]
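The counts behind a chart like this come from bucketing application start dates by term and month. A minimal sketch, assuming a simple list of (term, start_date) records standing in for a Radius export:

```python
from collections import Counter
from datetime import date

# Hypothetical (term, application start date) records --
# a stand-in for an export from Radius, not real SPCS data.
records = [
    ("Fall",   date(2015, 6, 12)),
    ("Fall",   date(2015, 7, 3)),
    ("Spring", date(2015, 11, 20)),
    ("Summer", date(2015, 3, 9)),
    ("Fall",   date(2015, 7, 28)),
]

# Count applications started per (term, month) -- the raw numbers
# behind an "Applications by Month" chart with one series per term.
by_term_month = Counter((term, d.strftime("%b")) for term, d in records)
for (term, month), n in sorted(by_term_month.items()):
    print(f"{term:6} {month}: {n}")
```

From here, each term's counts become one series in the chart, whether the plotting happens in Excel or a charting library.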
28. Summing it Up
Take it to Make it
• Yield to no one: assert yourself and your influence
• Observe your environment
• Uncover institutional goals
• Recognize what KPMs matter
• Optimize data extraction/gathering
• Create reports with impact and meaning
• Keep to a regular reporting schedule
YOU ROCK!