Completion Innovation Challenge Grant Evaluation
Evaluation of the Completion Innovation Challenge Grant
Prepared by: JVA Consulting, LLC
September 2012
Table of Contents
List of Figures ... 2
List of Tables ... 3
Executive Summary ... 5
Methodology ... 21
Findings ... 25
Conclusion ... 53
Appendix A: Student Survey ... 57
Appendix B: Faculty Survey ... 62
Appendix C: Faculty Interview Guide ... 85
List of Figures
Figure 1. Gender for the Entire Sample (n = 1,527) ... 25
Figure 2. Race by Group for the Entire Sample (n = 1,527) ... 26
Figure 3. Gender for Survey Data (n = 153) ... 26
Figure 4. Gender by Group for Student Survey Respondents (n = 153) ... 27
Figure 5. Hours Worked Per Week During the Semester for Student Survey Respondents (n = 153) ... 28
Figure 6. Relationship Status of Survey Respondents (n = 153) ... 28
Figure 7. Faculty Perception of Open Entry-Exit Math Labs Compared to a Traditional Format (n = 7; ACC = 1, PPCC = 6, TSJC = 0) ... 35
Figure 8. Faculty Preference for the Continuation of Open Entry-Exit Math Labs (n = 7; ACC = 1, PPCC = 6, TSJC = 0) ... 35
Figure 9. Faculty Perception of Accelerated and Compressed Courses Compared to a Traditional Format (n = 5; CCA = 0, CCD = 0, FRCC = 5, LCC = 0) ... 40
Figure 10. Faculty Preference for the Continuation of Accelerated and Compressed Courses (n = 5; FRCC = 5, LCC = 0) ... 41
Figure 11. Faculty Perception of Modularized Courses With Diagnostic Assessments Compared to a Traditional Format (n = 3; MCC = 1, NJC = 0, PCC = 2) ... 49
Figure 12. Faculty Preference for the Continuation of Modularization and Diagnostic Assessments (n = 3; MCC = 1, NJC = 0, PCC = 2) ... 50
List of Tables
Table 1. Overview of Math Labs ... 11
Table 2. Overview of Accelerated, Compressed, Contextualized and Mainstreaming ... 12
Table 3. Overview of Online Hybrid Classes ... 13
Table 4. Overview of Modularization and Diagnostic Assessments ... 14
Table 5. Percentage Latino and Not Latino for Entire Sample (n = 1,527) ... 25
Table 6. Percentage Latino and Not Latino for Survey Data (n = 153) ... 27
Table 7. Mean (SD) Age, Number of Children Under 18 and Number of Children Under 18 Living with Respondent for Student Survey Data (n = 153) ... 27
Table 8. General Satisfaction Measures (n = 153) ... 29
Table 9. Student Perception on Indicators of Institutional Quality (n = 153) ... 30
Table 10. Student Ratings of Barriers to Retention (n = 153) ... 31
Table 11. Correlation Between Barriers to Retention and Course Completion and Self-Reported Expectation to Continue College (n = 153) ... 32
Table 12. Comparison of the Characteristics of the Control and Innovation Groups for Open Entry/Exit Math Labs ... 33
Table 13. Results From t-Tests Comparing the Performance of Control Group to Innovation Group for Course Completion and Term GPA for Open Entry/Exit Math Labs ... 34
Table 14. Process Measures for Open Entry-Exit Math Labs (n = 7; ACC = 1, PPCC = 6, TSJC = 0) ... 36
Table 15. Overview of Math Labs ... 38
Table 16. Comparison of the Characteristics of the Control and Innovation Groups for Accelerated, Compressed, Contextualized and Mainstreaming ... 39
Table 17. Results From t-Tests Comparing the Performance of Control Group to Innovation Group for Course Completion and Term GPA for Accelerated, Compressed, Contextualized and Mainstreaming ... 40
Table 18. Process Measures for Accelerated, Compressed, Contextualized and Mainstreaming Courses (n = 5; FRCC = 5, LCC = 0) ... 41
Table 19. Overview of Accelerated, Compressed, Contextualized and Mainstreaming ... 44
Table 20. Comparison of the Characteristics of the Control and Innovation Groups for Online Hybrid Courses ... 45
Table 21. Results From t-Tests Comparing the Performance of Control Group to Innovation Group for Course Completion and Term GPA for Online Hybrid Courses ... 46
Table 22. Overview of Online Hybrid Classes ... 47
Table 23. Comparison of the Characteristics of the Control and Innovation Groups for Modularization and Diagnostic Assessments ... 48
Table 24. Results From t-Tests Comparing the Performance of Control Group to Innovation Group for Course Completion and Term GPA for Modularization and Diagnostic Assessments ... 49
Table 25. Process Measures for Modularization and Diagnostic Assessments (n = 3; MCC = 1, NJC = 0, PCC = 2) ... 50
Table 26. Overview of Modularization and Diagnostic Assessments ... 52
Table 27. Overview of All Innovation Clusters ... 53
Executive Summary
The Colorado Department of Higher Education (CDHE) received a Complete College America (CCA) grant to fund the Completion Innovation Challenge Grant (CICG) project. The CCC project is operated by the Colorado Community College System (CCCS) and seeks to improve college completion rates within CCCS by aligning developmental education (DE) courses with innovative, evidence-based strategies (innovations) and by initiating policy reforms that ensure the state financially rewards institutions that successfully increase the number of college graduates.
This evaluation attempts to answer the following research questions:
• Were the innovations implemented as intended?
• What can the colleges and CCCS learn from the implementation of the seven innovations?
• Are students within innovation DE programs more successful (in terms of graduation, retention and GPA) than those in standard DE programs?
• Which innovations are the most successful (in terms of graduation, retention and GPA)?

This report summarizes the methodology of this evaluation and the findings to date, which include data from the first semester of implementation (spring 2012). A second report will be produced in August of 2013 and will include data from the first three semesters of implementation (spring 2012 through spring 2013). Evaluation will continue beyond the spring of 2013, though at this time it is not entirely clear what form this evaluation will take.1
Innovations
As part of the CICG project, seven innovations in developmental education are being implemented at 12 colleges within the CCCS system (see the full innovations section below for a description of each):
• Open Entry/Exit Math Labs
• Mainstreaming
• Accelerated and Compressed
• Contextualization
• Modularization
• Diagnostic Assessment
• Online Hybrid Courses for Developmental Education
1 The CCA grant that funds these innovations and their evaluation will not fund third-party evaluation beyond the spring of 2013. However, JVA will work with CCCS to ensure evaluation continues in some form beyond this time.
Some of these innovations are being implemented as stand-alone innovations, while others are being implemented in combination. Additionally, several innovations closely overlap in practice.
While there were not sufficient data available to robustly investigate each institution separately, this study investigates the above innovations in four distinct innovation clusters (see the full innovations section below for a description of each):
• Open Entry/Exit Math Labs
• Accelerated, Compressed, Contextualized and Mainstreaming
• Online Hybrid
• Modularization and Diagnostic Assessments
Though there is some variation within each of these clusters, for analytical purposes, they are treated as distinct and mutually exclusive sets of innovative strategies. The institutions within each cluster are presented below.
Open Entry/Exit Math Labs
Three institutions implemented open entry/exit math labs as part of the CCC project:
• Arapahoe Community College (open entry/exit math labs)
• Pikes Peak Community College (open entry math labs)
• Trinidad State Junior College (open entry/exit math labs)
Accelerated, Compressed, Contextualization and Mainstreaming
Four institutions implemented accelerated, compressed, contextualized and/or mainstreaming efforts as part of the CCC project:
• Community College of Aurora (accelerated, compressed and mainstreaming)
• Community College of Denver (accelerated, compressed, mainstreaming and contextualized)
• Front Range Community College (accelerated and compressed)
• Lamar Community College (accelerated and compressed)
Online Hybrid Courses
Two institutions implemented online hybrid courses as part of the CCC project:
• Colorado Community College Online (online hybrid courses)
• Otero Junior College (online hybrid courses)
Modularization and Diagnostic Assessments
Three institutions implemented modularization and diagnostic assessments as part of the CCC project:
• Morgan Community College (diagnostic assessments and math mods)
• Northeastern Junior College (diagnostic assessments and math mods)
• Pueblo Community College (diagnostic assessments and math mods)
Methodology
To answer the research questions, a survey was administered to students to support institutional data derived from the Student Unit Record Data System (SURDS). Additionally, a faculty survey was administered and interviews were conducted with key faculty members. The sections below discuss each of these data sources in more detail, as well as how the control groups were constructed and the limitations of this evaluation.
Data Sources
The data used in this study were gathered from four sources:
• CCCS institutional data—demographics, grades and course completion variables from the Student Unit Record Data System (SURDS).
• Student survey—an electronic survey designed to ascertain student satisfaction with DE programming and to identify challenges DE students experience that may act as barriers to graduation (see Appendix A).
• Faculty survey—an electronic survey designed to ascertain the degree to which faculty/staff members feel each innovation is being implemented as intended and faculty perception of the quality of the innovations (see Appendix B).
• Faculty interviews—phone interviews lasting approximately 15–30 minutes with 16 key faculty and staff members to ascertain the degree to which each innovation is being implemented as intended, what is going well and what could be improved upon (see Appendix C).

Control Group
To build control groups, students in traditional-format DE courses were identified and matched by institution and course—for each innovation course, a corresponding traditional course at the same institution was identified. When this was not possible, a course at a similar institution (similar in terms of size and rural/urban location) was identified. This process ensured that, whenever possible, innovation courses were matched to control courses at the same institution. As such, institutionally specific variables were controlled as much as possible. Finally, within each innovation cluster, control groups were matched to the innovation groups along four demographic factors: (1) gender, (2) ethnicity, (3) race2 and (4) age. In the findings section below, the relative match between control and innovation groups is identified for each innovation cluster.
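For readers who want to see how this kind of matching could be reproduced from SURDS-style student records, the sketch below illustrates the general approach. The file name, column names (institution, course, group, gender, ethnicity, race, age) and data are illustrative assumptions, not the evaluators' actual code or variables.

```python
# Illustrative sketch only: the input file and column names are assumed,
# not taken from the actual SURDS extract used in this evaluation.
import pandas as pd

students = pd.read_csv("surds_extract.csv")  # hypothetical student-level records

innovation = students[students["group"] == "innovation"]
traditional = students[students["group"] == "traditional"]

# For each innovation course, prefer a traditional-format section of the same
# course at the same institution; fall back to sections at other institutions
# offering the course when no same-institution match exists.
pools = []
for (inst, course), _ in innovation.groupby(["institution", "course"]):
    same_inst = traditional[(traditional["institution"] == inst)
                            & (traditional["course"] == course)]
    pools.append(same_inst if not same_inst.empty
                 else traditional[traditional["course"] == course])
control = pd.concat(pools).drop_duplicates()

# Check the relative match on the four demographic factors named above.
for col in ["gender", "ethnicity", "race"]:
    print(col,
          innovation[col].value_counts(normalize=True).round(2).to_dict(),
          control[col].value_counts(normalize=True).round(2).to_dict())
print("mean age:", round(innovation["age"].mean(), 1), round(control["age"].mean(), 1))
```

This mirrors the report's approach of describing, rather than algorithmically enforcing, the relative demographic match within each cluster.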
  
Study Limitations
Though this evaluation provides valuable information on the CCC program, it suffers from some limitations:
• A mismatch between the time horizon of the study and the desired outcomes—college retention is a long-term measure that will most effectively be measured over a longer period of time.
• The ambiguity contained within definitions of these innovations—institutions define and implement the same innovations somewhat differently.
• An inability to make distinctions between similar innovations within clusters.
• Generally small sample sizes limit the generalizability of these findings.
• Not all of the potential benefits associated with these innovations are measured by this evaluation.

Despite these limitations, this evaluation provides valuable information on the progress made by the CICG project. Though these findings cannot be considered conclusive, they do provide a sense of how the project has progressed and what it has accomplished thus far.

Findings
Findings are presented in six sections below: (1) student demographics, (2) student experience, (3) math lab innovation cluster, (4) accelerated, compressed, contextualized and mainstreaming innovation cluster, (5) online hybrid innovation cluster, and (6) modularization and diagnostic assessment innovation cluster.
Student Demographics
Student demographic data for this study are from two sources: (1) institutional data and (2) the student survey. Data from each of these sources are presented below:
• Gender—more than half (55%) of the entire sample is female and just over two-thirds (70%) of survey respondents are female.
• Ethnicity—roughly one-fifth (20.3%) of the entire sample identifies as Latino, as did a slightly smaller proportion of survey respondents (17.0%).
• Race—almost three-fifths (58%) of the entire sample identifies as white, and just over one-fifth (22%) did not identify as any of the available racial categories. Similarly, almost two-thirds (65%) of survey respondents identify as white, which is higher than the sample as a whole. Additionally, 16% did not identify as any of the available racial categories, which is lower than the sample as a whole.
• Age—the mean age for the sample as a whole is 28.06 (SD = 9.676), ranging from 17 years old to 72 years old. Survey respondents are slightly older with a mean age of 31 (SD = 11.256).

2 In these data, ethnicity is treated as a separate concept from race. Ethnicity consists of Latino/non-Latino and race consists of five separate racial categories.

Student Experience
A student survey was administered to get a sense of the student experience, including student satisfaction with DE programming and challenges DE students experience that may act as barriers to graduation. Though these results contain useful findings, the sample is too small to be confident that it is fully representative of all the students in this study.3 As such, extreme caution should be taken when reading these results, as they may not be generalizable to the population at-large (i.e., all students in the study).
The student survey suggests satisfaction is relatively high among CCCS students, with just over four-fifths (81.6%) of survey respondents indicating they were either satisfied or very satisfied with their college experience. Additionally, 95.2% of survey respondents indicated that their college experience met or exceeded their expectations, and nearly three-quarters (72.6%) indicated that they plan on graduating from the college they are attending, while just over half (53.5%) indicated that they plan on transferring to a different college. When results from these two questions are combined, the data show that 91.6% of respondents indicated that they either plan on graduating from the college they are in, and/or they plan on transferring to a different college. Thus, at this point, 8.4% of survey respondents do not anticipate progressing through the system to degree completion.
In addition to the satisfaction measures addressed above, students were asked to agree or disagree with a set of statements related to institutional quality. These data suggest that student perception of institutional quality is generally high. Indeed, on a five-point Likert-type scale where 1 = "Strongly disagree" and 5 = "Strongly agree," for all but three items, mean scores were above 4 (or Agree) and more than 80% of respondents agreed or strongly agreed with the statements. Further, the remaining items had mean scores above 3 (or the neutral point), indicating more agreement than disagreement.
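As an illustration of how such Likert-type items are typically summarized (a per-item mean plus the share of respondents answering 4 or 5), the snippet below uses made-up item labels and ratings; it is not the survey's actual data.

```python
# Illustrative only: made-up ratings on a 1-5 Likert-type scale, not the
# actual institutional-quality survey responses.
ratings = {
    "My college is a high-quality institution": [5, 4, 4, 5, 3],    # hypothetical item
    "Instructors are available when I need help": [4, 4, 5, 4, 2],  # hypothetical item
}

for item, scores in ratings.items():
    mean = sum(scores) / len(scores)
    pct_agree = 100 * sum(s >= 4 for s in scores) / len(scores)  # agree or strongly agree
    print(f"{item}: mean = {mean:.2f}, agree/strongly agree = {pct_agree:.0f}%")
```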
  	
  
The student survey also asked students to indicate the extent to which certain circumstances were barriers to their ability and/or willingness to attend school next semester. Responses were on a four-point Likert-type scale where 1 = "Not a barrier," 2 = "Somewhat of a barrier," 3 = "Moderate barrier" and 4 = "Extreme barrier." As demonstrated by the mean scores (with only one item exceeding a mean score of 2, or somewhat of a barrier), respondents do not seem to see these items as overwhelming barriers to their ability to continue with school next semester.

3 The margin of error for this sample (153 from a population of 1,527) is 7.52% at a 95% confidence level. To attain a more generally acceptable margin of error of 5% while retaining a 95% confidence level, a sample of 308 would have been needed.
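The figures in footnote 3 follow from the standard margin-of-error formula for a proportion with a finite population correction. The short sketch below simply re-derives the 7.52% and n = 308 values; it is illustrative and not the evaluators' own computation.

```python
# Margin of error for a proportion (p = 0.5, 95% confidence) with a finite
# population correction; re-derives the footnote's 7.52% and n = 308 figures.
import math

def margin_of_error(n, N, z=1.96, p=0.5):
    se = math.sqrt(p * (1 - p) / n)       # standard error of the proportion
    fpc = math.sqrt((N - n) / (N - 1))    # finite population correction
    return z * se * fpc

def required_sample(N, e=0.05, z=1.96, p=0.5):
    n0 = (z ** 2) * p * (1 - p) / e ** 2  # sample size for an infinite population
    return math.ceil(n0 / (1 + (n0 - 1) / N))

print(f"{margin_of_error(153, 1527):.2%}")  # ~7.52%
print(required_sample(1527))                # 308
```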
  	
  	
  
Additionally, correlations4 were run with these barriers and both the course completion ratio (ratio of DE courses passed over those attempted) and self-reported continuance (respondent indicating either an intent to graduate and/or transfer to another school). These data indicated that there is no correlation between a student's perception of each barrier and whether or not he or she expects to graduate or transfer to another college. However, there are correlations between student perception of barriers and their course completion ratio. In particular, the following barriers are significantly negatively correlated with course completion:
  
• Amount of time required
• Difficulty of the classes
• Navigating the administration
• The lack of a social scene
• The school's fit with my academic needs
• Cost of school

In other words, as student perception of each of the above barriers rises, the likelihood that he or she passes his or her DE courses drops. Yet, there is no such correlation between student perception of these barriers and their self-reported expectation to continue with college. This suggests that all of the barriers listed in the bullet points above impact student performance (as measured by DE course completion), but that the barriers do not impact student expectations regarding graduation or transfer.
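The negative correlations reported here are Pearson product-moment correlations (see footnote 4 below). The sketch that follows shows how one such coefficient could be computed for a single barrier against the course completion ratio; the values are made-up example data, not the study's survey responses.

```python
# Illustrative only: made-up ratings for one barrier (1 = not a barrier,
# 4 = extreme barrier) and made-up DE course completion ratios
# (courses passed / courses attempted); not the study's data.
from scipy import stats

barrier_rating   = [1, 2, 4, 3, 1, 2, 4, 3, 2, 1]
completion_ratio = [1.0, 1.0, 0.0, 0.5, 1.0, 0.67, 0.33, 0.5, 1.0, 1.0]

r, p_value = stats.pearsonr(barrier_rating, completion_ratio)
print(f"r = {r:.2f}, p = {p_value:.3f}")  # negative r: higher perceived barrier, lower completion
```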
  
Open Entry/Exit Math Labs
(ACC, PPCC and TSJC)
Below (Table 1) is a summary of findings for the math lab innovation cluster (for more complete findings, see the full Open Entry/Exit Math Labs section below).
4 The Pearson product-moment correlation coefficient is a measure of the relationship between two variables; in other words, a measure of the tendency of the variables to increase or decrease together.
Table 1. Overview of Math Labs

Item | Performance
Course Completion Over Control | Significantly Lower
Term GPA Over Control | No Significant Difference
Perception of Innovation Quality (Faculty) | About the Same
Desire to Continue Innovation (Faculty) | Yes
Implemented as Intended (Faculty Perception) | Yes

Key Contextual Notes

Positive Developments in Implementation
• Increases flexibility for students
• Allows appropriate pace (not necessarily faster)
• Mastery of the subject matter (not just pass)
• More friendly for some older students
• Reduces point-in-time student-to-teacher ratios
Ongoing Challenges in Implementation
• Different facility requirements
• Increased administrative complexity
• Increased complexity for instructors
• Insufficient time management (on the part of students)
• "Appropriate pace" ≠ faster
Start-Up Growing Pains
• Messaging issues
• Insufficient training
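The course completion and term GPA rows in Table 1 (and in the corresponding tables for the other clusters) summarize t-tests comparing the control and innovation groups, per the full report's table titles. A minimal sketch of such a comparison is shown below; the GPA values are made up, and the choice of Welch's (unequal-variance) t-test is an assumption rather than the evaluators' documented specification.

```python
# Illustrative independent-samples t-test comparing innovation and control
# groups on term GPA; the GPA values are made up, not the study's data.
from scipy import stats

innovation_gpa = [2.1, 3.0, 2.7, 1.8, 3.3, 2.5, 2.9]
control_gpa    = [2.4, 2.8, 3.1, 2.0, 2.6, 3.2, 2.2]

t_stat, p_value = stats.ttest_ind(innovation_gpa, control_gpa, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # p < .05 would indicate a significant difference
```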
  	
  
	
  
Accelerated, Compressed, Contextualization and Mainstreaming
(CCA, CCD, FRCC and LCC)
Table 2 below summarizes the findings for the accelerated, compressed, contextualized and mainstreaming innovation cluster (for more complete findings, see the full Accelerated, Compressed, Contextualization and Mainstreaming section below).
Table 2. Overview of Accelerated, Compressed, Contextualized and Mainstreaming

Item | Performance
Course Completion Over Control | No Significant Difference
Term GPA Over Control | Significantly Higher
Perception of Innovation Quality (Faculty) | Better
Desire to Continue Innovation (Faculty) | Yes
Implemented as Intended (Faculty Perception) | Yes

Key Contextual Notes

Positive Developments in Implementation
• Allows students to progress more quickly
• Positively impacts student motivation
• Contributes to an improved academic culture
• Increases student autonomy
• Increases curriculum relevance
• Increases student engagement
• Facilitates learning across subjects
Ongoing Challenges in Implementation
• Students' lack of desire to go faster
• Students' lack of ability
• Complexity of administrative logistics
• Less room to adjust to unforeseen issues
• Finding the appropriate pace
• Students' need for additional support
• Occasional tension between contextual projects and basic content
Start-Up Growing Pains
• Messaging issues
• Insufficient training
• Time constraints
  
Online Hybrid Courses
(CCCOnline and OJC)
Table 3 below summarizes the findings for the online hybrid innovation cluster (for more complete findings, see the full Online Hybrid Courses section below).
Table 3. Overview of Online Hybrid Classes

Item | Performance
Course Completion Over Control | No Significant Difference
Term GPA Over Control | No Significant Difference
Perception of Innovation Quality (Faculty) | No Data
Desire to Continue Innovation (Faculty) | No Data
Implemented as Intended (Faculty Perception) | No Data

Key Contextual Notes

Positive Developments in Implementation
• Adds a "personal touch" to online courses
• Expands tutoring within CCCOnline
• Awareness was established
• Access was provided
Start-Up Growing Pains
• Insufficient program definition
• Messaging issues
• Lack of integration
• OJCs largely not utilized
Modularization and Diagnostic Assessments
(MCC, NJC and PCC)
Table 4 below summarizes the findings for the modularization and diagnostic assessments innovation cluster (for more complete findings, see the full Modularization and Diagnostic Assessments section below).
Table 4. Overview of Modularization and Diagnostic Assessments

Item | Performance
Course Completion Over Control | No Significant Difference
Term GPA Over Control | No Significant Difference
Perception of Innovation Quality (Faculty) | Better
Desire to Continue Innovation (Faculty) | Yes
Implemented as Intended (Faculty Perception) | Yes

Key Contextual Notes

Positive Developments in Implementation
• Appropriate pace
• Mastery of the subject matter
• Shorter remediation track
• Instant feedback
• Appropriate placement
Challenges in Implementation
• Increased administrative complexity
• Perception that students are "teaching themselves"
• Lack of computer skills
• Diagnostic testing ≠ shorter remediation track
Start-Up Growing Pains
• Messaging issues

Conclusion in Executive Summary
These data go some distance in answering outcome-related research questions:
• Are students within innovation DE programs more successful (in terms of graduation, retention and GPA) than those in standard DE programs?

It is premature to fully answer this question, but thus far there is not strong evidence to suggest that innovation formats are outperforming traditional formats in terms of retention and GPA. This is not entirely surprising as these measures are largely long-term measures, and CCCS institutions are still in the initial stages of the implementation of these innovations. Additionally, it appears that some innovations provide benefits to students that are not objectively measured by this evaluation.

• Which innovations are the most successful (in terms of graduation, retention, and GPA)?

At this point in the evaluation, the accelerated, compressed, contextualized and mainstreaming innovation cluster is outperforming the other innovations in terms of retention and GPA.
Additionally, these data address the following process-related research questions:
• Were the innovations implemented as intended?

Despite some initial hurdles, and with a few exceptions, these innovations are being implemented largely as originally intended.

• What can the colleges and CCCS learn from the implementation of the seven innovations?

The evaluation of the first semester of the implementation of the CICG project has uncovered a variety of important lessons:
• Messaging is important
• Appropriate pace ≠ faster pace
• There are unanticipated benefits to some of these innovations
• New formats are resource intensive to set up
• New formats have a learning curve
• Innovations are not necessarily replacements for a traditional format

Additionally, several potential barriers to retention not related to these innovations emerged as significantly correlated with course completion (though not with respondents' expectations for graduation or transfer).
These findings are preliminary, and it is far too early to make any conclusive judgments about the success of the innovations implemented as part of the CCC project. Such judgments will come later as data are collected over a longer period of time and these innovations mature. However, the data collected to date suggest that these innovations provide a benefit to students and should continue to be implemented. Despite the benefits, however, these innovations are unlikely to be a panacea for the challenges faced by DE.
Introduction and Background
The Colorado Department of Higher Education (CDHE) received a Complete College America (CCA) grant to fund the Completion Innovation Challenge Grant (CICG) project. The CICG project is operated by the Colorado Community College System (CCCS) and seeks to improve college completion rates within CCCS by aligning developmental education (DE) courses with innovative, evidence-based strategies (innovations) and by initiating policy reforms that ensure the state financially rewards institutions that successfully increase the number of college graduates.
CDHE and CCCS contracted with JVA Consulting, LLC (JVA) to act as a third-party evaluator for the innovation portion of this project. This evaluation attempts to answer the following research questions:
• Were the innovations implemented as intended?
• What can the colleges and CCCS learn from the implementation of the seven innovations?
• Are students within innovation DE programs more successful (in terms of graduation, retention and GPA) than those in standard DE programs?
• Which innovations are the most successful (in terms of graduation, retention and GPA)?

This report summarizes the methodology of this evaluation and the findings to date, which include data from the first semester of implementation (spring 2012). A second report will be produced in August of 2013 and will include data from the first three semesters of implementation (spring 2012 through spring 2013). Evaluation will continue beyond the spring of 2013, though at this time it is not entirely clear what form this evaluation will take.5
This report is organized around four major sections: (1) Introduction and Background, (2) Methodology, (3) Findings and (4) Conclusion. The Introduction and Background section (this section) introduces the CICG project with a focus on the need for the project, the innovations implemented and the institutions involved. The Methodology section discusses the overall design of the evaluation, each of the data sources, the analysis, limitations of the data and steps taken to protect study participants. The Findings section summarizes the key findings from this study, focusing on four areas: student demographics, student experience, process evaluation (were the innovations implemented as intended?) and outcome evaluation (how successful were the innovations?).
5 The CCA grant that funds these innovations and their evaluation will not fund third-party evaluation beyond the spring of 2013. However, JVA will work with CCCS to ensure evaluation continues in some form beyond this time.

The Need for the CICG Project
Students referred to DE courses are at risk of failing to complete their degree—under the current circumstances, half will not even complete their developmental sequence.6 In 2009, 29% of Colorado's college students required remediation in reading, writing or mathematics, and over half (53%) of students attending two-year institutions needed remediation. At current rates, of 100 students enrolled in the lowest level of developmental math, only four will graduate.
In response to this need, the Higher Education Strategic Planning Steering Committee identified remediation redesign as a top priority for Colorado,7 and the Governor's Office and its partners, the Colorado Commission on Higher Education (CCHE), the Colorado Department of Higher Education (CDHE) and the Colorado Community College System (CCCS), propose to increase the number of college graduates while reducing time to completion by transforming the delivery of DE. Thus, the CICG project is aligned with a larger statewide effort to improve retention among students referred to DE courses.

Innovations
As part of the CICG project, seven innovations in developmental education are being implemented at 12 colleges within the CCCS system:
• Open Entry/Exit Math Labs—open entry/exit math labs offer developmental math courses that allow students to work at their own pace and to test independently, while making math mentors available to students as needed.
• Mainstreaming—mainstreaming refers to an approach that allows students who test at the upper range of developmental education to enroll in college-level courses with one additional credit hour to allow them time to strengthen their foundational skills.
• Accelerated and Compressed—accelerated courses alter the scheduling of developmental education such that students can complete required courses faster than the traditional semester sequence. A compressed format (e.g., five-week courses) is one type of accelerated course, though there are others (e.g., combined formats where 030 and 060 courses are instructed concurrently in the same semester).
6 Bailey, T., Jeong, D., & Sung-Woo, C. (2009). Referral, enrollment, and completion in developmental education sequences in community colleges. New York: Community College Research Center, Teachers College, Columbia University.
7 Colorado Department of Higher Education (2010). The degree dividend: Building our economy and preserving our quality of life: Colorado must decide. Colorado's Strategic Plan for Higher Education.
• Contextualization—contextualized courses embed developmental education within the context of program-specific content. Contextualized courses either: (1) relate developmental competencies to career/technical education competencies, or (2) pair developmental education courses with college-level courses.
• Mainstreaming—mainstreaming refers to an approach that allows students to enroll in college-level courses with additional credit hours to allow them time to strengthen their foundational skills and meet developmental education requirements.
• Modularization—modularization refers to the reorganization of developmental education courses into distinct stand-alone modules (or mods) that can be taken in a variety of combinations. Currently, modularization is only available for math courses.
• Diagnostic Assessment—diagnostic assessment refers to a pretest used to determine the appropriate placement of students based on the requirements for entrance into their degree program. Currently, diagnostic assessment is being paired with modular math to help determine the appropriate mods for students to ensure they meet the requirements of their degree program.
• Online Hybrid Courses for Developmental Education—these innovations combine elements of traditional formats with online classes. In particular, live tutors are made available to students taking online courses.

Some of these innovations are being implemented as stand-alone innovations, while others are being implemented in combination. Additionally, several innovations closely overlap in practice.
While there were not sufficient data available to robustly investigate each institution separately, this study investigates the above innovations in four distinct innovation clusters:
• Open Entry/Exit Math Labs—though the precise meaning of "open" differs among institutions, math labs are implemented consistently enough across CCCS institutions to treat them as a distinct group.
• Accelerated, Compressed, Contextualized and Mainstreaming—based on faculty interviews, it appears that in practice these innovations overlap substantially within CCCS institutions. Thus, while they are technically distinct innovations, they are clustered together for analysis.
• Online Hybrid—though the form of online hybrid courses differs, they are similar enough to be treated as a single entity.
• Modularization and Diagnostic Assessments—one of the three institutions implementing modular math is not using diagnostic assessments. However, these innovations are similar enough to be treated as a single cluster.

Though there is some variation within each of these clusters, for analytical purposes they are treated as distinct and mutually exclusive sets of innovative strategies. To give a better sense of the variation within each cluster, descriptions of the specific innovation strategies implemented by each institution are presented for each cluster below.
Open Entry/Exit Math Labs
Three institutions implemented open entry/exit math labs as part of the CCC project:
• Arapahoe Community College (open entry/exit math labs)—at Arapahoe Community College (ACC), developmental math courses offered in a math lab format are referred to as FLEX classes. FLEX classes attempt to provide students with the flexibility to decide when and where they work, though the format is not entirely self-paced, as deadlines are provided (but students can work faster if desired). In FLEX classes, students complete their homework online but complete exams on campus. Additionally, students in FLEX courses have access to the FLEX Lab for face-to-face tutoring and support.
• Pikes Peak Community College (open entry math labs)—at Pikes Peak Community College (PPCC), math labs are open entry, but not open exit. This format allows students to work at their own pace and to come to the lab as needed, where they can access tutors and resources such as practice tests, graphing calculators or instructional DVDs. These math labs are also where students go to take their proctored tests.
• Trinidad State Junior College (open entry/exit math labs)—at Trinidad State Junior College (TSJC), math labs provide self-paced instruction incorporating both the MyMathLab program and more traditional paper-and-pencil instruction. Students are provided deadlines to complete their courses, but are able to flex their time within set time blocks.
Accelerated, Compressed, Contextualized and Mainstreaming
Four institutions implemented accelerated, compressed, contextualized and/or mainstreaming efforts as part of the CCC project:
• Community College of Aurora (accelerated, compressed and mainstreaming)—the Community College of Aurora (CCA) provides a form of accelerated and compressed courses in which two developmental math courses are combined into one, allowing students to complete their developmental requirements in fifteen weeks instead of thirty weeks. To support students working at this accelerated pace, CCA provides extra tutoring opportunities and requires students to attend a minimum amount of tutoring. Additionally, CCA is experimenting with some mainstreaming efforts in which students who would normally be assigned to a developmental reading course (REA 090) are able to meet these requirements within a college level course (BIO 111).
• Community College of Denver (accelerated, compressed, mainstreaming and contextualized)—at the Community College of Denver (CCD), the FastStart program combines accelerated, compressed and mainstreaming approaches to allow students to complete their developmental requirements more quickly. FastStart allows students to complete two levels of classes in a single semester, or to combine higher developmental education courses with college level courses (mainstreaming). In addition to FastStart, CCD students are able to participate in learning communities where they spend an hour per week with their peers and the instructor. Finally, CCD offers a contextualization option in which students apply the skills they learn in their courses to develop a business plan over the course of the semester.
• Front Range Community College (accelerated and compressed)—Front Range Community College (FRCC) initially intended to engage in mainstreaming efforts, but later shifted to an accelerated and compressed format that it implemented at its Westminster campus. To build on this effort, FRCC will implement what was developed at the Westminster campus at the Longmont campus to allow students from multiple campuses (Longmont, Greeley and Fort Collins) to take advantage of the program.
• Lamar Community College (accelerated and compressed)—at Lamar Community College (LCC), developmental education is offered in a compressed format, which combines two classes into one. This shortens the remediation track and allows students to complete their developmental education requirements more quickly.
Online Hybrid Courses
Two institutions implemented online hybrid courses as part of the CCC project:
• Colorado Community College Online (online hybrid courses)—Colorado Community College Online (CCCOnline) provides online courses for colleges throughout CCCS, and as part of the CICG innovations in developmental education, added in-house tutoring services for developmental English and math. This approach is intended to add a personal touch to online courses, and as such, to combine some of the most promising elements of traditional and online courses.
• Otero Junior College (online hybrid courses)—at Otero Junior College (OJC), students in developmental math are able to take advantage of an online hybrid format by combining face-to-face instruction with online tutoring services offered by CCCOnline.
Modularization and Diagnostic Assessments
Three institutions implemented modularization and diagnostic assessments:
• Morgan Community College (diagnostic assessments and math mods)—at Morgan Community College (MCC), students take the ACCUPLACER to identify their appropriate placement within the MyFoundationsLab program. This program provides online activities and assessments, with an interactive guided solution and sample problem for each exercise. This program also provides students with a variety of resources including video lectures, animations and audio files.
• Northeastern Junior College (diagnostic assessments and math mods)—at Northeastern Junior College (NJC), all developmental math has been converted to a modular format. Students take the ACCUPLACER test within the first week of classes to identify which modules are most appropriate for them. Once placed, students complete modules at their own pace, but are provided timelines to guide them through the semester. To advance through the sequence, students have to pass tests. Initially, students had unlimited opportunities to take these tests, but NJC found that this led many of its students to not take the tests seriously. As such, students now have three opportunities to pass each test.
• Pueblo Community College (diagnostic assessments and math mods)—at Pueblo Community College (PCC), students have the option to take their developmental math courses in modules that allow them to work at their own pace utilizing online math software. A diagnostic assessment is used to identify the competency areas in which students have not demonstrated mastery, and the modules that are associated with these areas. Though the course itself is four credit hours, over the course of a semester students can complete the equivalent of up to 13 credit hours' worth of developmental math coursework.
Methodology
This study has two design components: (1) an outcome evaluation component and (2) a process evaluation component. The outcome evaluation was designed to measure what these innovations accomplished last semester, and to answer the questions:
• Are students within innovation DE programs more successful (in terms of graduation, retention and GPA) than those in standard DE programs?
• Which innovations are the most successful (in terms of graduation, retention and GPA)?
To measure these outcomes, a case-control quasi-experimental8 design was used. In this design, student performance within innovation courses, measured by institutional data, was compared to the performance of students within control groups. These data were supplemented by a student survey, which provided additional data to ensure the differences observed between innovation and control courses were not the result of other factors.
The process evaluation component was designed to answer the questions:
• Were the innovations implemented as intended?
• What can the colleges and CCCS learn from the implementation of the seven innovations?
To answer these questions, a survey was administered to students to support institutional data derived from the Student Unit Record Data System (SURDS). Additionally, a faculty survey was administered and interviews were conducted with key faculty. The sections below discuss each of these data sources in more detail, how the control groups were constructed, the analyses that were conducted, and the limitations of the study.

8 Quasi-experimental designs differ from experimental designs in that treatments or interventions are not assigned randomly. In this case, it refers to the fact that students were not randomly assigned to innovation courses.
Data Sources
The data used in this study were gathered from four sources: (1) CCCS institutional data, (2) a student survey, (3) a faculty survey and (4) interviews with faculty. Each of these sources is discussed below.
Institutional Data
JVA worked with CCCS to access institutional data for all students in the study through the Student Unit Record Data System (SURDS). These data included demographics, grades and course completion variables. Student ID numbers were used as unique identifiers to match these data to student survey data (see below). However, in an effort to maximize protection of student data, student numbers were stripped from the data once the match was made and new identifiers were assigned.
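As a rough illustration of this match-and-strip step, the sketch below merges two hypothetical tabular extracts on a shared student ID column and then replaces the IDs with new analysis identifiers. It is not the evaluators' actual code, and the file and column names are assumptions made only for the example.

```python
# Minimal sketch of matching survey responses to institutional records and then
# de-identifying the merged file. File and column names are hypothetical.
import pandas as pd

institutional = pd.read_csv("surds_extract.csv")   # demographics, grades, completion
survey = pd.read_csv("student_survey.csv")         # student survey responses

# Match survey responses to institutional records on the shared student ID.
merged = institutional.merge(survey, on="student_id", how="left")

# Once the match is made, strip the student numbers and assign new identifiers
# so the analysis file no longer contains the original IDs.
merged = merged.drop(columns=["student_id"])
merged.insert(0, "analysis_id", range(1, len(merged) + 1))

merged.to_csv("analysis_file_deidentified.csv", index=False)
```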
  
Student Survey
In partnership with CCCS, JVA designed and administered an electronic survey to all students in the study. This survey was designed to ascertain student satisfaction with DE programming, and to identify challenges DE students experience that may act as barriers to graduation. Student ID numbers were used to match these data to the institutional data collected (see above) but were stripped once the match was made. Additionally, electronic informed consent was acquired as part of the survey (see Appendix A for a copy of the survey).
Faculty Survey
JVA also worked with CCCS to administer an electronic survey to DE faculty and staff to ascertain the degree to which faculty/staff members feel each innovation is being implemented as intended, and their perceptions of the quality of the innovations. Skip logic was used, such that respondents were presented with questions tailored to the innovations their institution is implementing. These data are reported in aggregate, and all personal identifiers (i.e., names and email addresses) were stripped from the data. Additionally, electronic informed consent was acquired as part of the survey (see Appendix B).
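To make the skip-logic idea concrete, the schematic sketch below routes respondents to question blocks based on their institution's innovations. The institution-to-innovation mapping and the question text are illustrative placeholders, not the survey platform's actual configuration.

```python
# Schematic sketch of innovation-specific skip logic: a respondent only sees
# the question blocks for the innovations their institution implements.
# The mapping and question banks below are illustrative placeholders.
INNOVATIONS_BY_INSTITUTION = {
    "ACC": ["math_labs"],
    "PPCC": ["math_labs"],
    "CCD": ["accelerated_compressed"],
    "CCCOnline": ["online_hybrid"],
    "MCC": ["modular_diagnostic"],
}

QUESTION_BANKS = {
    "math_labs": ["How do open entry/exit math labs compare to a traditional format?"],
    "accelerated_compressed": ["How do accelerated and compressed courses compare to a traditional format?"],
    "online_hybrid": ["How effective is the added tutoring support in online courses?"],
    "modular_diagnostic": ["How well do diagnostic assessments place students into modules?"],
}

def questions_for(institution):
    """Return only the question blocks relevant to the respondent's institution."""
    questions = []
    for innovation in INNOVATIONS_BY_INSTITUTION.get(institution, []):
        questions.extend(QUESTION_BANKS[innovation])
    return questions

print(questions_for("PPCC"))
```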
  
Faculty Interviews
JVA conducted phone interviews lasting approximately 15–30 minutes with 16 key faculty and staff members to ascertain the degree to which each innovation is being implemented as intended, what is going well and what could be improved upon. These data are reported in aggregate, and to maintain confidentiality, names are not attached to any of the data. Additionally, verbal informed consent was acquired prior to engaging in the interview (see Appendix C).
Control Group
To build control groups, students in traditional-format DE courses were identified and matched by institution and course—for each innovation course, a corresponding traditional course at the same institution was identified. When this was not possible, a course at a similar institution (similar in terms of size and rural/urban location) was identified. This process ensured that, whenever possible, innovation courses were matched to control courses at the same institution. As such, institutionally specific variables were controlled as much as possible. Finally, within each innovation cluster, control groups were matched to the innovation groups along four demographic factors: (1) gender, (2) ethnicity, (3) race9 and (4) age. In the findings section below, the relative match between control and innovation groups is identified for each innovation cluster.
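A simplified sketch of this matching logic is shown below. It assumes a de-identified student table with format, institution, course and demographic columns (all names hypothetical) and is meant only to illustrate the approach, not to reproduce the evaluators' procedure.

```python
# Simplified sketch of building a matched control group: for each innovation
# course, prefer a traditional-format section of the same course at the same
# institution, then compare the demographic mix of the two groups.
# All column names are hypothetical.
import pandas as pd

students = pd.read_csv("analysis_file_deidentified.csv")

innovation = students[students["format"] == "innovation"]
traditional = students[students["format"] == "traditional"]

controls = []
for (institution, course), _ in innovation.groupby(["institution", "course"]):
    match = traditional[(traditional["institution"] == institution)
                        & (traditional["course"] == course)]
    if match.empty:
        # The study fell back to a similar institution (size, rural/urban);
        # simplified here to the same course at any other institution.
        match = traditional[traditional["course"] == course]
    controls.append(match)

control_group = pd.concat(controls).drop_duplicates()

# Compare the two groups on the four demographic matching factors.
for factor in ["gender", "ethnicity", "race", "age"]:
    comparison = pd.concat(
        [innovation[factor].value_counts(normalize=True).rename("innovation"),
         control_group[factor].value_counts(normalize=True).rename("control")],
        axis=1)
    print(factor)
    print(comparison.round(2))
```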
  

Data Analysis
The quantitative data contained within this report (institutional data and survey data) were analyzed using SPSS (a statistical analysis software package). Analyses included descriptive statistics as well as basic inferential statistics, including chi-squared tests, Pearson's correlations, independent samples t-tests and analysis of variance (ANOVA). General descriptions of these procedures are contained within footnotes to the procedures themselves.
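The analysis scripts themselves are not part of this report; the sketch below simply illustrates the named procedures on synthetic data, to show the kinds of comparisons involved (the actual analyses were run in SPSS, and the variables here are invented for the example).

```python
# Minimal sketch of the inferential procedures named above, run on synthetic
# data purely for illustration (the evaluation itself used SPSS).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
innovation_gpa = rng.normal(2.8, 0.6, 200)   # synthetic GPAs, innovation group
control_gpa = rng.normal(2.6, 0.6, 200)      # synthetic GPAs, control group

# Independent-samples t-test: do mean GPAs differ between the two groups?
t, p = stats.ttest_ind(innovation_gpa, control_gpa)
print(f"t-test: t = {t:.2f}, p = {p:.3f}")

# Chi-squared test: is retention (retained / not retained) independent of group?
contingency = np.array([[160, 40],    # innovation: retained, not retained
                        [140, 60]])   # control:    retained, not retained
chi2, p, dof, _ = stats.chi2_contingency(contingency)
print(f"chi-squared: chi2 = {chi2:.2f}, p = {p:.3f}")

# Pearson correlation: e.g., hours worked per week versus GPA.
hours = rng.uniform(0, 40, 200)
r, p = stats.pearsonr(hours, innovation_gpa)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")

# One-way ANOVA: GPA across more than two groups (e.g., innovation clusters).
cluster_a, cluster_b, cluster_c = innovation_gpa[:70], innovation_gpa[70:140], innovation_gpa[140:]
f, p = stats.f_oneway(cluster_a, cluster_b, cluster_c)
print(f"ANOVA: F = {f:.2f}, p = {p:.3f}")
```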
  
The qualitative data contained within this report (interview notes and open-ended survey questions) were analyzed using NVivo, a qualitative data analysis software package. Using NVivo, JVA analysts coded the data by source (group) and general themes. These original codes were then reworked (clustered and split) until coherent stand-alone themes were produced.
Study Limitations
Though this evaluation provides valuable information on the CICG program, it suffers from some limitations. Chief among these is the mismatch between the time horizon of the study and the desired outcomes. In particular, college retention is a long-term measure that will most effectively be measured over time. As such, it is simply too early to reach any definite conclusions regarding the impact these innovations have on retention (though preliminary findings are contained within). Over time, this limitation will be partially mitigated, as this study will continue in its current form for another 12 months, and then continue in a modified form after that. However, data on long-term student retention will not be available for several years to come, and conclusive data may never become available given the already limited sample size and the relatively large attrition rates experienced by this population.
9 In these data, ethnicity is treated as a separate concept from race. Ethnicity consists of Latino/non-Latino and race consists of five separate racial categories.

	
  

  
Another limitation is the ambiguity contained within definitions of these innovations—institutions define and implement the same innovations somewhat differently. This presents a challenge to evaluation by making it more difficult to draw clear distinctions between innovations. This challenge is exacerbated by the relatively small number of students contained within specific innovations at specific institutions. This study has partially overcome both of these challenges by grouping the innovations into similar innovation clusters, thus providing clearly distinct groups with enough cases to conduct statistical analysis.
This approach has, however, presented an additional limitation. By combining multiple innovations into clusters, the analysis is unable to make distinctions between similar innovations within clusters. For example, though three institutions (ACC, PPCC and TSJC) are implementing open entry/exit math labs, the ways in which they are doing so vary (see the descriptions above). This limitation is particularly stark for the Accelerated, Compressed, Contextualized and Mainstreaming cluster. All of the institutions involved in this cluster engage in some form of accelerated and compressed developmental education, but several include either mainstreaming or contextualization as well. These issues are compounded by the fact that CCCS institutions vary dramatically in size, and as a result, rather large portions of some innovation clusters are made up of single institutions. This means that a particular form of an innovation implemented by a particular institution may disproportionately influence the results observed for a particular cluster.
An additional limitation is the generally small sample sizes for some of the measures. In particular, the samples for data from faculty (the faculty survey and interviews with faculty) are too small to be considered representative of the views of all faculty members. Data for students are less limited, as sample size is not a problem for the institutional data grouped by innovation cluster. However, the sample for the student survey is too small to be considered representative of all students in the study.10 As such, the generalizability of these findings is somewhat limited and extreme caution should be taken when extrapolating from them.
Finally, not all of the potential benefits associated with these innovations are measured by this evaluation. As a particularly cogent example, some innovations appear to be increasing the amount students learn by slowing the pace at which they do so (see the Math Labs section below). While the qualitative data included below are able to partially capture this possibility, the extent to which this is actually occurring is not possible to determine here, as the data needed to draw such conclusions were not collected as part of this evaluation.
10 The margin of error for the student survey sample (a sample of 153 from a population of 1,527) is 7.52% at a 95% confidence level. To attain a more generally acceptable margin of error of 5% while retaining a 95% confidence level, a sample of 308 would have been needed.
  needed.	
  

	
  

  
Despite these limitations, this evaluation provides valuable information on the progress made by the CICG project. Though these findings cannot be considered conclusive, they do provide a sense of how the project has progressed and what it has accomplished thus far.

Findings
Findings are presented in six sections below: (1) student demographics, (2) student experience, (3) math-lab innovation cluster, (4) accelerated, compressed, contextualized and mainstreaming innovation cluster, (5) online hybrid innovation cluster, and (6) modularization and diagnostic assessment innovation cluster.
Student Demographics
Student demographic data for this study are from two sources: (1) institutional data and (2) the student survey. Data from each of these sources are presented below.
Overall (Institutional Data)
Figure 1 below displays the gender breakdown for the entire sample. As shown below, more than half (55%) of the sample is female.
Figure 1. Gender for the Entire Sample (n = 1,527) [pie chart: Female 55%, Male 45%]
Table 5 below displays the ethnic breakdown (Latino, not Latino) for the sample. As shown below, roughly one-fifth (20.3%) of the sample identifies as Latino. Figure 2 below shows the racial breakdown for the sample.
Table 5. Percentage Latino and Not Latino for Entire Sample (n = 1,527)
Ethnicity        Percentage of Sample
Latino           20.3%
Not Latino       79.7%
As shown below, almost three-fifths (58%) of the sample identifies as white, and just over one-fifth (22%) did not identify as any of the available racial categories. Additionally, the mean age for this sample was 28.06 (SD = 9.676), ranging from 17 years old to 72 years old.
Figure 2. Race by Group for the Entire Sample (n = 1,527) [bar chart of racial categories: Asian, Black, Native American, Pacific Islander, White (58%), Mixed, Not ID'd (22%)]
Student Survey Respondents
Figure 3 below displays the gender breakdown for the student survey respondents. As shown below, just over two-thirds (70%) of student survey respondents identified as female. This is a higher proportion than for the sample as a whole.
Figure 3. Gender for Survey Data (n = 153) [pie chart: Female 70%, Male 30%]
Table 6 below displays the ethnic breakdown (Latino, not Latino) for survey respondents. As shown below, just under one-fifth (17.0%) of the sample identifies as Latino. This is slightly lower than for the sample as a whole.
	
  

  
Table 6. Percentage Latino and Not Latino for Survey Data (n = 153)
Ethnicity        Percentage of Sample
Latino           17.0%
Not Latino       83.0%
Figure 4 below shows the racial breakdown for student survey respondents. As shown below, almost two-thirds (65%) of survey respondents identify as white, which is higher than the sample as a whole. Additionally, 16% did not identify as any of the available racial categories, which is lower than the sample as a whole.
Figure 4. Race for Student Survey Respondents (n = 153) [bar chart of racial categories: Asian, Black, Native American, Pacific Islander, White (65%), Mixed, Not ID'd (16%)]
As shown in Table 7, below, the mean age for survey respondents is 31, which is three years older than the average for the sample as a whole. Additionally, survey respondents have an average of almost one child (for both children under 18 generally and children under 18 living with the respondent).
Table 7. Mean (SD) Age, Number of Children Under 18 and Number of Children Under 18 Living with Respondent for Student Survey Data (n = 153)
Item                                           Mean (SD)
Age of respondent                              31.31 (11.26)
Children under 18                              0.97 (1.35)
Children under 18 living with respondent       0.99 (1.19)
	
  

	
  

Prepared	
  by	
  JVA	
  Consulting	
  for	
  Complete	
  College	
  Colorado,	
  September	
  2012	
  
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation
College America Grant Reports- Final Evaluation

Contenu connexe

Tendances

Impact of Corporate Governance on Leverage and Firm performance: Mauritius
Impact of Corporate Governance on Leverage and Firm performance: MauritiusImpact of Corporate Governance on Leverage and Firm performance: Mauritius
Impact of Corporate Governance on Leverage and Firm performance: MauritiusAkshay Ramoogur
 
2014 Namibia Science Report Final-1
2014 Namibia Science Report Final-12014 Namibia Science Report Final-1
2014 Namibia Science Report Final-1Ronald Guthrie
 
Summary report
Summary reportSummary report
Summary reporteconsultbw
 
Attachment revision booklet
Attachment revision bookletAttachment revision booklet
Attachment revision bookletsssfcpsychology
 
Air Force Enhancing Performance Under Stress
Air Force Enhancing Performance Under StressAir Force Enhancing Performance Under Stress
Air Force Enhancing Performance Under StressJA Larson
 
Soa In The Real World
Soa In The Real WorldSoa In The Real World
Soa In The Real Worldssiliveri
 
FasterCures-GivingSmarter-Alzheimers-April2015
FasterCures-GivingSmarter-Alzheimers-April2015FasterCures-GivingSmarter-Alzheimers-April2015
FasterCures-GivingSmarter-Alzheimers-April2015LaTese Briggs, Ph.D.
 
(E book philosophy) [lao tzu] the tao-te-ching (three translations)
(E book philosophy)   [lao tzu] the tao-te-ching (three translations)(E book philosophy)   [lao tzu] the tao-te-ching (three translations)
(E book philosophy) [lao tzu] the tao-te-ching (three translations)GiselePB
 
YCT 4 Chinese Intensive Reading for Kids Y41002 新中小学生汉语考试 sample
YCT 4 Chinese Intensive Reading for Kids Y41002 新中小学生汉语考试 sampleYCT 4 Chinese Intensive Reading for Kids Y41002 新中小学生汉语考试 sample
YCT 4 Chinese Intensive Reading for Kids Y41002 新中小学生汉语考试 sampleLEGOO MANDARIN
 
entergy 2007 final IG
entergy 2007 final IGentergy 2007 final IG
entergy 2007 final IGfinance24
 
Predicting lead poisoning levels in chicago neighborhoods capstone
Predicting lead poisoning levels in chicago neighborhoods capstonePredicting lead poisoning levels in chicago neighborhoods capstone
Predicting lead poisoning levels in chicago neighborhoods capstoneCarlos Ardila
 
NHLBI-Strategic-Vision-2016_FF
NHLBI-Strategic-Vision-2016_FFNHLBI-Strategic-Vision-2016_FF
NHLBI-Strategic-Vision-2016_FFYasin Patel
 

Tendances (20)

MBS paper v2
MBS paper v2MBS paper v2
MBS paper v2
 
Impact of Corporate Governance on Leverage and Firm performance: Mauritius
Impact of Corporate Governance on Leverage and Firm performance: MauritiusImpact of Corporate Governance on Leverage and Firm performance: Mauritius
Impact of Corporate Governance on Leverage and Firm performance: Mauritius
 
Memory revision booklet
Memory revision bookletMemory revision booklet
Memory revision booklet
 
2014 Namibia Science Report Final-1
2014 Namibia Science Report Final-12014 Namibia Science Report Final-1
2014 Namibia Science Report Final-1
 
CSF of ERP Implementations in Sri Lankan Companies
CSF of ERP Implementations in Sri Lankan CompaniesCSF of ERP Implementations in Sri Lankan Companies
CSF of ERP Implementations in Sri Lankan Companies
 
Summary report
Summary reportSummary report
Summary report
 
Attachment revision booklet
Attachment revision bookletAttachment revision booklet
Attachment revision booklet
 
Air Force Enhancing Performance Under Stress
Air Force Enhancing Performance Under StressAir Force Enhancing Performance Under Stress
Air Force Enhancing Performance Under Stress
 
Enrollment Management Plan
Enrollment Management PlanEnrollment Management Plan
Enrollment Management Plan
 
Soa In The Real World
Soa In The Real WorldSoa In The Real World
Soa In The Real World
 
FasterCures-GivingSmarter-Alzheimers-April2015
FasterCures-GivingSmarter-Alzheimers-April2015FasterCures-GivingSmarter-Alzheimers-April2015
FasterCures-GivingSmarter-Alzheimers-April2015
 
Rand rr2504
Rand rr2504Rand rr2504
Rand rr2504
 
(E book philosophy) [lao tzu] the tao-te-ching (three translations)
(E book philosophy)   [lao tzu] the tao-te-ching (three translations)(E book philosophy)   [lao tzu] the tao-te-ching (three translations)
(E book philosophy) [lao tzu] the tao-te-ching (three translations)
 
YCT 4 Chinese Intensive Reading for Kids Y41002 新中小学生汉语考试 sample
YCT 4 Chinese Intensive Reading for Kids Y41002 新中小学生汉语考试 sampleYCT 4 Chinese Intensive Reading for Kids Y41002 新中小学生汉语考试 sample
YCT 4 Chinese Intensive Reading for Kids Y41002 新中小学生汉语考试 sample
 
entergy 2007 final IG
entergy 2007 final IGentergy 2007 final IG
entergy 2007 final IG
 
Predicting lead poisoning levels in chicago neighborhoods capstone
Predicting lead poisoning levels in chicago neighborhoods capstonePredicting lead poisoning levels in chicago neighborhoods capstone
Predicting lead poisoning levels in chicago neighborhoods capstone
 
25 quick formative assessments
25 quick formative assessments25 quick formative assessments
25 quick formative assessments
 
NHLBI-Strategic-Vision-2016_FF
NHLBI-Strategic-Vision-2016_FFNHLBI-Strategic-Vision-2016_FF
NHLBI-Strategic-Vision-2016_FF
 
luan van thac si how to use improve debating skills for third year
luan van thac si how to use improve debating skills for third year luan van thac si how to use improve debating skills for third year
luan van thac si how to use improve debating skills for third year
 
FYP
FYPFYP
FYP
 

En vedette (19)

Accelerated de final
Accelerated de finalAccelerated de final
Accelerated de final
 
Tourney
TourneyTourney
Tourney
 
Tcn 2014 05_01_final
Tcn 2014 05_01_finalTcn 2014 05_01_final
Tcn 2014 05_01_final
 
Security Awareness Presentation Fall 2013
Security Awareness Presentation Fall 2013Security Awareness Presentation Fall 2013
Security Awareness Presentation Fall 2013
 
Basketball tourney
Basketball tourneyBasketball tourney
Basketball tourney
 
Tourney 3
Tourney 3Tourney 3
Tourney 3
 
College Fact Sheets
College Fact SheetsCollege Fact Sheets
College Fact Sheets
 
biodiversidad FAO
biodiversidad FAObiodiversidad FAO
biodiversidad FAO
 
Abstract art
Abstract artAbstract art
Abstract art
 
Redesigned Course Outcomes
Redesigned Course OutcomesRedesigned Course Outcomes
Redesigned Course Outcomes
 
Curriculo
CurriculoCurriculo
Curriculo
 
Hollywood
HollywoodHollywood
Hollywood
 
Rising star2014 vfinal program
Rising star2014 vfinal programRising star2014 vfinal program
Rising star2014 vfinal program
 
TAA COETC Award Letter
TAA COETC Award LetterTAA COETC Award Letter
TAA COETC Award Letter
 
Escuelas pedg.
Escuelas pedg.Escuelas pedg.
Escuelas pedg.
 
Azar
AzarAzar
Azar
 
Diapositivasvideocurriculoescolardeloja 120916234326-phpapp01
Diapositivasvideocurriculoescolardeloja 120916234326-phpapp01Diapositivasvideocurriculoescolardeloja 120916234326-phpapp01
Diapositivasvideocurriculoescolardeloja 120916234326-phpapp01
 
Diapositivas etica profesional
Diapositivas etica profesionalDiapositivas etica profesional
Diapositivas etica profesional
 
Entrevista
EntrevistaEntrevista
Entrevista
 

Similaire à College America Grant Reports- Final Evaluation

Consultants estimating manual
Consultants estimating manualConsultants estimating manual
Consultants estimating manualDaniel Libe
 
Mth201 COMPLETE BOOK
Mth201 COMPLETE BOOKMth201 COMPLETE BOOK
Mth201 COMPLETE BOOKmusadoto
 
Data analytics in education domain
Data analytics in education domainData analytics in education domain
Data analytics in education domainRishi Raj
 
Project proposal 32
Project  proposal 32Project  proposal 32
Project proposal 32Firomsa Taye
 
Citrus College - NASA SL Criticla Design Review
Citrus College - NASA SL Criticla Design ReviewCitrus College - NASA SL Criticla Design Review
Citrus College - NASA SL Criticla Design ReviewJoseph Molina
 
Data analysis with R.pdf
Data analysis with R.pdfData analysis with R.pdf
Data analysis with R.pdfPepeMara
 
Joint Ventures and Partner Selection using AHP.pdf
Joint Ventures and Partner Selection using AHP.pdfJoint Ventures and Partner Selection using AHP.pdf
Joint Ventures and Partner Selection using AHP.pdfausamah
 
L9 for Stress
L9 for StressL9 for Stress
L9 for Stresslami9caps
 
Social Vulnerability Assessment Tools for Climate Change and DRR Programming
Social Vulnerability Assessment Tools for Climate Change and DRR ProgrammingSocial Vulnerability Assessment Tools for Climate Change and DRR Programming
Social Vulnerability Assessment Tools for Climate Change and DRR ProgrammingUNDP Climate
 
Usability of Web Based Financial Services
Usability of Web Based Financial ServicesUsability of Web Based Financial Services
Usability of Web Based Financial ServicesAustin Dimmer
 
Senior Project Report
Senior Project ReportSenior Project Report
Senior Project ReportTimothy Eck
 
Work related learning
Work related learningWork related learning
Work related learningBooksMantra
 
IUCRC_EconImpactFeasibilityReport_FinalFinal
IUCRC_EconImpactFeasibilityReport_FinalFinalIUCRC_EconImpactFeasibilityReport_FinalFinal
IUCRC_EconImpactFeasibilityReport_FinalFinalJay Lee
 
Lao PDR National Rural Sanitation Products Supply Chain Study
Lao PDR National Rural Sanitation Products Supply Chain StudyLao PDR National Rural Sanitation Products Supply Chain Study
Lao PDR National Rural Sanitation Products Supply Chain StudyHetal Patel
 
Staff Report and Recommendations in Value of DER, 10-27-16
Staff Report and Recommendations in Value of DER, 10-27-16Staff Report and Recommendations in Value of DER, 10-27-16
Staff Report and Recommendations in Value of DER, 10-27-16Dennis Phayre
 

Similaire à College America Grant Reports- Final Evaluation (20)

Consultants estimating manual
Consultants estimating manualConsultants estimating manual
Consultants estimating manual
 
Mth201 COMPLETE BOOK
Mth201 COMPLETE BOOKMth201 COMPLETE BOOK
Mth201 COMPLETE BOOK
 
Data analytics in education domain
Data analytics in education domainData analytics in education domain
Data analytics in education domain
 
Project proposal 32
Project  proposal 32Project  proposal 32
Project proposal 32
 
Citrus College - NASA SL Criticla Design Review
Citrus College - NASA SL Criticla Design ReviewCitrus College - NASA SL Criticla Design Review
Citrus College - NASA SL Criticla Design Review
 
Data analysis with R.pdf
Data analysis with R.pdfData analysis with R.pdf
Data analysis with R.pdf
 
Joint Ventures and Partner Selection using AHP.pdf
Joint Ventures and Partner Selection using AHP.pdfJoint Ventures and Partner Selection using AHP.pdf
Joint Ventures and Partner Selection using AHP.pdf
 
L9 for Stress
L9 for StressL9 for Stress
L9 for Stress
 
Social Vulnerability Assessment Tools for Climate Change and DRR Programming
Social Vulnerability Assessment Tools for Climate Change and DRR ProgrammingSocial Vulnerability Assessment Tools for Climate Change and DRR Programming
Social Vulnerability Assessment Tools for Climate Change and DRR Programming
 
Spiral b of master thesis new1
Spiral b  of master thesis   new1Spiral b  of master thesis   new1
Spiral b of master thesis new1
 
Usability of Web Based Financial Services
Usability of Web Based Financial ServicesUsability of Web Based Financial Services
Usability of Web Based Financial Services
 
ACL_Rehab
ACL_RehabACL_Rehab
ACL_Rehab
 
Senior Project Report
Senior Project ReportSenior Project Report
Senior Project Report
 
Work related learning
Work related learningWork related learning
Work related learning
 
IUCRC_EconImpactFeasibilityReport_FinalFinal
IUCRC_EconImpactFeasibilityReport_FinalFinalIUCRC_EconImpactFeasibilityReport_FinalFinal
IUCRC_EconImpactFeasibilityReport_FinalFinal
 
thesis
thesisthesis
thesis
 
Rand rr2637
Rand rr2637Rand rr2637
Rand rr2637
 
Lao PDR National Rural Sanitation Products Supply Chain Study
Lao PDR National Rural Sanitation Products Supply Chain StudyLao PDR National Rural Sanitation Products Supply Chain Study
Lao PDR National Rural Sanitation Products Supply Chain Study
 
2002annualreport[1]
2002annualreport[1]2002annualreport[1]
2002annualreport[1]
 
Staff Report and Recommendations in Value of DER, 10-27-16
Staff Report and Recommendations in Value of DER, 10-27-16Staff Report and Recommendations in Value of DER, 10-27-16
Staff Report and Recommendations in Value of DER, 10-27-16
 

Plus de COCommunityCollegeSystem (20)

High schoolers
High schoolersHigh schoolers
High schoolers
 
Graduate
GraduateGraduate
Graduate
 
Historical
HistoricalHistorical
Historical
 
Ged
GedGed
Ged
 
Concert
ConcertConcert
Concert
 
Tsjc ladies
Tsjc ladiesTsjc ladies
Tsjc ladies
 
Golf and ladies
Golf and ladiesGolf and ladies
Golf and ladies
 
Hot classes
Hot classesHot classes
Hot classes
 
Show
ShowShow
Show
 
Art
ArtArt
Art
 
Valley campus 2 21-15 1
Valley campus 2 21-15 1Valley campus 2 21-15 1
Valley campus 2 21-15 1
 
Tcn 2015 02_24_final 1
Tcn 2015 02_24_final 1Tcn 2015 02_24_final 1
Tcn 2015 02_24_final 1
 
Tsjc foundation
Tsjc foundationTsjc foundation
Tsjc foundation
 
Rep
RepRep
Rep
 
Best
BestBest
Best
 
Acat
AcatAcat
Acat
 
Tough
ToughTough
Tough
 
Valley campus 2 6-15 1
Valley campus 2 6-15 1Valley campus 2 6-15 1
Valley campus 2 6-15 1
 
Three sports stories
Three sports storiesThree sports stories
Three sports stories
 
Sweep
SweepSweep
Sweep
 

Dernier

Arihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdfArihant handbook biology for class 11 .pdf
Arihant handbook biology for class 11 .pdfchloefrazer622
 
MENTAL STATUS EXAMINATION format.docx
MENTAL     STATUS EXAMINATION format.docxMENTAL     STATUS EXAMINATION format.docx
MENTAL STATUS EXAMINATION format.docxPoojaSen20
 
Crayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon ACrayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon AUnboundStockton
 
How to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxHow to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxmanuelaromero2013
 
Grant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingGrant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingTechSoup
 
Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)eniolaolutunde
 
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Sapana Sha
 
Science 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsScience 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsKarinaGenton
 
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdfBASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdfSoniaTolstoy
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxNirmalaLoungPoorunde1
 
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17Celine George
 
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...Krashi Coaching
 
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdfssuser54595a
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentInMediaRes1
 
College America Grant Reports- Final Evaluation

List of Figures
Figure 1. Gender for the Entire Sample (n = 1,527) ... 25
Figure 2. Race by Group for the Entire Sample (n = 1,527) ... 26
Figure 3. Gender for Survey Data (n = 153) ... 26
Figure 4. Gender by Group for Student Survey Respondents (n = 153) ... 27
Figure 5. Hours Worked Per Week During the Semester for Student Survey Respondents (n = 153) ... 28
Figure 6. Relationship Status of Survey Respondents (n = 153) ... 28
Figure 7. Faculty Perception of Open Entry-Exit Math Labs Compared to a Traditional Format (n = 7; ACC = 1, PPCC = 6, TSJC = 0) ... 35
Figure 8. Faculty Preference for the Continuation of Open Entry-Exit Math Labs (n = 7; ACC = 1, PPCC = 6, TSJC = 0) ... 35
Figure 9. Faculty Perception of Accelerated and Compressed Courses Compared to a Traditional Format (n = 5; CCA = 0, CCD = 0, FRCC = 5, LCC = 0) ... 40
Figure 10. Faculty Preference for the Continuation of Accelerated and Compressed Courses (n = 5; FRCC = 5, LCC = 0) ... 41
Figure 11. Faculty Perception of Modularized Courses With Diagnostic Assessments Compared to a Traditional Format (n = 3; MCC = 1, NJC = 0, PCC = 2) ... 49
Figure 12. Faculty Preference for the Continuation of Modularization and Diagnostic Assessments (n = 3; MCC = 1, NJC = 0, PCC = 2) ... 50
List of Tables
Table 1. Overview of Math Labs ... 11
Table 2. Overview of Accelerated, Compressed, Contextualized and Mainstreaming ... 12
Table 3. Overview of Online Hybrid Classes ... 13
Table 4. Overview of Modularization and Diagnostic Assessments ... 14
Table 5. Percentage Latino and Not Latino for Entire Sample (n = 1,527) ... 25
Table 6. Percentage Latino and Not Latino for Survey Data (n = 153) ... 27
Table 7. Mean (SD) Age, Number of Children Under 18 and Number of Children Under 18 Living with Respondent for Student Survey Data (n = 153) ... 27
Table 8. General Satisfaction Measures (n = 153) ... 29
Table 9. Student Perception on Indicators of Institutional Quality (n = 153) ... 30
Table 10. Student Ratings of Barriers to Retention (n = 153) ... 31
Table 11. Correlation Between Barriers to Retention and Course Completion and Self-Reported Expectation to Continue College (n = 153) ... 32
Table 12. Comparison of the Characteristics of the Control and Innovation Groups for Open Entry/Exit Math Labs ... 33
Table 13. Results From t-Tests Comparing the Performance of Control Group to Innovation Group for Course Completion and Term GPA for Open Entry/Exit Math Labs ... 34
Table 14. Process Measures for Open Entry-Exit Math Labs (n = 7; ACC = 1, PPCC = 6, TSJC = 0) ... 36
Table 15. Overview of Math Labs ... 38
Table 16. Comparison of the Characteristics of the Control and Innovation Groups for Accelerated, Compressed, Contextualized and Mainstreaming ... 39
Table 17. Results From t-Tests Comparing the Performance of Control Group to Innovation Group for Course Completion and Term GPA for Accelerated, Compressed, Contextualized and Mainstreaming ... 40
Table 18. Process Measures for Accelerated, Compressed, Contextualized and Mainstreaming Courses (n = 5; FRCC = 5, LCC = 0) ... 41
Table 19. Overview of Accelerated, Compressed, Contextualized and Mainstreaming ... 44
Table 20. Comparison of the Characteristics of the Control and Innovation Groups for Online Hybrid Courses ... 45
Table 21. Results From t-Tests Comparing the Performance of Control Group to Innovation Group for Course Completion and Term GPA for Online Hybrid Courses ... 46
Table 22. Overview of Online Hybrid Classes ... 47
Table 23. Comparison of the Characteristics of the Control and Innovation Groups for Modularization and Diagnostic Assessments ... 48
Table 24. Results From t-Tests Comparing the Performance of Control Group to Innovation Group for Course Completion and Term GPA for Modularization and Diagnostic Assessments ... 49
Table 25. Process Measures for Modularization and Diagnostic Assessments (n = 3; MCC = 1, NJC = 0, PCC = 2) ... 50
Table 26. Overview of Modularization and Diagnostic Assessments ... 52
Table 27. Overview of All Innovation Clusters ... 53
Executive Summary

The Colorado Department of Higher Education (CDHE) received a Complete College America (CCA) grant to fund the Completion Innovation Challenge Grant (CICG) project. The CICG project is operated by the Colorado Community College System (CCCS) and seeks to improve college completion rates within CCCS by aligning developmental education (DE) courses with innovative, evidence-based strategies (innovations) and by initiating policy reforms that ensure the state financially rewards institutions that successfully increase the number of college graduates.

This evaluation attempts to answer the following research questions:
• Were the innovations implemented as intended?
• What can the colleges and CCCS learn from the implementation of the seven innovations?
• Are students within innovation DE programs more successful (in terms of graduation, retention and GPA) than those in standard DE programs?
• Which innovations are the most successful (in terms of graduation, retention and GPA)?

This report summarizes the methodology of this evaluation and the findings to date, which include data from the first semester of implementation (spring 2012). A second report will be produced in August of 2013 and will include data from the first three semesters of implementation (spring 2012 through spring 2013). Evaluation will continue beyond the spring of 2013, though at this time it is not entirely clear what form this evaluation will take.[1]

Innovations

As part of the CICG project, seven innovations in developmental education are being implemented at 12 colleges within the CCCS system (see the full innovations section below for a description of each):
• Open Entry/Exit Math Labs
• Mainstreaming
• Accelerated and Compressed
• Contextualization
• Modularization
• Diagnostic Assessment
• Online Hybrid Courses for Developmental Education

[1] The CCA grant that funds these innovations and their evaluation will not fund third-party evaluation beyond the spring of 2013. However, JVA will work with CCCS to ensure evaluation continues in some form beyond this time.
Some of these innovations are being implemented as stand-alone innovations, while others are being implemented in combination. Additionally, several innovations closely overlap in practice. While there were not sufficient data available to robustly investigate each institution separately, this study investigates the above innovations in four distinct innovation clusters (see the full innovations section below for a description of each):
• Open Entry/Exit Math Labs
• Accelerated, Compressed, Contextualized and Mainstreaming
• Online Hybrid
• Modularization and Diagnostic Assessments

Though there is some variation within each of these clusters, for analytical purposes, they are treated as distinct and mutually exclusive sets of innovative strategies. The institutions within each cluster are presented below.

Open Entry/Exit Math Labs
Three institutions implemented open entry/exit math labs as part of the CCC project:
• Arapahoe Community College (open entry/exit math labs)
• Pikes Peak Community College (open entry math labs)
• Trinidad State Junior College (open entry/exit math labs)

Accelerated, Compressed, Contextualization and Mainstreaming
Four institutions implemented accelerated, compressed, contextualized and/or mainstreaming efforts as part of the CCC project:
• Community College of Aurora (accelerated, compressed and mainstreaming)
• Community College of Denver (accelerated, compressed, mainstreaming and contextualized)
• Front Range Community College (accelerated and compressed)
• Lamar Community College (accelerated and compressed)

Online Hybrid Courses
Two institutions implemented online hybrid courses as part of the CCC project:
• Colorado Community College Online (online hybrid courses)
• Otero Junior College (online hybrid courses)
Modularization and Diagnostic Assessments
Three institutions implemented modularization and diagnostic assessments as part of the CCC project:
• Morgan Community College (diagnostic assessments and math mods)
• Northeastern Junior College (diagnostic assessments and math mods)
• Pueblo Community College (diagnostic assessments and math mods)

Methodology

To answer the research questions, a survey was administered to students to support institutional data derived from the Student Unit Record Data System (SURDS). Additionally, a faculty survey was administered and interviews were conducted with key faculty members. The sections below discuss each of these data sources in more detail, as well as how the control groups were constructed and the limitations of this evaluation.

Data Sources

The data used in this study were gathered from four sources:
• CCCS institutional data—demographics, grades and course completion variables from the Student Unit Record Data System (SURDS).
• Student survey—an electronic survey designed to ascertain student satisfaction with DE programming and to identify challenges DE students experience that may act as barriers to graduation (see Appendix A).
• Faculty survey—an electronic survey designed to ascertain the degree to which faculty/staff members feel each innovation is being implemented as intended and faculty perception of the quality of the innovations (see Appendix B).
• Faculty interviews—phone interviews lasting approximately 15–30 minutes with 16 key faculty and staff members to ascertain the degree to which each innovation is being implemented as intended, what is going well and what could be improved upon (see Appendix C).

Control Group

To build control groups, students in traditional-format DE courses were identified and matched by institution and course—for each innovation course, a corresponding traditional course at the same institution was identified. When this was not possible, a course at a similar institution (similar in terms of size and rural/urban location) was identified. This process ensured that, whenever possible, innovation courses were matched to control courses at the same institution. As such, institutionally specific variables were controlled as much as possible. Finally, within each innovation cluster, control groups were matched to the innovation groups along four demographic factors: (1) gender, (2) ethnicity, (3) race[2] and (4) age.

[2] In these data, ethnicity is treated as a separate concept from race. Ethnicity consists of Latino/non-Latino and race consists of five separate racial categories.
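The matching procedure described above was carried out by the evaluators; purely as an illustration, the sketch below shows the general logic in Python/pandas. The column names (institution, course_code, size_band, locale, section_id) are hypothetical and are not drawn from the SURDS data.

# Illustrative sketch only: shows the general course-matching logic described
# above with hypothetical column names, not the evaluators' actual procedure.
import pandas as pd

def match_controls(innovation_courses: pd.DataFrame, traditional_courses: pd.DataFrame) -> pd.DataFrame:
    """For each innovation course, pick a traditional-format section of the same
    DE course at the same institution; fall back to a 'similar' institution."""
    matches = []
    for _, course in innovation_courses.iterrows():
        same_inst = traditional_courses[
            (traditional_courses["institution"] == course["institution"])
            & (traditional_courses["course_code"] == course["course_code"])
        ]
        if not same_inst.empty:
            control = same_inst.iloc[0]
        else:
            # Fall back to an institution of similar size and rural/urban location.
            similar = traditional_courses[
                (traditional_courses["course_code"] == course["course_code"])
                & (traditional_courses["size_band"] == course["size_band"])
                & (traditional_courses["locale"] == course["locale"])
            ]
            control = similar.iloc[0] if not similar.empty else None
        matches.append({
            "innovation_section": course["section_id"],
            "control_section": None if control is None else control["section_id"],
        })
    return pd.DataFrame(matches)

Demographic balance on gender, ethnicity, race and age would then be checked for the matched groups, as reported in the comparison tables for each innovation cluster.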
In the findings section below, the relative match between control and innovation groups is identified for each innovation cluster.

Study Limitations

Though this evaluation provides valuable information on the CCC program, it suffers from some limitations:
• A mismatch between the time horizon of the study and the desired outcomes—college retention is a long-term measure that will most effectively be measured over a longer period of time.
• The ambiguity contained within definitions of these innovations—institutions define and implement the same innovations somewhat differently.
• An inability to make distinctions between similar innovations within clusters.
• Generally small sample sizes limit the generalizability of these findings.
• Not all of the potential benefits associated with these innovations are measured by this evaluation.

Despite these limitations, this evaluation provides valuable information on the progress made by the CICG project. Though these findings cannot be considered conclusive, they do provide a sense of how the project has progressed and what it has accomplished thus far.

Findings

Findings are presented in six sections below: (1) student demographics, (2) student experience, (3) math lab innovation cluster, (4) accelerated, compressed, contextualized and mainstreaming innovation cluster, (5) online hybrid innovation cluster, and (6) modularization and diagnostic assessment innovation cluster.

Student Demographics

Student demographic data for this study are from two sources: (1) institutional data and (2) the student survey. Data from each of these sources are presented below:
• Gender—more than half (55%) of the entire sample is female and just over two-thirds (70%) of survey respondents are female.
• Ethnicity—roughly one-fifth (20.3%) of the entire sample identifies as Latino, as did a slightly smaller proportion of survey respondents (17.0%).
• Race—almost three-fifths (58%) of the entire sample identifies as white, and just over one-fifth (22%) did not identify as any of the available racial categories. Similarly, almost two-thirds (65%) of survey respondents identify as white, which is higher than the sample as a whole. Additionally, 16% did not identify as any of the available racial categories, which is lower than the sample as a whole.
• Age—the mean age for the sample as a whole is 28.06 (SD = 9.676), ranging from 17 years old to 72 years old. Survey respondents are slightly older, with a mean age of 31 (SD = 11.256).

Student Experience

A student survey was administered to get a sense of the student experience, including student satisfaction with DE programming and challenges DE students experience that may act as barriers to graduation. Though these results contain useful findings, the sample is too small to be confident that it is fully representative of all the students in this study.[3] As such, extreme caution should be taken when reading these results, as they may not be generalizable to the population at large (i.e., all students in the study).

The student survey suggests satisfaction is relatively high among CCCS students, with just over four-fifths (81.6%) of survey respondents indicating they were either satisfied or very satisfied with their college experience. Additionally, 95.2% of survey respondents indicated that their college experience met or exceeded their expectations, and nearly three-quarters (72.6%) indicated that they plan on graduating from the college they are attending, while just over half (53.5%) indicated that they plan on transferring to a different college. When results from these two questions are combined, the data show that 91.6% of respondents indicated that they either plan on graduating from the college they are in and/or plan on transferring to a different college. Thus, at this point, 8.4% of survey respondents do not anticipate progressing through the system to degree completion.

In addition to the satisfaction measures addressed above, students were asked to agree or disagree with a set of statements related to institutional quality. These data suggest that student perception of institutional quality is generally high. Indeed, on a five-point Likert-type scale where 1 = "Strongly disagree" and 5 = "Strongly agree," for all but three items, mean scores were above 4 (or "Agree") and more than 80% of respondents agreed or strongly agreed with the statements. Further, the remaining items had mean scores above 3 (or the neutral point), indicating more agreement than disagreement.

[3] The margin of error for this sample (153 from a population of 1,527) is 7.52% at a 95% confidence level. To attain a more generally acceptable margin of error of 5% while retaining a 95% confidence level, a sample of 308 would have been needed.
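The figures in footnote 3 follow from the standard margin-of-error formula for a proportion with a finite population correction. The short Python sketch below is an illustrative check only (it assumes the conventional worst-case proportion p = 0.5 and a 95% confidence z of 1.96); it is not part of the original analysis.

# Illustrative check of footnote 3 (not part of the original analysis).
# Margin of error for a sample of n from a finite population N at 95% confidence,
# assuming the conventional worst-case proportion p = 0.5.
import math

def margin_of_error(n: int, N: int, p: float = 0.5, z: float = 1.96) -> float:
    se = math.sqrt(p * (1 - p) / n)      # standard error of a proportion
    fpc = math.sqrt((N - n) / (N - 1))   # finite population correction
    return z * se * fpc

print(round(margin_of_error(153, 1527) * 100, 2))  # ~7.5, matching the reported 7.52%
print(round(margin_of_error(308, 1527) * 100, 2))  # ~5.0, the sample size cited for a 5% margin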
The student survey also asked students to indicate the extent to which certain circumstances were barriers to their ability and/or willingness to attend school next semester. Responses were on a four-point Likert-type scale where 1 = "Not a barrier," 2 = "Somewhat of a barrier," 3 = "Moderate barrier" and 4 = "Extreme barrier." As demonstrated by the mean scores (with only one item exceeding a mean score of 2, or "somewhat of a barrier"), respondents do not seem to see these items as overwhelming barriers to their ability to continue with school next semester.

Additionally, correlations[4] were run between these barriers and both the course completion ratio (the ratio of DE courses passed over those attempted) and self-reported continuance (the respondent indicating either an intent to graduate and/or transfer to another school). These data indicated that there is no correlation between a student's perception of each barrier and whether or not he or she expects to graduate or transfer to another college. However, there are correlations between student perception of barriers and their course completion ratio. In particular, the following barriers are significantly negatively correlated with course completion:
• Amount of time required
• Difficulty of the classes
• Navigating the administration
• The lack of a social scene
• The school's fit with my academic needs
• Cost of school

In other words, as student perception of each of the above barriers rises, the likelihood that he or she passes his or her DE courses drops. Yet, there is no such correlation between student perception of these barriers and their self-reported expectation to continue with college. This suggests that all of the barriers listed in the bullet points above impact student performance (as measured by DE course completion), but that the barriers do not impact student expectations regarding graduation or transfer.

[4] The Pearson product-moment correlation coefficient is a measure of the relationship between two variables; in other words, a measure of the tendency of the variables to increase or decrease together.
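The report's correlations were run in SPSS. As an illustration only, an equivalent Pearson correlation between one barrier rating and the course completion ratio could be computed as in the sketch below; the data frame and column names are hypothetical.

# Illustrative sketch of the barrier/completion correlation described above.
# Column names and data are hypothetical; the report's analysis was run in SPSS.
import pandas as pd
from scipy.stats import pearsonr

# completion_ratio = DE courses passed / DE courses attempted (0.0 to 1.0)
# barrier ratings use the 1-4 scale described in the text
df = pd.DataFrame({
    "completion_ratio": [1.0, 0.5, 0.0, 1.0, 0.67, 0.33],
    "cost_of_school":   [1,   3,   4,   1,   2,    4],
})

r, p_value = pearsonr(df["cost_of_school"], df["completion_ratio"])
print(f"r = {r:.2f}, p = {p_value:.3f}")
# A negative r with p < .05 would indicate that students who rate this barrier
# higher tend to complete a smaller share of their DE courses.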
Open Entry/Exit Math Labs (ACC, PPCC and TSJC)

Below (Table 1) is a summary of findings for the math lab innovation cluster (for more complete findings, see the full Open Entry/Exit Math Labs section below).

Table 1. Overview of Math Labs
• Course Completion Over Control: Significantly Lower
• Term GPA Over Control: No Significant Difference
• Perception of Innovation Quality (Faculty): About the Same
• Desire to Continue Innovation (Faculty): Yes
• Implemented as Intended (Faculty Perception): Yes

Key Contextual Notes
Positive Developments in Implementation
• Increases flexibility for students
• Allows appropriate pace (not necessarily faster)
• Mastery of the subject matter (not just pass)
• More friendly for some older students
• Reduces point-in-time student-to-teacher ratios
Ongoing Challenges in Implementation
• Different facility requirements
• Increased administrative complexity
• Increased complexity for instructors
• Insufficient time management (on the part of students)
• "Appropriate pace" ≠ faster
Start-Up Growing Pains
• Messaging issues
• Insufficient training

Accelerated, Compressed, Contextualization and Mainstreaming (CCA, CCD, FRCC and LCC)

Table 2 below summarizes the findings for the accelerated, compressed, contextualized and mainstreaming innovation cluster (for more complete findings, see the full Accelerated, Compressed, Contextualization and Mainstreaming section below).
Table 2. Overview of Accelerated, Compressed, Contextualized and Mainstreaming
• Course Completion Over Control: No Significant Difference
• Term GPA Over Control: Significantly Higher
• Perception of Innovation Quality (Faculty): Better
• Desire to Continue Innovation (Faculty): Yes
• Implemented as Intended (Faculty Perception): Yes

Key Contextual Notes
Positive Developments in Implementation
• Allows students to progress more quickly
• Positively impacts student motivation
• Contributes to an improved academic culture
• Increases student autonomy
• Increases curriculum relevance
• Increases student engagement
• Facilitates learning across subjects
Ongoing Challenges in Implementation
• Students' lack of desire to go faster
• Students' lack of ability
• Complexity of administrative logistics
• Less room to adjust to unforeseen issues
• Finding the appropriate pace
• Students' need for additional support
• Occasional tension between contextual projects and basic content
Start-Up Growing Pains
• Messaging issues
• Insufficient training
• Time constraints

Online Hybrid Courses (CCCOnline and OJC)

Table 3 below summarizes the findings for the online hybrid innovation cluster (for more complete findings, see the full Online Hybrid Courses section below).
Table 3. Overview of Online Hybrid Classes
• Course Completion Over Control: No Significant Difference
• Term GPA Over Control: No Significant Difference
• Perception of Innovation Quality (Faculty): No Data
• Desire to Continue Innovation (Faculty): No Data
• Implemented as Intended (Faculty Perception): No Data

Key Contextual Notes
Positive Developments in Implementation
• Adds a "personal touch" to online courses
• Expands tutoring within CCCOnline
• Awareness was established
• Access was provided
Start-Up Growing Pains
• Insufficient program definition
• Messaging issues
• Lack of integration
• OJCs largely not utilized

Modularization and Diagnostic Assessments (MCC, NJC and PCC)

Table 4 below summarizes the findings for the modularization and diagnostic assessments innovation cluster (for more complete findings, see the full Modularization and Diagnostic Assessments section below).
Table 4. Overview of Modularization and Diagnostic Assessments
• Course Completion Over Control: No Significant Difference
• Term GPA Over Control: No Significant Difference
• Perception of Innovation Quality (Faculty): Better
• Desire to Continue Innovation (Faculty): Yes
• Implemented as Intended (Faculty Perception): Yes

Key Contextual Notes
Positive Developments in Implementation
• Appropriate pace
• Mastery of the subject matter
• Shorter remediation track
• Instant feedback
• Appropriate placement
Challenges in Implementation
• Increased administrative complexity
• Perception that students are "teaching themselves"
• Lack of computer skills
• Diagnostic testing ≠ shorter remediation track
Start-Up Growing Pains
• Messaging issues

Conclusion in Executive Summary

These data go some distance in answering the outcome-related research questions:

• Are students within innovation DE programs more successful (in terms of graduation, retention and GPA) than those in standard DE programs?
It is premature to fully answer this question, but thus far there is not strong evidence to suggest that innovation formats are outperforming traditional formats in terms of retention and GPA. This is not entirely surprising, as these measures are largely long-term measures and CCCS institutions are still in the initial stages of the implementation of these innovations. Additionally, it appears that some innovations provide benefits to students that are not objectively measured by this evaluation.

• Which innovations are the most successful (in terms of graduation, retention and GPA)?
At this point in the evaluation, the accelerated, compressed, contextualized and mainstreaming innovation cluster is outperforming the other innovations in terms of retention and GPA.
Additionally, these data address the following process-related research questions:

• Were the innovations implemented as intended?
Despite some initial hurdles, and with a few exceptions, these innovations are being implemented largely as originally intended.

• What can the colleges and CCCS learn from the implementation of the seven innovations?
The evaluation of the first semester of the implementation of the CICG project has uncovered a variety of important lessons:
  • Messaging is important
  • Appropriate pace ≠ faster pace
  • There are unanticipated benefits to some of these innovations
  • New formats are resource intensive to set up
  • New formats have a learning curve
  • Innovations are not necessarily replacements for a traditional format
Additionally, several potential barriers to retention not related to these innovations emerged as significantly correlated with course completion (though not with respondents' expectations for graduation or transfer).

These findings are preliminary, and it is far too early to make any conclusive judgments about the success of the innovations implemented as part of the CCC project. Such judgments will come later as data are collected over a longer period of time and these innovations mature. However, the data collected to date suggest that these innovations provide a benefit to students and should continue to be implemented. Despite the benefits, however, these innovations are unlikely to be a panacea for the challenges faced by DE.
Introduction and Background

The Colorado Department of Higher Education (CDHE) received a Complete College America (CCA) grant to fund the Completion Innovation Challenge Grant (CICG) project. The CICG project is operated by the Colorado Community College System (CCCS) and seeks to improve college completion rates within CCCS by aligning developmental education (DE) courses with innovative, evidence-based strategies (innovations) and by initiating policy reforms that ensure the state financially rewards institutions that successfully increase the number of college graduates.

CDHE and CCCS contracted with JVA Consulting, LLC (JVA) to act as a third-party evaluator for the innovation portion of this project. This evaluation attempts to answer the following research questions:
• Were the innovations implemented as intended?
• What can the colleges and CCCS learn from the implementation of the seven innovations?
• Are students within innovation DE programs more successful (in terms of graduation, retention and GPA) than those in standard DE programs?
• Which innovations are the most successful (in terms of graduation, retention and GPA)?

This report summarizes the methodology of this evaluation and the findings to date, which include data from the first semester of implementation (spring 2012). A second report will be produced in August of 2013 and will include data from the first three semesters of implementation (spring 2012 through spring 2013). Evaluation will continue beyond the spring of 2013, though at this time it is not entirely clear what form this evaluation will take.[5]

This report is organized around four major sections: (1) Introduction and Background, (2) Methodology, (3) Findings and (4) Conclusion. The Introduction and Background section (this section) introduces the CICG project with a focus on the need for the project, the innovations implemented and the institutions involved. The Methodology section discusses the overall design of the evaluation, each of the data sources, the analysis, limitations of the data and steps taken to protect study participants. The Findings section summarizes the key findings from this study, focusing on four areas: student demographics, student experience, process evaluation (were the innovations implemented as intended?) and outcome evaluation (how successful were the innovations?).

[5] The CCA grant that funds these innovations and their evaluation will not fund third-party evaluation beyond the spring of 2013. However, JVA will work with CCCS to ensure evaluation continues in some form beyond this time.
The Need for the CICG Project

Students referred to DE courses are at risk of failing to complete their degree—under the current circumstances, half will not even complete their developmental sequence.[6] In 2009, 29% of Colorado's college students required remediation in reading, writing or mathematics, and over half (53%) of students attending two-year institutions needed remediation. At current rates, of 100 students enrolled in the lowest level of developmental math, only four will graduate.

In response to this need, the Higher Education Strategic Planning Steering Committee identified remediation redesign as a top priority for Colorado,[7] and the Governor's Office and its partners, the Colorado Commission on Higher Education (CCHE), the Colorado Department of Higher Education (CDHE) and the Colorado Community College System (CCCS), propose to increase the number of college graduates while reducing time to completion by transforming the delivery of DE. Thus, the CICG project is aligned with a larger statewide effort to improve retention among students referred to DE courses.

Innovations

As part of the CICG project, seven innovations in developmental education are being implemented at 12 colleges within the CCCS system:
• Open Entry/Exit Math Labs—open entry/exit math labs offer developmental math courses that allow students to work at their own pace and to test independently, while making math mentors available to students as needed.
• Mainstreaming—mainstreaming refers to an approach that allows students who test at the upper range of developmental education to enroll in college-level courses with one additional credit hour to allow them time to strengthen their foundational skills.
• Accelerated and Compressed—accelerated courses alter the scheduling of developmental education such that students can complete required courses faster than the traditional semester sequence. A compressed format (e.g., five-week courses) is one type of accelerated course, though there are others (e.g., combined formats where 030 and 060 courses are instructed concurrently in the same semester).

[6] Bailey, T., Jeong, D., & Sung-Woo, C. (2009). Referral, enrollment, and completion in developmental education sequences in community colleges. New York: Community College Research Center, Teachers College, Columbia University.
[7] Colorado Department of Higher Education (2010). The degree dividend: Building our economy and preserving our quality of life: Colorado must decide. Colorado's Strategic Plan for Higher Education.
• Contextualization—contextualized courses embed developmental education within the context of program-specific content. Contextualized courses either: (1) relate developmental competencies to career/technical education competencies, or (2) pair developmental education courses with college-level courses.
• Modularization—modularization refers to the reorganization of developmental education courses into distinct stand-alone modules (or mods) that can be taken in a variety of combinations. Currently, modularization is only available for math courses.
• Diagnostic Assessment—diagnostic assessment refers to a pretest used to determine the appropriate placement of students based on the requirements for entrance into their degree program. Currently, diagnostic assessment is being paired with modular math to help determine the appropriate mods for students to ensure they meet the requirements of their degree program.
• Online Hybrid Courses for Developmental Education—these innovations combine elements of traditional formats with online classes. In particular, live tutors are made available to students taking online courses.

Some of these innovations are being implemented as stand-alone innovations, while others are being implemented in combination. Additionally, several innovations closely overlap in practice. While there were not sufficient data available to robustly investigate each institution separately, this study investigates the above innovations in four distinct innovation clusters:
• Open Entry/Exit Math Labs—though the precise meaning of "open" differs among institutions, math labs are implemented consistently enough across CCCS institutions to treat them as a distinct group.
• Accelerated, Compressed, Contextualized and Mainstreaming—based on faculty interviews, it appears that in practice these innovations overlap substantially within CCCS institutions. Thus, while they are technically distinct innovations, they are clustered together for analysis.
• Online Hybrid—though the form of online hybrid courses differs, they are similar enough to be treated as a single entity.
• Modularization and Diagnostic Assessments—one of the three institutions implementing modular math is not using diagnostic assessments. However, these innovations are similar enough to be treated as a single cluster.

Though there is some variation within each of these clusters, for analytical purposes, they are treated as distinct and mutually exclusive sets of innovative strategies. To get a better sense of the variation within each cluster, descriptions of the specific innovation strategies implemented by each institution are presented for each cluster below.
Open Entry/Exit Math Labs

Three institutions implemented open entry/exit math labs as part of the CCC project:
• Arapahoe Community College (open entry/exit math labs)—at Arapahoe Community College (ACC), developmental math courses offered in a math lab format are referred to as FLEX classes. FLEX classes attempt to provide students with the flexibility to decide when and where they work, though the format is not entirely self-paced, as deadlines are provided (but students can work faster if desired). In FLEX classes, students complete their homework online but complete exams on campus. Additionally, students in FLEX courses have access to the FLEX Lab for face-to-face tutoring and support.
• Pikes Peak Community College (open entry math labs)—at Pikes Peak Community College (PPCC), math labs are open entry, but not open exit. This format allows students to work at their own pace and to come to the lab as needed, where they can access tutors and resources such as practice tests, graphing calculators or instructional DVDs. These math labs are also where students go to take their proctored tests.
• Trinidad State Junior College (open entry/exit math labs)—at Trinidad State Junior College (TSJC), math labs provide self-paced instruction incorporating both the MyMathLab program and more traditional paper-and-pencil instruction. Students are provided deadlines to complete their courses, but are able to flex their time within set time blocks.

Accelerated, Compressed, Contextualization and Mainstreaming

Four institutions implemented accelerated, compressed, contextualized and/or mainstreaming efforts as part of the CCC project:
• Community College of Aurora (accelerated, compressed and mainstreaming)—the Community College of Aurora (CCA) provides a form of accelerated and compressed courses in which two developmental math courses are combined into one, allowing students to complete their developmental requirements in fifteen weeks instead of thirty weeks. To support students working at this accelerated pace, CCA provides extra tutoring opportunities and requires students to attend a minimum amount of tutoring. Additionally, CCA is experimenting with some mainstreaming efforts in which students who would normally be assigned to a developmental reading course (REA 090) are able to meet these requirements within a college-level course (BIO 111).
• Community College of Denver (accelerated, compressed, mainstreaming and contextualized)—at the Community College of Denver (CCD), the FastStart program combines accelerated, compressed and mainstreaming approaches to allow students to complete their developmental requirements more quickly. FastStart allows students to complete two levels of classes in a single semester, or to combine higher developmental education courses with college-level courses (mainstreaming). In addition to FastStart, CCD students are able to participate in learning communities where they spend an hour per week with their peers and the instructor. Finally, CCD offers a contextualization option in which students apply the skills they learn in their courses to develop a business plan over the course of the semester.
• Front Range Community College (accelerated and compressed)—Front Range Community College (FRCC) initially intended to engage in mainstreaming efforts, but later shifted to an accelerated and compressed format that it implemented at its Westminster campus. To build on this effort, FRCC will implement what was developed at the Westminster campus at the Longmont campus to allow students from multiple campuses (Longmont, Greeley and Fort Collins) to take advantage of the program.
• Lamar Community College (accelerated and compressed)—at Lamar Community College (LCC), developmental education is offered in a compressed format, which combines two classes into one. This shortens the remediation track and allows students to complete their developmental education requirements more quickly.

Online Hybrid Courses

Two institutions implemented online hybrid courses as part of the CCC project:
• Colorado Community College Online (online hybrid courses)—Colorado Community College Online (CCCOnline) provides online courses for colleges throughout CCCS and, as part of the CICG innovations in developmental education, added additional in-house tutoring services for developmental English and math. This approach is intended to add a personal touch to online courses and, as such, to combine some of the most promising elements of traditional and online courses.
• Otero Junior College (online hybrid courses)—at Otero Junior College (OJC), students in developmental math are able to take advantage of an online hybrid format by combining face-to-face instruction with online tutoring services offered by CCCOnline.

Modularization and Diagnostic Assessments

Three institutions implemented modularization and diagnostic assessments:
• Morgan Community College (diagnostic assessments and math mods)—at Morgan Community College (MCC), students take the ACCUPLACER to identify their appropriate placement within the MyFoundationsLab program. This program provides online activities and assessments, with an interactive guided solution and sample problem for each exercise. This program also provides students with a variety of resources including video lectures, animations and audio files.
• Northeastern Junior College (diagnostic assessments and math mods)—at Northeastern Junior College (NJC), all developmental math has been converted to a modular format. Students take the ACCUPLACER test within the first week of classes to identify which modules are most appropriate for them. Once placed, students complete modules at their own pace, but are provided timelines to guide them through the semester. To advance through the sequence, students have to pass tests. Initially, students had unlimited opportunities to take these tests, but NJC found that this led many of its students to not take the tests seriously. As such, students now have three opportunities to pass each test.
• Pueblo Community College (diagnostic assessments and math mods)—at Pueblo Community College (PCC), students have the option to take their developmental math courses in modules that allow them to work at their own pace utilizing online math software. A diagnostic assessment is used to identify the competency areas in which students have not demonstrated mastery, and the modules that are associated with these areas. Though the course itself is four credit hours, over the course of a semester, students can complete the equivalent of up to 13 credit hours' worth of developmental math coursework.

Methodology

This study has two design components: (1) an outcome evaluation component and (2) a process evaluation component. The outcome evaluation was designed to measure what these innovations accomplished last semester, and to answer the questions:
• Are students within innovation DE programs more successful (in terms of graduation, retention and GPA) than those in standard DE programs?
• Which innovations are the most successful (in terms of graduation, retention and GPA)?

To measure these outcomes, a case-control quasi-experimental[8] design was used. In this design, student performance within innovation courses, measured by institutional data, was compared to the performance of students within control groups. These data were supplemented by a student survey, which provided additional data to ensure the differences observed between innovation and control courses were not the result of other factors.

The process evaluation component was designed to answer the questions:
• Were the innovations implemented as intended?
• What can the colleges and CCCS learn from the implementation of the seven innovations?

To answer these questions, a survey was administered to students to support institutional data derived from the Student Unit Record Data System (SURDS). Additionally, a faculty survey was administered and interviews were conducted with key faculty. The sections below discuss each of these data sources in more detail, how the control groups were constructed, the analyses that were conducted, and the limitations of the study.

[8] Quasi-experimental designs differ from experimental designs in that treatments or interventions are not assigned randomly. In this case, it refers to the fact that students were not randomly assigned to innovation courses.
Data Sources

The data used in this study were gathered from four sources: (1) CCCS institutional data, (2) a student survey, (3) a faculty survey and (4) interviews with faculty. Each of these sources is discussed below.

Institutional Data

JVA worked with CCCS to access institutional data for all students in the study through the Student Unit Record Data System (SURDS). These data included demographics, grades and course completion variables. Student ID numbers were used as unique identifiers to match these data to student survey data (see below). However, to maximize protection of student data, student ID numbers were stripped from the data once the match was made and new identifiers were assigned (a brief illustrative sketch of this matching and de-identification step appears at the end of this Data Sources section).

Student Survey

In partnership with CCCS, JVA designed and administered an electronic survey to all students in the study. This survey was designed to ascertain student satisfaction with DE programming and to identify challenges DE students experience that may act as barriers to graduation. Student ID numbers were used to match these data to the institutional data collected (see above) but were stripped once the match was made. Additionally, electronic informed consent was acquired as part of the survey (see Appendix A for a copy of the survey).

Faculty Survey

JVA also worked with CCCS to administer an electronic survey to DE faculty and staff to ascertain the degree to which faculty and staff members feel each innovation is being implemented as intended, as well as faculty perceptions of the quality of the innovations. Skip logic was used so that respondents were presented with questions tailored to the innovations their institution is implementing. These data are reported in aggregate, and all personal identifiers (i.e., names and email addresses) were stripped from the data. Additionally, electronic informed consent was acquired as part of the survey (see Appendix B).

Faculty Interviews

JVA conducted phone interviews lasting approximately 15–30 minutes with 16 key faculty and staff members to ascertain the degree to which each innovation is being implemented as intended, what is going well and what could be improved upon. These data are reported in aggregate, and to maintain confidentiality, names are not attached to any of the data. Additionally, verbal informed consent was acquired prior to each interview (see Appendix C).
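The matching and de-identification step described above might look roughly like the following sketch. The file names, column names and use of pandas are illustrative assumptions; they are not the actual SURDS extract, survey export or procedure used by JVA.

```python
import pandas as pd

# Hypothetical extracts: institutional records and survey responses keyed by student ID.
institutional = pd.read_csv("surds_extract.csv")   # assumed to include a 'student_id' column
survey = pd.read_csv("student_survey.csv")         # assumed to include a 'student_id' column

# Match survey responses to institutional records on student ID.
merged = institutional.merge(survey, on="student_id", how="left")

# Assign new study-specific identifiers and drop the original student IDs
# so the analysis file no longer contains them.
merged["case_id"] = range(1, len(merged) + 1)
merged = merged.drop(columns=["student_id"])

merged.to_csv("analysis_file_deidentified.csv", index=False)
```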
Control Group

To build control groups, students in traditional-format DE courses were identified and matched by institution and course: for each innovation course, a corresponding traditional course at the same institution was identified. When this was not possible, a course at a similar institution (similar in terms of size and rural/urban location) was identified. This process ensured that, whenever possible, innovation courses were matched to control courses at the same institution, so institutionally specific variables were controlled as much as possible. Finally, within each innovation cluster, control groups were matched to the innovation groups along four demographic factors: (1) gender, (2) ethnicity, (3) race [9] and (4) age. In the findings section below, the relative match between control and innovation groups is identified for each innovation cluster.

[9] In these data, ethnicity is treated as a separate concept from race. Ethnicity consists of Latino/non-Latino, and race consists of five separate racial categories.

Data Analysis

The quantitative data contained within this report (institutional data and survey data) were analyzed using SPSS, a statistical analysis software package. Analyses included descriptive statistics as well as basic inferential statistics, including chi-squared tests, Pearson's correlations, independent-samples t-tests and analysis of variance (ANOVA). General descriptions of these procedures are contained within footnotes to the procedures themselves. (An illustrative sketch of the kind of innovation-versus-control comparison used in this report follows this subsection.)

The qualitative data contained within this report (interview notes and open-ended survey questions) were analyzed using NVivo, a qualitative data analysis software package. Using NVivo, JVA analysts coded the data by source (group) and general themes. These original codes were then reworked (clustered and split) until coherent stand-alone themes were produced.
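The actual analyses were run in SPSS on the SURDS and survey data; the sketch below is only a minimal Python illustration of two of the comparisons named above, an independent-samples t-test on GPA and a chi-squared test on retention, between innovation and control students. The DataFrame and its column names (group, gpa, retained) are hypothetical, not the actual SURDS fields.

```python
import pandas as pd
from scipy import stats

# Hypothetical student-level records: 'group' marks innovation vs. control,
# 'gpa' is the term GPA and 'retained' is 1 if the student re-enrolled.
students = pd.DataFrame({
    "group":    ["innovation", "innovation", "innovation", "control", "control", "control"],
    "gpa":      [2.8, 3.1, 3.3, 2.4, 2.6, 2.9],
    "retained": [1, 1, 1, 0, 1, 0],
})

# Independent-samples t-test comparing GPA between the two groups.
innovation_gpa = students.loc[students["group"] == "innovation", "gpa"]
control_gpa = students.loc[students["group"] == "control", "gpa"]
t_stat, p_value = stats.ttest_ind(innovation_gpa, control_gpa)
print(f"GPA t-test: t = {t_stat:.2f}, p = {p_value:.3f}")

# Chi-squared test on the group-by-retention contingency table.
retention_table = pd.crosstab(students["group"], students["retained"])
chi2, p, dof, expected = stats.chi2_contingency(retention_table)
print(f"Retention chi-squared: chi2 = {chi2:.2f}, p = {p:.3f}")
```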
Study Limitations

Though this evaluation provides valuable information on the CICG program, it has some limitations. Chief among these is the mismatch between the time horizon of the study and the desired outcomes. In particular, college retention is a long-term measure that can only be assessed meaningfully over time. As such, it is simply too early to reach any definite conclusions regarding the impact these innovations have on retention (though preliminary findings are contained within this report). Over time, this limitation will be partially mitigated, as this study will continue in its current form for another 12 months and then continue in a modified form after that. However, data on long-term student retention will not be available for several years, and conclusive data may never become available given the already limited sample size and the relatively large attrition rates experienced by this population.
Another limitation is the ambiguity contained within definitions of these innovations: institutions define and implement the same innovations somewhat differently. This presents a challenge for evaluation by making it more difficult to draw clear distinctions between innovations. The challenge is exacerbated by the relatively small number of students contained within specific innovations at specific institutions. This study has partially overcome both of these challenges by grouping the innovations into similar innovation clusters, thus providing clearly distinct groups with enough cases to conduct statistical analysis.

This approach has, however, introduced an additional limitation. By combining multiple innovations into clusters, the analysis is unable to make distinctions between similar innovations within clusters. For example, though three institutions (ACC, PPCC and TSJC) are implementing open entry/exit math labs, the ways in which they are doing so vary (see the descriptions above). This limitation is particularly stark for the Accelerated, Compressed, Contextualized and Mainstreaming cluster. All of the institutions involved in this cluster engage in some form of accelerated and compressed developmental education, but several include either mainstreaming or contextualization as well. These issues are compounded by the fact that CCCS institutions vary dramatically in size; as a result, rather large portions of some innovation clusters are made up of single institutions. This means that a particular form of an innovation implemented by a particular institution may disproportionately influence the results observed for a particular cluster.

An additional limitation is the generally small sample size for some of the measures. In particular, the samples for data from faculty (the faculty survey and interviews with faculty) are too small to be considered representative of the views of all faculty members. Data for students are less limited, as sample size is not a problem for the institutional data grouped by innovation cluster. However, the sample for the student survey is too small to be considered representative of all students in the study [10]. As such, the generalizability of these findings is somewhat limited, and extreme caution should be taken when extrapolating from them.

Finally, not all of the potential benefits associated with these innovations are measured by this evaluation. As a particularly cogent example, some innovations appear to be increasing the amount students learn by slowing the pace at which they do so (see the Math Labs section below). While the qualitative data included below partially capture this possibility, the extent to which this is actually occurring cannot be determined here, as the data needed to draw such conclusions were not collected as part of this evaluation.
[10] The margin of error for the student survey sample (a sample of 153 from a population of 1,527) is 7.52% at a 95% confidence level. To attain a more generally acceptable margin of error of 5% while retaining a 95% confidence level, a sample of 308 would have been needed.
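As a rough check on the figures in note [10], and not part of the original analysis, the short sketch below reproduces them under the usual survey-sampling assumptions: maximum variability (p = 0.5), a 95% confidence level (z = 1.96) and a finite population correction for the population of 1,527.

```python
import math

def margin_of_error(n, N, z=1.96, p=0.5):
    """Margin of error for a simple random sample with a finite population correction."""
    standard_error = math.sqrt(p * (1 - p) / n)
    fpc = math.sqrt((N - n) / (N - 1))
    return z * standard_error * fpc

def required_sample(N, target_moe=0.05, z=1.96, p=0.5):
    """Sample size needed for a target margin of error, adjusted for population size."""
    n0 = (z ** 2) * p * (1 - p) / target_moe ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / N))

print(round(margin_of_error(153, 1527) * 100, 2))  # about 7.52 (percent)
print(required_sample(1527))                       # about 308
```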
Despite these limitations, this evaluation provides valuable information on the progress made by the CICG project. Though these findings cannot be considered conclusive, they do provide a sense of how the project has progressed and what it has accomplished thus far.

Findings

Findings are presented in six sections below: (1) student demographics, (2) student experience, (3) the math lab innovation cluster, (4) the accelerated, compressed, contextualized and mainstreaming innovation cluster, (5) the online hybrid innovation cluster and (6) the modularization and diagnostic assessment innovation cluster.

Student Demographics

Student demographic data for this study come from two sources: (1) institutional data and (2) the student survey. Data from each of these sources are presented below.

Overall (Institutional Data)

Figure 1 below displays the gender breakdown for the entire sample. As shown, more than half (55%) of the sample is female.

Figure 1. Gender for the Entire Sample (n = 1,527): Male 45%, Female 55%

Table 5 below displays the ethnic breakdown (Latino, not Latino) for the sample. As shown, roughly one-fifth (20.3%) of the sample identifies as Latino. Figure 2 below shows the racial breakdown for the sample.

Table 5. Percentage Latino and Not Latino for Entire Sample (n = 1,527)
  Ethnicity     Percentage of Sample
  Latino        20.3%
  Not Latino    79.7%
As shown below, almost three-fifths (58%) of the sample identifies as white, and just over one-fifth (22%) did not identify with any of the available racial categories. Additionally, the mean age for this sample was 28.06 (SD = 9.676), ranging from 17 to 72 years old.

Figure 2. Race by Group for the Entire Sample (n = 1,527) [bar chart across Asian, Black, Native American, Pacific Islander, White, Mixed and Not Identified; White 58%, Not Identified 22%]

Student Survey Respondents

Figure 3 below displays the gender breakdown for the student survey respondents. As shown, just over two-thirds (70%) of student survey respondents identified as female. This is a higher proportion than for the sample as a whole.

Figure 3. Gender for Survey Data (n = 153): Male 30%, Female 70%

Table 6 below displays the ethnic breakdown (Latino, not Latino) for survey respondents. As shown, just under one-fifth (17.0%) of survey respondents identify as Latino. This is slightly lower than for the sample as a whole.
Table 6. Percentage Latino and Not Latino for Survey Data (n = 153)
  Ethnicity     Percentage of Sample
  Latino        17.0%
  Not Latino    83.0%

Figure 4 below shows the racial breakdown for student survey respondents. As shown, almost two-thirds (65%) of survey respondents identify as white, which is higher than for the sample as a whole. Additionally, 16% did not identify with any of the available racial categories, which is lower than for the sample as a whole.

Figure 4. Race for Student Survey Respondents (n = 153) [bar chart across Asian, Black, Native American, Pacific Islander, White, Mixed and Not Identified; White 65%, Not Identified 16%]

As shown in Table 7 below, the mean age for survey respondents is 31, which is three years older than the average for the sample as a whole. Additionally, survey respondents have an average of almost one child (both for children under 18 generally and for children under 18 living with the respondent).

Table 7. Mean (SD) Age, Number of Children Under 18 and Number of Children Under 18 Living with Respondent for Student Survey Data (n = 153)
  Item                                          Mean (SD)
  Age of respondent                             31.31 (11.26)
  Children under 18                             0.97 (1.35)
  Children under 18 living with respondent      0.99 (1.19)