
The Role of Data in Becoming a High Performing School


This on-demand webinar shares how schools can utilise data to become a high performing school.

Speakers:
- Dr Phil Cummins, Managing Director, CIRCLE. Phil covers the 7 rules of being data driven in your school.
- Dave Vannette, Principal Research Scientist, Qualtrics. Dave covers best practices for survey questionnaire design.


The Role of Data in Becoming a High Performing School

  1. 1. The Role of Data in a High Performing School
  2. 2. Housekeeping The recording and slides for today’s presentation will be made available as soon as possible. Please use the question window to submit questions throughout the webinar. We have set aside time at the end for Q&A.
  3. 3. Webinar Presenters Phil Cummins Managing Director CIRCLE Dave Vannette Principal Research Scientist QUALTRICS
  4. 4. Today’s Agenda •  7 rules of being data driven in your school •  How to engage school community members, analyse data patterns, use data to construct better outcomes for more learners •  Best practices for survey questionnaire design •  K12 survey templates
  5. 5. Webinar Presenters Phil Cummins Managing Director CIRCLE   Dave Vannette Principal Research Scientist QUALTRICS  
  6. 6. CONTEXT: ABOUT US CIRCLE – The Centre for Innovation, Research, Creativity and Leadership in Education –  Working with over 1,750 schools internationally –  An educational agency that equips, empowers and enables schools and school leaders through consultancy and educational services –  Achieving better outcomes for more learners by building cultures of excellence in leadership and learning in communities of inquiry –  Strategic alliances with tertiary bodies (including the University of Tasmania) and professional associations –  Creating educational software solutions for improving school performance including Touchstones Dr Philip SA Cummins phil@circle.education Managing Director, CIRCLE Adjunct Associate Professor, Faculty of Education, University of Tasmania Working in and with schools since 1988 www.circle.education www.mytouchstones.com @CIRCLEcentral
  7. 7. FRAMEWORK FOR 21st CENTURY LEARNING PARTNERSHIP FOR 21st CENTURY SKILLS March 2011
  8. 8. MODEL FOR 21st CENTURY EDUCATION CENTER FOR CURRICULUM REDESIGN January 2016
  9. 9. Data-informed research & practice Teacher performance & professional development Literacy & numeracy benchmarking Continuous improvement ICT in learning benchmarking Standards-referenced curriculum Formative assessment THE INTERNATIONAL EDUCATIONAL LANDSCAPE
  10. 10. 7 PRINCIPLES FOR USING DATA IN SCHOOLS Drawn from our work with over 1,750 schools internationally since the early 1980s
  11. 11. 1. Mission alignment Understand your purpose and concentrate your activity on this goal; don’t spread your resources too widely. 7 PRINCIPLES FOR USING DATA IN SCHOOLS
  12. 12. 7 PRINCIPLES FOR USING DATA IN SCHOOLS 2. Open inquiry Ask good, open-ended questions; don’t expect a particular outcome
  13. 13. 3. Measure what you do Beware of substituting a feeling or perception about a successfully run event or program for real data about long-term impact on practice and performance – most schools never measure the impact of professional learning (PL), especially on teacher capability and student learning 7 PRINCIPLES FOR USING DATA IN SCHOOLS
  14. 14. 4. Dynamic explication and iteration Define your processes, test and iterate; don’t lock things down too soon 7 PRINCIPLES FOR USING DATA IN SCHOOLS
  15. 15. 5. Contextualised interpretation Analyse data by finding patterns that tell the real story; don’t let data speak for itself 7 PRINCIPLES FOR USING DATA IN SCHOOLS
  16. 16. 6. Balanced judgment Temper data with intuition. Educators must be experts in the evaluation of data. John Hattie, Visible Learning, 2009 System 1 thinking (thinking that is fast, intuitive, often unconscious, relying on past association of ideas) complements System 2 thinking (thinking that is slow, conscious, reasoned, effortful and ultimately often lazy) – System 1 thinking is the hero because of its high correlation with the truth. Daniel Kahneman, Thinking, Fast and Slow, 2012 11% of decisions made by marketing professionals are based on data; 16% of decisions are made based on too much data. Harvard Business Review, September 2012 7 PRINCIPLES FOR USING DATA IN SCHOOLS
  17. 17. 7. Collaborative improvement Use the findings to help engage all members of the community to construct better outcomes for more learners 7 PRINCIPLES FOR USING DATA IN SCHOOLS
  18. 18. 1.  Mission alignment 2.  Open inquiry 3.  Dynamic explication and experimentation 4.  Wise measurement 5.  Contextualised interpretation 6.  Balanced judgment 7.  Collaborative improvement What might this look like in your school? What should this feel like in your school? USING DATA IN YOUR SCHOOL
  19. 19. Your school’s data culture: How do I rate my school in each of these? 1 = Below expectation 2 = Meets expectation 3 = Above expectation 1.  Mission alignment 2.  Open inquiry 3.  Dynamic explication and experimentation 4.  Wise measurement 5.  Contextualised interpretation 6.  Balanced judgment 7.  Collaborative improvement Add up your scores and divide by 7. How did you rate yourself? What’s working well? What needs attention? USING DATA IN YOUR SCHOOL
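As a concrete illustration of the self-rating exercise on the slide above, here is a minimal sketch in Python. The seven principles and the 1–3 scale come straight from the slide; the sample scores and variable names are invented purely for illustration.

```python
# Minimal sketch of the self-rating exercise on the slide above.
# The seven principles and the 1-3 scale come from the deck; the
# example scores are invented purely for illustration.

# 1 = Below expectation, 2 = Meets expectation, 3 = Above expectation
scores = {
    "Mission alignment": 2,
    "Open inquiry": 3,
    "Dynamic explication and experimentation": 1,
    "Wise measurement": 2,
    "Contextualised interpretation": 2,
    "Balanced judgment": 3,
    "Collaborative improvement": 1,
}

# Add up your scores and divide by 7, as the slide suggests.
overall = sum(scores.values()) / len(scores)
print(f"Overall data-culture score: {overall:.2f} out of 3")

# Flag the principles rated below expectation as needing attention.
needs_attention = [name for name, score in scores.items() if score == 1]
print("Needs attention:", ", ".join(needs_attention) or "none")
```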
  20. 20. 1.  Mission alignment: Understand your purpose and concentrate your activity on this goal; don’t spread your resources too widely. 2.  Open inquiry: Ask good questions; don’t expect a particular outcome. 3.  Dynamic explication and experimentation: Define your processes, test and iterate; don’t lock things down too soon. 4.  Wise measurement: Use grand school averages and value-added models; avoid benchmarks where possible. 5.  Contextualised interpretation: Analyse data by finding patterns that tell the real story; don’t let data speak for itself. 6.  Balanced judgment: Temper data with intuition. 7.  Collaborative improvement: Use the findings to help engage all members of the community to construct better outcomes for more learners. One thing: •  You know more about •  You feel more confident about •  You might use at your school tomorrow •  You might think about carefully for a long time before using at your school Your takeaways … 7 PRINCIPLES FOR USING DATA IN SCHOOLS
  21. 21. Webinar Presenters Dave Vannette Principal Research Scientist QUALTRICS   Phil Cummins Managing Director CIRCLE  
  22. 22. Writing Good Survey Questions
  23. 23. Writing Good Survey Questions Data quality The cognitive response process Satisficing
  24. 24. Reliability refers to the extent to which our measurement process provides consistent and repeatable results. –  Internal consistency (high inter-item correlation for measures of the same construct) –  Temporal stability (test-retest reliability) DATA QUALITY: RELIABILITY
  25. 25. Validity refers to the extent to which our measurement process is measuring what we intend to be measuring. –  Content validity – how well does your sample of response options reflect the domain of possible responses to the question? –  Criterion-related validity (aka “predictive” or “concurrent” validity) – what is the strength of the empirical relationship between question and criterion (“gold standard”)? –  Construct validity – how closely does the measure “behave” like it should based on established measures or the theory of the underlying construct? –  Face validity – what does the question look like it’s measuring? DATA QUALITY: VALIDITY
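Neither slide prescribes tooling, but as a rough sketch of how these checks are often quantified in practice: internal consistency is commonly summarised with Cronbach's alpha, and criterion-related validity with a simple correlation against the external criterion. Everything below (the item columns, the simulated responses, and the attendance criterion) is hypothetical and only meant to show the calculations.

```python
import numpy as np
import pandas as pd

# Hypothetical export: five items meant to measure one construct (e.g. a
# "sense of belonging" scale scored 1-5), plus an external criterion such
# as attendance. All data here is simulated for illustration only.
rng = np.random.default_rng(0)
latent = rng.normal(0, 1, size=200)  # shared underlying construct
items = pd.DataFrame(
    {f"q{i}": np.clip(np.round(3 + latent + rng.normal(0, 0.8, 200)), 1, 5)
     for i in range(1, 6)}
)
criterion = pd.Series(80 + 5 * latent + rng.normal(0, 3, 200), name="attendance_pct")

def cronbach_alpha(df: pd.DataFrame) -> float:
    """Internal consistency: k/(k-1) * (1 - sum of item variances / variance of total)."""
    k = df.shape[1]
    item_vars = df.var(axis=0, ddof=1).sum()
    total_var = df.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Reliability: do the five items hang together?
print(f"Cronbach's alpha: {cronbach_alpha(items):.2f}")

# Criterion-related validity: does the scale score track the external criterion?
scale_score = items.mean(axis=1)
print(f"Correlation with criterion: {scale_score.corr(criterion):.2f}")
```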
  26. 26. So what is going on in a parent’s or student’s head when they take a survey?
  27. 27. 1.  Understand intent of question. What is meant by the question, as it may differ from the literal interpretation of the words COGNITIVE STEPS IN PROVIDING AN ANSWER
  28. 28. 1.  Understand intent of question. What is meant by the question, as it may differ from the literal interpretation of the words 2.  Search memory for information. Identifying relevant information stored in memory COGNITIVE STEPS IN PROVIDING AN ANSWER
  29. 29. 1.  Understand intent of question. What is meant by the question, as it may differ from the literal interpretation of the words 2.  Search memory for information. Identifying relevant information stored in memory 3.  Integrate information into summary judgment. Synthesizing information from memory and making determinations about knowledge or attitudes COGNITIVE STEPS IN PROVIDING AN ANSWER
  30. 30. 1.  Understand intent of question. What is meant by the question, as it may differ from the literal interpretation of the words 2.  Search memory for information. Identifying relevant information stored in memory 3.  Integrate information into summary judgment. Synthesizing information from memory and making determinations about knowledge or attitudes 4.  Translate judgment onto response alternatives. Formatting the summarized information into an acceptable response based on the available question response options COGNITIVE STEPS IN PROVIDING AN ANSWER
  31. 31. 1.  Understand intent of question. What is meant by the question, as it may differ from the literal interpretation of the words 2.  Search memory for information. Identifying relevant information stored in memory 3.  Integrate information into summary judgment. Synthesizing information from memory and making determinations about knowledge or attitudes 4.  Translate judgment onto response alternatives. Formatting the summarized information into an acceptable response based on the available question response options COGNITIVE STEPS IN PROVIDING AN ANSWER Optimising!
  32. 32. Do you really expect parents and students to optimise for every question?
  33. 33. Shortcutting the optimal response process: Weak Satisficing: Incomplete or biased memory search and/or information integration Strong Satisficing: Skipping memory search and/or information integration altogether and cueing off the question or context for plausible answers SATISFICING
  34. 34. Task difficulty –  Interpretation (e.g., number of words, familiarity of words, multiple definitions) –  Retrieval (e.g., current vs. past state, single vs. multiple objects or dimensions) –  Judgment (e.g., absolute vs. comparative) –  Response selection (e.g., verbal vs. numeric scale labels, familiarity of words, multiple definitions of words) CAUSES OF SATISFICING
  35. 35. Task difficulty Respondent ability –  Cognitive skills –  Experience thinking about the topic –  Pre-consolidated judgments CAUSES OF SATISFICING
  36. 36. Task difficulty Respondent ability Respondent motivation –  Need for cognition –  Accountability –  Personal importance of the topic –  Belief about survey’s importance –  Number of prior questions CAUSES OF SATISFICING
  37. 37. •  Selecting the first reasonable response –  Order of response options can affect answers “How awesome is Qualtrics?” »  Extremely awesome »  Very awesome »  Somewhat awesome »  Slightly awesome »  Not at all awesome –  Visual presentation = primacy (the first reasonable response seen) –  Tip: Randomize the direction of the response scale whenever possible FORMS OF SATISFICING BEHAVIOUR
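Most survey platforms can reverse or randomise option order natively; the sketch below is a platform-neutral illustration of the tip on the slide above. The scale labels are taken from the slide, while the helper function and respondent loop are assumptions made for the example.

```python
import random

# Response options from the slide, in one fixed "extremely ... not at all" order.
scale = [
    "Extremely awesome",
    "Very awesome",
    "Somewhat awesome",
    "Slightly awesome",
    "Not at all awesome",
]

def present_scale(options: list[str], rng: random.Random) -> list[str]:
    """Show the scale in its original or reversed direction with equal probability,
    so primacy effects are spread across the sample rather than always favouring
    the same end of the scale."""
    return options if rng.random() < 0.5 else list(reversed(options))

rng = random.Random()  # in practice, seed per respondent so the order is reproducible
for respondent_id in range(3):
    shown = present_scale(scale, rng)
    print(respondent_id, shown[0], "...", shown[-1])
```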
  38. 38. •  Selecting the first reasonable response •  Agreeing with assertions –  Acquiescence bias •  You may know people that run into this every time they order at Starbucks… FORMS OF SATISFICING BEHAVIOUR
  39. 39. •  Selecting the first reasonable response •  Agreeing with assertions –  Acquiescence bias •  You may know people that run into this every time they order at Starbucks… “Is that with soymilk?” “Yes” FORMS OF SATISFICING BEHAVIOUR
  40. 40. •  Selecting the first reasonable response •  Agreeing with assertions –  Acquiescence bias •  Agree-Disagree (Likert) scales •  True/False •  Yes/No –  Generally avoid any form of these response scales FORMS OF SATISFICING BEHAVIOUR
  41. 41. •  Selecting the first reasonable response •  Agreeing with assertions –  Acquiescence bias •  This can be avoided on every order at Starbucks… FORMS OF SATISFICING BEHAVIOUR
  42. 42. •  Selecting the first reasonable response •  Agreeing with assertions –  Acquiescence bias •  This can be avoided on every order at Starbucks… “Is that with regular or soy milk?” “…yes?” FORMS OF SATISFICING BEHAVIOUR
  43. 43. •  Selecting the first reasonable response •  Agreeing with assertions •  Straightlining •  Worst in matrix/grid question types •  Tip: Avoid any use of matrix or grid questions FORMS OF SATISFICING BEHAVIOUR
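If a matrix or grid question cannot be avoided, straightlining is at least easy to screen for after collection. The sketch below flags respondents who give an identical rating to every grid item (zero row variance); the column names, sample data, and zero-variance rule are illustrative assumptions, not part of the webinar.

```python
import pandas as pd

# Hypothetical export of a 6-item matrix/grid question answered on a 1-5 scale.
responses = pd.DataFrame(
    {
        "grid_1": [4, 3, 5, 2],
        "grid_2": [4, 4, 1, 2],
        "grid_3": [4, 2, 3, 2],
        "grid_4": [4, 5, 2, 2],
        "grid_5": [4, 3, 4, 2],
        "grid_6": [4, 1, 5, 2],
    },
    index=["resp_a", "resp_b", "resp_c", "resp_d"],
)

# A respondent who gives the identical rating to every grid item has zero
# variance across the row -- the classic straightlining signature.
row_variance = responses.var(axis=1, ddof=0)
straightliners = row_variance[row_variance == 0].index.tolist()

print("Flagged for straightlining:", straightliners)  # resp_a and resp_d here
```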
  44. 44. •  Selecting the first reasonable response •  Agreeing with assertions •  Straightlining •  Saying “don’t know” (DK) –  Easier than thinking of an answer –  DK/no opinion is not the same as selecting a neutral or middle alternative •  Tip: Generally avoid DK/no opinion response options FORMS OF SATISFICING BEHAVIOUR
  45. 45. •  Selecting the first reasonable response •  Agreeing with assertions •  Non-differentiation in ratings •  Saying “don’t know”  •  Mental coin-flipping FORMS OF SATISFICING BEHAVIOUR
  46. 46. There are two primary levers that we can operate on to reduce satisficing: 1.  Task difficulty •  Make questions as easy as possible •  Minimise distractions •  Keep the duration short COMBATING SATISFICING
  47. 47. There are two primary levers that we can operate on to reduce satisficing: 1.  Task difficulty •  Make questions as easy as possible •  Minimise distractions •  Keep the duration short 2.  Respondent motivation •  Leverage survey importance •  Keep the duration short •  Use incentives and encouragement to increase engagement COMBATING SATISFICING
  48. 48. 1.  Begin from desired data set and drill down to each individual survey question 2.  Be aware of the cognitive response process – and make it easy 3.  Satisficing is a big threat – don’t enable it with your questionnaires REVIEW
  49. 49. 1.  “Survey Research” by Krosnick (Annual Review of Psychology, 1999) 2.  “The Psychology of Survey Response” by Tourangeau, Rips, & Rasinski (2000) 3.  “The Science of Asking Questions” by Schaeffer & Presser (Annual Review of Sociology, 2003) 4.  “Thinking About Answers” by Sudman, Bradburn, & Schwarz (1996) 5.  “Question and Questionnaire Design” by Krosnick & Presser (in the Handbook of Survey Research, 2010) 6.  “Answering Questions: A Comparison of Survey Satisficing and Mindlessness” by Vannette & Krosnick (The Wiley Blackwell Handbook of Mindfulness, 2014) 7.  “The Palgrave Handbook of Survey Methodology” by Vannette & Krosnick (forthcoming from Palgrave in 2016) FURTHER READING
  50. 50. Q&A
  51. 51. Thank you
  52. 52. Phil Cummins phil@circle.education www.circle.education www.mytouchstones.com @CIRCLEcentral   Dave Vannette davev@qualtrics.com www.qualtrics.com/k12 @Qualtrics   Do you have other questions? Do you want to know more?
