In May 2018 I ran an e-Assessment workshop for members of the Griffith University Assessment Committee.
The topics we covered are listed on slide 2 below.
2. What we will get up to this morning
What do we already understand about digital assessment?
What are our current pain-points?
We will identify where these sit on our assessment lifecycle
Talk through some of the emerging tools and techniques, such as:
Contract cheating and some ways to address it
Digital exams and proctoring: some tools now available
Conditional assessments and marking tools
Looking at what's possible in Office 365 + Blackboard
Use of voice in assessment
Action planning and aligning with our current L&T principles/priorities
Addressing things not covered
michael_sankey
4. Project team
Lead institution
• University of South Australia (Project co-leaders: Tracey Bretag & Rowena Harper)
Partner institutions
• Griffith University (Karen van Haeringen)
• University of NSW (Cath Ellis)
• University of Sydney (Pearl Rozenberg)
• Swansea University, UK (Phil Newton)
Data analyst
• Michael Burton (University of Western Australia)
Project Manager
• Sonia Saddiqui (University of South Australia)
5. Respondents
N = 14,086
• Eight universities from six states – NSW, VIC, QLD, TAS, SA, WA
• 57% Female, 41% Male
• 29% 17-20 years old, 37% 21-25, 12% 26-30, 12% over 30
• 69% Undergraduates, 21% Postgraduate Coursework, 9% Postgraduate Research
• 85% Domestic, 15% International
• 65% Internal students, 26% Blended mode, 9% External (online only)
• 79% English speaking, 21% Language Other than English (LOTE)
• 50% Group of Eight, 50% non-Group of Eight
6. Key findings
Prevalence and nature of contract cheating
• Relatively few students (6%) have engaged in contract cheating
• Many students are sharing their work, and sharing is twice as common
among students who have cheated
• Despite the widespread availability of file-sharing websites and
commercial cheating services, students still primarily engage in
outsourcing with people they know: students, friends, and family
7. Key findings
Individual, contextual and institutional factors correlated with
contract cheating:
• Dissatisfaction with the teaching and learning environment
• Perceptions that there were lots of opportunities to cheat
• Speaking a language other than English at home (for
arranging for someone to assist with or complete an exam)
• Domestic student status (both exam impersonation
behaviours)
8. Key findings
Staff experiences with contract cheating
• Almost 70% of teaching staff have suspected outsourced
assignments at least once
• Knowing the student is an important signal
• Although most referred cases are substantiated and penalised,
many staff do not pursue suspected breaches
• misperceptions that it is ‘impossible to prove’
• not being informed about outcomes
• concerns about penalties
9. Key findings
Staff and student attitudes towards contract cheating
• Staff consider contract cheating to be a serious matter
• Although students tend to believe cheating is ‘wrong’, most
are not concerned that students are engaging in it
• Staff and students alike are ambivalent about note-sharing
10. Key findings
Individual, contextual and institutional factors correlated with
contract cheating
• Departmental and institutional academic integrity policies and
practices are perceived to help minimise contract cheating
• Practical conditions of teaching (workload, contact time, class
sizes) are perceived barriers to minimising contract cheating
• The performance review and reward environment is not
perceived to incentivise minimisation of contract cheating
11. Cheating can be minimised by:
• encouraging students to get more concerned about this problem
• recognising the particular needs of LOTE students
• improving the teaching and learning environment
• reducing opportunities to cheat through course & assessment
design
• supporting educators to know their students
• ensuring processes of breach detection, reporting,
substantiation, penalisation and communication
12. Identifying our assessment pain-points
Individually, identify three key pain-points around
digital assessment that you either currently
experience or are aware of.
Write each one on a sticky note.
As a table, share your points with each other.
Assign a scribe at your table to collect an
agreed set of three pain-points.
Each table then shares these with the room, and
we will place them on the assessment lifecycle.
13. Just a few contenders (+ 20 more)
14. Example - ExamSoft
ExamSoft is an assessment platform currently used at 1,400
institutions worldwide, including The University of Queensland,
Bond and Macquarie, and the likes of Harvard Medicine overseas.
ExamSoft allows large numbers of students to take their exams securely
and offline on their own BYOD laptop or a school-owned device.
The device can be a PC, Mac or iPad.
17. ProctorU
We have online proctoring with ProctorU for the following courses (final exams):
7801AFE Investments
7817AFE Income Tax Law
7823AFE Principles of Business and Corporations Law
7822AFE Applied Taxation
7806AFE Econometric Methods
7810AFE Derivatives and Risk Management
We are currently investigating expanding authentication and security methods to
include 'keystroke analysis' with ProctorU
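As a rough illustration of what keystroke analysis compares, the sketch below computes inter-key timing ("flight time") statistics from a typing sample and checks a new sample against an enrolled profile. This is an invented, minimal example of the general technique, not ProctorU's actual implementation; all function names and the tolerance value are assumptions.

```python
# Hypothetical sketch of keystroke-dynamics matching: summarise the
# rhythm of a typing sample and compare it to an enrolled profile.
from statistics import mean, stdev

def flight_times(timestamps):
    """Gaps (seconds) between consecutive key presses."""
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

def typing_profile(timestamps):
    """Summarise a typing sample as mean/stdev of its flight times."""
    gaps = flight_times(timestamps)
    return {"mean_gap": mean(gaps), "stdev_gap": stdev(gaps)}

def matches_profile(sample, enrolled, tol=0.05):
    """Crude check: is the new sample's average rhythm near the enrolled one?"""
    return abs(sample["mean_gap"] - enrolled["mean_gap"]) <= tol
```

Real systems use many more features (key hold times, digraph latencies) and statistical models rather than a single threshold, but the comparison-against-profile idea is the same.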
23. What a few others are doing
Institution | Policy and practice | Technology
UTS | Exploring BYOD options and designing assessment to motivate students to learn | Blackboard quizzes, Respondus
QUT | Currently exploring options | Currently exploring options
UC | Exploring options to provide online exams and proctoring of remote students | Canvas; reviewing Respondus and Proctorio
UQ | Exploring options via an eExams project | Blackboard quizzes in lab and take-home tests; piloted ExamSoft
Sydney | - | Canvas; piloting Examity
Murdoch | Exploring BYOD | Moodle quizzes, SafeExam Browser extension
UNE | Developing ideas around best practice | ProctorU
Griffith | Proctoring of Examinations Policy; exploring options more broadly | ProctorU
CSU | Operating in pilot mode and looking to review policies and procedures | ProctorU, Blackboard test centre
24. The project aims to securely and sustainably leverage 'bring your own devices'
(BYO laptops) for high stakes, face-to-face invigilated examinations, primarily
for on-campus contexts.
25. 5 online alternatives to exams
Adaptive testing
Testing based on an item bank, pinpointing where learners can improve. When they get a
question wrong, an easier one is generated; when they get one right, a harder one comes up.
On-screen testing
Testing for non-written items, i.e. pictures, diagrams and drag-and-drop tasks.
Randomising questions means there is little danger of students copying each other.
Simulations
Immerse students in a simulation of a real-life scenario and assess how they apply
their knowledge in that context. Often delivered using tablet-based technologies.
e-Portfolio
Allows student work to be verified and graded with remote feedback. Students maintain
an up-to-date representation of their competencies. Ideal for demonstrating practical skills.
Work-readiness
Work placements and reflections used in conjunction with other assessments.
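The adaptive-testing loop described above (wrong answer → easier item, right answer → harder item) can be sketched in a few lines. This is a minimal illustration of the general idea, not any particular product's algorithm; the item bank, difficulty scale and function names are all invented.

```python
# Minimal adaptive-testing loop: a correct answer steps the learner up
# to a harder item, a wrong answer steps them down to an easier one.
import random

def next_difficulty(current, correct, lowest=1, highest=5):
    """Move difficulty up after a correct answer, down after a wrong one."""
    step = 1 if correct else -1
    return min(highest, max(lowest, current + step))

def run_adaptive_test(item_bank, grade, start=3, n_items=5):
    """Serve n_items questions, adapting difficulty to responses.

    item_bank: {difficulty: [questions]}; grade(question) -> bool.
    Returns the list of (difficulty, correct) pairs seen, which also
    pinpoints the level where the learner starts to struggle.
    """
    difficulty, history = start, []
    for _ in range(n_items):
        question = random.choice(item_bank[difficulty])
        correct = grade(question)
        history.append((difficulty, correct))
        difficulty = next_difficulty(difficulty, correct)
    return history
```

Production adaptive tests typically use item response theory to estimate ability rather than a fixed step size, but the serve-score-adjust loop is the same shape.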
26. Some other assessment alternatives
• Advertisement
• Analysis and response to a case study
• Analysis of data or a graph
• Analysis of an event, performance, or work of art
• Annotated bibliography
• Brochure
• Chart, graph, or diagram
• Debate
• Description of a process
• Development of a product or proposal
• Diagram, table, chart, or visual aid
• Diary entry for a real or fictional character
• Executive summary
• Explanation of a multiple-choice answer
• Introduction to a research paper or essay
• Legal brief
• Letter to a friend
• Literature review
• Meaningful paragraph
• Newspaper article or editorial
• Performance (e.g., a presentation)
• Policy memo or executive summary
• Work of art, music, architecture, or sculpture
• Practical exam or evaluation of lab skills
• Poem, play, or dialogue
• Portfolio
• PowerPoint presentation
• Reflection on what they have learned
• Research proposal
• Review of a book, play, or performance
• Scientific abstract
• Poster
27. Our Griffith suite of tools
The Tech Ecosystem
A discovery tool developed by Learning Futures
Used to find information about applications and
technologies to assist in delivering better teaching
To help discover learning and teaching practices,
technologies and strategies to inspire and engage
our learners.
33. Application of artificial intelligence to assessment
Artificial intelligence
the ability of automated systems to perform tasks that until recently
required human or other biological information processing.
+
Machine learning
Algorithmic processes that operate on data sets and are then able to
cluster, classify, recognize, or identify patterns in new data.
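To make the "classify or identify patterns in new data" idea concrete, here is a toy one-nearest-neighbour classifier over labelled short answers, using word overlap (Jaccard similarity). This is an invented illustration of the general technique, not a description of any real grading product; the example data and labels are made up.

```python
# Toy 1-nearest-neighbour classification: label a new short answer
# with the label of the most similar previously-marked example.
def jaccard(a, b):
    """Similarity of two answers as overlap of their word sets."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

def classify(answer, labelled):
    """Return the label of the most similar example.

    labelled: list of (example_text, label) pairs.
    """
    return max(labelled, key=lambda pair: jaccard(answer, pair[0]))[1]
```

Real systems would use richer text features and trained models, but the pattern is the same: learn from marked examples, then apply that to new responses.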
35. The human voice in assessment
Speech recognition allows the voice to serve as the main interface
Originally used for students with disabilities
Both Microsoft and Apple have built-in speech recognition capabilities
Presentation tools (PowerPoint, Prezi), Camtasia, podcast, vodcast, Skype,
audio and video conference, voice threading, collaboration tools (Blackboard
collaborate).
36. Enables more efficient, reliable, and authentic assessment
Speech recognition technologies capture and analyze speech
for words, comprehension, and prosody
For English Language Learners’ spoken responses to short-answer
tasks, speech recognition and text evaluation technologies return
diagnostic and comprehensive measures of language skills (reading,
writing, speaking, and listening)
ML technologies can be applied to any content area, including
science, social studies and math
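Once a speech-recognition service has produced a transcript (the transcription step is assumed here), simple diagnostic measures of the kind described above can be computed over it. The sketch below shows two such measures, speaking rate and keyword coverage; the function names and any thresholds you would apply are assumptions, not part of any named product.

```python
# Illustrative diagnostics over a (pre-transcribed) spoken response:
# words per minute, and coverage of expected content keywords.
def speaking_rate(transcript, duration_seconds):
    """Speaking rate of the response in words per minute."""
    return len(transcript.split()) / duration_seconds * 60

def keyword_coverage(transcript, keywords):
    """Fraction of expected content words present in the response."""
    words = set(transcript.lower().split())
    hits = [k for k in keywords if k.lower() in words]
    return len(hits) / len(keywords)
```

Measures like these feed comprehension and fluency reports; prosody analysis would additionally need the audio signal itself, not just the transcript.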
37. Action planning
If we could influence the assessment committee what would be our
three priorities for digital assessment (moving forward)?
What are we going to do?
What are the barriers?
How could they be overcome?
What resources are we going to use/need?
Who will be responsible for action/s?
When are we going to do it (timelines)?
When can schools expect resources to enable ‘roll out’?