I work with organisations and institutions to undertake targeted research to investigate specific online education issues being experienced by their educators or students. Results from these diverse quantitative and qualitative projects are used to inform the development and testing of online education strategies aimed at achieving specific outcomes.
A few select case study examples are described below.
If your team would like to partner with me on an original research project aimed at better meeting the needs of your online educators and students, please get in contact.
Project: Video content representing wasted time and resources
Result: Very high student watch rates and significantly improved satisfaction
The partner organisation for this project had observed that despite having invested heavily in the development of video content for its three largest programs, learning management system data revealed that students were not watching those videos. Furthermore, anecdotal evidence from the marking team suggested that students were consequently missing critical learning information.
This project utilised a mix of methods: measuring video click rates; analysing turn-off rates, including trends by content type and duration; tracking look-away rates (that is, students ostensibly watching a video while their attention was actually elsewhere, such as scrolling social media on another device); and running focus groups in which students reasoned aloud as they decided whether to play, skip, or turn off videos.
Insights from this work informed subsequent trials of how videos were framed within the learning materials, the duration of individual videos, and the nature of individual and series video content. From those trials, a set of guidelines was developed, and implementation results continue to be monitored, including through follow-up student focus groups. Core content videos at that organisation now achieve >90% student watch rates, students report significantly improved satisfaction with the video content, and significantly improved assessment results indicate better learning outcomes.
The subsequently refined guidelines from this project inform parts of my webinar Guidelines for making video content that students will actually watch (see here).
Project: Students attempting assessments without first engaging with key learning content elements
Result: More students clicking more content more often, improved learning outcomes
The partner organisation for this project had observed that a large proportion of their students were submitting sub-par assessments which required extensive rework for resubmission. The workload on educators to provide detailed feedback to guide those resubmissions was problematic. Furthermore, students were frustrated at having to re-do assessments multiple times.
The first stage of this project quantitatively measured what learning content was and wasn’t being engaged with (e.g. the clicking on / opening up of self-paced items, and attendance at real-time interventions). Student focus groups then revealed student thought processes as they scrolled and clicked their way through online learning content. This revealed several critical ‘why’ factors behind student decision-making.
The second stage of this project tested a range of framing and communication methods for different types of online learning materials, content, and interventions.
Insights from this work informed subsequent trials around different types of learning content with a particular focus on how each element was presented (described) on its own and within overarching study guides.
The strategies developed from this work were then trialled and refined at multiple sites. Each site reported significantly higher rates of student engagement – that is, more students clicking on more content more often. Although not formally part of this project, student satisfaction surveys at the end of the trial semesters also revealed improved overall outcomes.
The learnings from this project inform the content of my webinar Click-worthy learning content (see here).
Project: Educators frustrated by lack of student engagement and attendance at live online sessions
Result: Consistently well-attended and effective sessions with high student satisfaction
The partner organisation for this project had observed two main issues. First, time and resources were being wasted hosting poorly attended live online sessions. Second, educators reported feeling frustrated and exhausted by the hit-and-miss nature of live online sessions.
The project started with surveys and then interviews to identify (a) what students like and dislike about live online sessions, (b) why they choose to attend or skip different live online sessions, and (c) what they perceive the ideal purpose of live online sessions should be. The project then included observations of live sessions and debriefings with student focus groups.
The findings from those investigations were used to develop strategies around live online session purpose, framing, and facilitation. Trialled and refined at multiple locations, some strategies were found to be very contextual to the topic or student cohort at hand. However, some strategies were found to be universally effective. Educators reported they felt less drained when hosting sessions, that students were attending in high numbers and actively engaging, and that learning outcomes were improved. Students reported significantly higher satisfaction with the nature of the available live online sessions and the learning opportunities these sessions facilitated.
The strategies from this project that were found to be effective across multiple sites and cohorts feature in my webinar Thriving live online (see here).
Project: Students and educators each reporting that the other group doesn't communicate
Result: New, highly impactful framework for writing about-me educator introductions
The partner organisation for this project had identified from student feedback surveys that students perceived the educators were not approachable or available. The organisation was disappointed with the low grades being achieved by students, and had observed high drop-out rates.
Introductory focus groups with students revealed frustration that they couldn’t see who their educator would be prior to semester start. Students also reported placing a high level of importance on educator about-me pages or descriptions.
Extensive quantitative research using learning management system data revealed that one of the first pieces of content or information proactively engaged with by students was the educator about-me page (sometimes called a contact page, bio, self-introduction, etc).
Subsequent investigation using surveys and reason-aloud interview techniques revealed that trust, willingness-to-follow, and optimism-for-learning indicators were strongly affected by how the student perceived the educator’s credibility and teaching style, perceptions that were wholly informed by that about-me introduction. Students reported that these factors affected their learning actions, time spent, and degree of effort.
After analysing different styles of about-me introductions that had yielded different student reactions, a communications expert was consulted. They guided the development of a framework for writing an effective about-me introduction. This framework was taught to, and implemented by, a large group of educators, and follow-up work to capture student perceptions and actual learning behaviours enabled the framework to be further refined.
That framework forms the basis of my webinar Don’t let your ‘about me’ educator intro block student engagement (see here).