Monitoring student engagement and progress in Moodle
An overview of the tools and functions in Moodle that can be used to monitor students’ engagement and progress as they complete a course’s resources and activities.
15 November 2021
Last updated: 30 November 2021
This toolkit outlines the tools and functions in Moodle that can be used to monitor students’ engagement with a course’s resources and activities.
It complements the toolkit Encouraging student engagement with blended and online learning, which sets out pedagogical strategies to encourage and sustain engagement. The tools outlined below allow enrolled tutors and course administrators to monitor whether these student engagement strategies are having the desired effect(s).
The toolkit also outlines ways to enable students to monitor and review their own progress and engagement with Moodle activities and resources.
This toolkit introduces ways to monitor students’ progress and quickly identify students who may be falling behind or over-engaging. It will help you set up a Moodle course in a way that enables you to answer the following questions:
- What Moodle resources are students accessing or finding useful?
- What is the frequency of access and how much time may students be spending on each item/task?
- What areas do students seem to struggle with?
- Which individuals appear to be disengaged?
- Has an intervention to engage students been successful?
This toolkit is most useful when you are preparing a Moodle course for a new cohort; this allows you to set up monitoring tools in time to capture engagement data from the moment students first access the course.
What we mean by 'engagement' and 'progress'
In the context of teaching and learning, ‘engagement’ can be broadly defined as a set of positive student behaviours. These include:
- attention to and completion of work;
- active involvement in assigned work; and
- similar involvement in their interactions with peers, the teaching team and the wider university community.
By ‘progress’, we mean the sequence of tasks a student does, or engages with, as they move through a course over time. 'Progress’ also refers to what a student may accomplish while on this learning journey, namely:
- evidence of improvement;
- key performance indicators awarded (such as scores); and
- intended learning outcomes and objectives met.
There is significant overlap and interaction between the two terms in this context. For example, a student could be viewed as having ‘engaged’ with a resource on a Moodle course if there is evidence that they opened it. But it is not possible to know whether their knowledge and/or understanding ‘progressed’ as a result without also engaging them in an activity designed to evaluate what they learned from that resource.
How to identify student engagement and progress
We suggest that you ‘design in’ specific activities at various points in the module where you can gather engagement data from your cohort. These should be part of the student journey but can act as an additional ‘temperature check’ to ensure students are participating.
The Moodle tools and functions listed below may flag issues early on, giving you more time to implement interventions that could help struggling students, lead to better learning outcomes or improve student evaluations. Bear in mind that any such interventions take time and may need additional preparation (e.g. new sessions or resources).
Interpreting the data
For each of the monitoring tools described below, we include a note of caution: it is not always easy to interpret engagement data. For example:
- What does it mean when students revisit a resource or activity frequently? Is it useful or confusing?
- Why do some individual students appear to spend excessive amounts of time engaging with a particular aspect of the course?
- What does it mean when students engage with important parts of the course only rarely, or not at all?
The data is most useful as one input to a broader, ongoing conversation we recommend you have with your students about how interesting, relevant and challenging they are finding your Moodle materials and activities.
Tools for monitoring
Click on a Moodle tool below to find out more about what activity it monitors, how it works and tips on interpreting the data it produces.
- 'Activity Completion', 'Restrict Access' and 'Course Completion' tools
What do they monitor?
Staff can:
- Easily get a sense of the extent to which individual students, or the cohort as a whole, are engaging with certain items on the course and hence with the module overall.
- Control students’ access to specific items based on tutor-set criteria, such as a date, or completion of or engagement with another task.
Students can:
- gain a sense of the 'volume' of the course, how much they have completed, and how much is left to go.
- take ownership of the course by checking off items as they progress through it.
How do they work?
- Staff configure the course to display a checkbox beside each selected resource and activity.
- Students can tick a checkbox to indicate task completion or engagement. Conditions can be set so that the activity is automatically 'ticked' as completed once the student has fulfilled the requirements (e.g. creating a forum post and replying to two others; opening an article; responding to a question).
- Students can see a report listing their ticked/completed and remaining tasks on the course.
- Staff can see a report listing enrolled students and the resources and activities they have ticked/completed.
- Staff can also restrict students’ access to resources and activities in line with pre-set criteria. (A sketch of how the staff completion report can be pulled programmatically follows this list.)
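For colleagues comfortable with a little scripting, the completion report can also be pulled programmatically via Moodle’s web services API and summarised outside the platform. The example below is a minimal sketch, not a definitive recipe: it assumes your Moodle administrator has issued a web service token with permission to call core_enrol_get_enrolled_users and core_completion_get_activities_completion_status, and the site URL, token and course id shown are placeholders. Field names can vary between Moodle versions, so check your site’s API documentation.

```python
# Minimal sketch: summarise Activity Completion per student via Moodle web services.
# Assumptions: a valid web service token, and access to the two functions named below.
# MOODLE_URL, TOKEN and COURSE_ID are placeholders to replace with your own values.
import requests

MOODLE_URL = "https://moodle.example.ac.uk"
TOKEN = "your-web-service-token"
COURSE_ID = 1234

def ws_call(function, **params):
    """Call a Moodle web service function over the REST protocol and return the JSON reply."""
    payload = {"wstoken": TOKEN, "wsfunction": function, "moodlewsrestformat": "json"}
    payload.update(params)
    response = requests.get(f"{MOODLE_URL}/webservice/rest/server.php", params=payload)
    response.raise_for_status()
    return response.json()

# List everyone enrolled on the course, then fetch each student's completion statuses.
students = ws_call("core_enrol_get_enrolled_users", courseid=COURSE_ID)
for student in students:
    result = ws_call("core_completion_get_activities_completion_status",
                     courseid=COURSE_ID, userid=student["id"])
    statuses = result.get("statuses", [])
    completed = sum(1 for s in statuses if s.get("state", 0) > 0)  # 0 means incomplete
    print(f'{student["fullname"]}: {completed}/{len(statuses)} tracked items completed')
```

A summary like this makes it easy to sort the cohort by completion count and spot students who may need a prompt, but the caveats about interpreting the data below still apply.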
Interpreting the data
Although these tools can identify students who appear not to be engaging with certain resources or activities, or who may be ‘gaming’ the system (e.g. by ticking tasks well in advance of them being covered), the simple Activity completion report on its own cannot confirm that specific students are disengaged, nor explain the reasons behind any such disengagement. Treat the report as a data point to start or facilitate an individual or group discussion with your student(s). You could build on the data by using:
- individual emails, e.g. using Quickmail.
- a quick ‘temperature check’ survey of some or all of the cohort using the Feedback tool.
- a conversation in class or, where appropriate, a forum that facilitates anonymous posts. (See: Discussion Forums > How do I…? Set up an Anonymous Forum).
It is important to be transparent in the use of learning analytics of this kind. Let students know at the outset that you can see whether or not items have been ticked. Some students may therefore opt not to use this system, but this should not be taken as evidence of disengagement.
User guides
- 'Checklist' tool
What does it monitor?
- Checklists can remind students of things to do or include when preparing for an action or piece of work, e.g. a placement, seeking ethical approval, fieldwork, or an assessment.
- They can increase the likelihood of students completing the listed items.
- Staff can see what items individual students have ticked and/or added.
How does it work?
- Staff can add Checklists to a course in which students can tick off listed items.
- Checklists can be configured so students can add their own items. Useful for research modules, or spaces in which a sense of student ownership is beneficial.
- The report displays a table listing all the students enrolled on a course in the first column, and all the items on the 'Checklist' presented as column headings across the first row.
Interpreting the data
This tool allows staff more granular oversight of individual students. Staff can instantly see how their class is progressing with items on the Checklist. (See the Adding and managing checklist items section in the Checklist mini guide.) However, students can choose whether, and to what extent, to use this tool, so material evidence of engagement with an item listed in the Checklist (e.g. an assignment submission or a forum post) may be a more reliable barometer of engagement with that action.
User guide
- 'Group choice' tool
What does it monitor?
- Gives students the freedom to choose a group and the potential to switch groups. This can provide an opportunity to see if students are sufficiently engaged to join a group and consequently to engage in group activities.
- Groups don’t have to be just for formal group work or projects. They could have a social aspect, e.g. encouraging students to form mutually supportive study groups.
How does it work?
- Staff can create empty groups and students can choose which one they want to join.
- Staff can set limits on the number of students who can join each group.
- Access to activities and resources can be restricted by group membership.
Interpreting the data
Any response indicates a degree of engagement on the part of respondents. However, you may be able to read more into students’ group choice or changes to their choice based on your knowledge of the students.
Where students have not joined a group, it is possible to prompt them via individual emails (N.B. Quickmail could be used to do this).
User guide
- 'Hot question' tool
What does it monitor?
- It allows students to actively engage in class-wide discussions by asking questions or making comments anonymously.
How does it work?
- When staff add a Hot Questions activity to their course(s), students can contribute questions/topics and/or vote for their peers’ contribution(s).
- The more votes a question gets the ‘hotter’ it becomes and the higher it rises in the list.
Interpreting the data
You can clearly see what students want to know or what they think by the questions or thoughts they post in the Hot Question tool. In addition, students can get an idea of how many of their peers would like to know the answers to these questions or concur with their comments from the number of votes they get.
Whereas anonymous posts may deny staff the information they need to reach out to individual students, facilitating anonymous engagement may increase participation by reticent students.
User guide
- 'Feedback' tool
What does it monitor?
You can take occasional “temperature checks” throughout your module, including on how students rate their own engagement with the course.
How does it work?
Staff create a Feedback survey on their course using questions they compose. (N.B. This tool is not typically used for centrally managed module evaluations. See Online Student Evaluation Questionnaires and Harmonising module evaluation for more on Student Module Evaluations).
Interpreting the data
Staff can see both individual responses in detail, and a graphical (and text) display of aggregated responses.
Engagement can be gauged both by the number of respondents and by the responses given. The effectiveness of this tool for understanding the whys and wherefores of students’ engagement along a particular learning journey therefore depends largely on the questions you compose.
User guide
- 'Choice' tool
What does it monitor?
- Can be used to ask students subject-related or administrative questions. E.g. rather than asking students to reply by email to a question with pre-set responses, ask them to complete a Choice poll.
- It is easier for students to click a link and select a response in a Choice than to write an email reply.
- Staff don’t have to collate email responses and figure out who has yet to reply.
How does it work?
- Staff can pose a question or poll (in the form of an MCQ) and students can select one or more of the choices or options given.
- Staff can configure the Choice tool in a variety of ways, including ones that determine whether staff and/or students can see the identities of participants. (A sketch of how Choice results can be retrieved programmatically follows this list.)
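If you want to collate Choice responses outside Moodle, for instance across several courses, the results can also be retrieved via the web services API. The snippet below is a minimal sketch under the same assumptions as before: a web service token with access to mod_choice_get_choice_results, placeholder values for the site URL, token and Choice activity id, and field names that may differ slightly between Moodle versions.

```python
# Minimal sketch: fetch the results of a single Choice activity via Moodle web services.
# MOODLE_URL, TOKEN and CHOICE_ID are placeholders; field names may vary by Moodle version.
import requests

MOODLE_URL = "https://moodle.example.ac.uk"
TOKEN = "your-web-service-token"
CHOICE_ID = 5678  # the id of the Choice activity

response = requests.get(
    f"{MOODLE_URL}/webservice/rest/server.php",
    params={"wstoken": TOKEN, "wsfunction": "mod_choice_get_choice_results",
            "moodlewsrestformat": "json", "choiceid": CHOICE_ID},
)
response.raise_for_status()

for option in response.json().get("options", []):
    # Each option is expected to carry its text and a count of respondents.
    print(option.get("text"), "-", option.get("numberofuser", "?"), "responses")
```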
Interpreting the data
Interpreting the data generated by the Choice tool depends on how it has been used or the question students have been asked to answer. For example, it may be used:
- as a quick poll to stimulate thinking about a topic;
- to facilitate student decision-making, for example allowing students to vote on a direction for the course; or
- to quickly test students' understanding.
In these cases, any response indicates a degree of engagement on the part of respondents. However, the author of the poll may be able to further evaluate the ‘quality’ of students’ engagement by their specific choice(s).
Where students have not participated, it is possible to prompt them via individual emails (N.B. Quickmail could be used to do this). Alternatively, a quick ‘temperature check’ survey of these students could be run using the Feedback tool to sensitively explore their lack of engagement. In some cases, a conversation in class or a forum that facilitates anonymous posts may be appropriate. (See: Discussion Forums > How do I…? Set up an Anonymous Forum).
User guide
- 'Moodle Reports' tool
What does it monitor?
By regularly reviewing Moodle Reports you can:
- gauge the relative popularity of resources
- see the pattern of student usage
- get a sense of how and when students engage across the course.
How does it work?
The most useful reports for this purpose are Logs (including Live logs), Activity reports and Course participation reports. See the UCL Moodle wiki for information on how to access and use them.
Logs – are useful for displaying lists of all the actions one, or all, students have undertaken in this course within the past year. They are often used to see if or when students submitted work or engaged with an activity or resource. (A sketch of how an exported log can be summarised follows these report descriptions.)
Activity reports – show the items in a course on a single page, sorted by topic/week number. Each item in the report displays with the:
- number of times it has been viewed,
- number of viewers,
- date and time it was last accessed,
- number of days and hours since last access.
Course participation reports – will show all actions for selected activities and resources on your course.
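The standard log can also be downloaded from the Logs report (e.g. as a CSV file) and summarised outside Moodle. Below is a minimal sketch, assuming a hypothetical export called course_log.csv; the column names used ('Time', 'User full name', 'Event context') reflect a typical recent Moodle export but may differ on your site, so adjust them to match your file.

```python
# Minimal sketch: summarise an exported Moodle log (CSV) with pandas.
# Assumes a file 'course_log.csv' downloaded from the Logs report; the column names
# ('Time', 'User full name', 'Event context') may differ by Moodle version or language.
import pandas as pd

log = pd.read_csv("course_log.csv")

# How often has each resource/activity been accessed? (count of log events per item)
views_per_item = log["Event context"].value_counts()
print("Most-accessed items:\n", views_per_item.head(10))

# When did each student last do anything on the course? (useful for spotting inactivity)
log["Time"] = pd.to_datetime(log["Time"], dayfirst=True, errors="coerce")
last_access = log.groupby("User full name")["Time"].max().sort_values()
print("\nLeast recently active students:\n", last_access.head(10))
```

Counting events per item approximates the view counts shown in the Activity report, while the last-access summary helps surface students who have not logged any recent activity.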
Interpreting the data
Although these tools can readily identify students who are recorded as having viewed resources or activities, these simple reports may not, on their own, convey how ‘meaningful’ a student’s engagement was. As in the case of 'Activity Completion' data, they should be viewed as a starting point to facilitate an individual or group discussion with your student(s) about their level of engagement.
In the interests of transparency, it is important to let students know at the outset that you can view these engagement monitoring tools.
User guide
Further help
- Contact any of your education support named contacts for advice on designing, structuring and delivering your course:
- Your Faculty Learning Technology Lead (FLTL)
- Your Department’s Learning Technologist
- Your School’s Digital Education Advisor (DEA)
- Useful UCL resources include:
- The Connected Learning Baseline: module leaders should consult it and align their course(s) with it.
- ABC Learning Design: a high-speed workshop that results in a collaborative course/programme design. Check for upcoming events.
- Encouraging student engagement with blended and online learning [Teaching Toolkit]
- UCL Moodle Staff Guide
References and further reading
Ahern, S. (2020) Moodle: How do students use yours? [online]. Available from: https://blogs.ucl.ac.uk/digital-education/2020/04/15/moodle-how-do-students-use-yours/ (Accessed 23 August 2021).
Center for Teaching, Vanderbilt University (n.d.) Classroom Assessment Techniques (CATs) [online]. Available from: https://cft.vanderbilt.edu/guides-sub-pages/cats/ (Accessed 23 August 2021).
Digital Education UCL (2021) UCL Connected Learning Baseline [online]. Available from: https://www.ucl.ac.uk/teaching-learning/publications/2021/sep/ucl-connected-learning-baseline (Accessed 23 August 2021).