Incorporating online resources, such as Smart Sparrow tutorials, into teaching opens up access to a suite of learning analytics. Analytics can be used to interpret student understanding and performance in order to improve the resource or to provide better support for students. Beyond the obvious difficulty of finding the time to work through this data, a further challenge is identifying which analytics to use when interpreting it.

Dr Justine Gibson and her colleagues Helen Owen, Steven Kopp, John Wright, Katrina Garrett, Ann Thompson, Ricardo Soares Magalhaes and Frances Shapter at The University of Queensland have recently completed a large body of work funded by a UQ Technology-Enhanced Learning Grant that involved developing a series of tutorials across their Veterinary Science program. The tutorials were case-based and covered topics relevant across the first-year undergraduate program, before building upon these cases to incorporate more advanced material for later-year students (termed vertical and horizontal integration). The project has yielded an excellent detailed guideline for those wishing to develop online education resources, either with a team or by themselves. For more information see a draft version of the final report here.

One of the focus areas of the report was the importance of reviewing the analytics after deploying the tutorials to students. The following notes, developed and shared with BEST by Dr Justine Gibson and colleagues, highlight how to access the analytics in Smart Sparrow and which elements to focus on.

Accessing the analytics

The analytics from each lesson deployment (or ‘activation’) can be accessed via the Smart Sparrow system. To access the analytics navigate to the desired lesson and click the ‘analytics’ tab. Three different categories of analytic information are accessible from this screen: overview, question explorer, and student results.

Guidelines for interpreting the analytics

  1. First review the median time that students spent on the lesson and determine whether it is appropriate for the given course. The length of the lesson may need to be adjusted if it is not.

Example overview analytics

  2. If >15% of the students required to complete the lesson have not completed it within the given time frame, investigate the reasons for non-completion.

  3. If >10% of students start but do not finish the lesson, determine at which point they stopped. If a number of students all stopped at a similar question or section, investigate whether the lesson requires altering to ensure students can progress; it could be, for example, that an error with a trap state is preventing them from moving on.

  4. Assess the proportion of questions with an average of more than two attempts; these should not comprise more than 50% of the questions in the lesson. Review the difficulty of such questions and decide whether the lesson or other course content requires altering to reduce the number of attempts they take.

  5. Questions where >50% of students were incorrect on all attempts (i.e. they were given the answer and moved on) should be reviewed for difficulty, with a decision made as to whether the lesson or other course content requires altering to improve student performance on them.

  6. For lessons that contain remedial pathways, analyse the number of students who followed that path. If >25% of students follow the remedial path, review whether the lesson's level of difficulty is appropriate for the coursework.
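The thresholds above can be collected into a single review pass over a lesson's analytics. The sketch below applies them to a hypothetical per-lesson summary; the data structure and field names are illustrative assumptions for this example, not Smart Sparrow's actual export format.

```python
# Apply the guideline thresholds to a hypothetical lesson summary.
# Field names ("enrolled", "not_completed", etc.) are assumptions for
# illustration only; adapt them to however your analytics are exported.

def flag_lesson(summary):
    """Return a list of review flags based on the guideline thresholds."""
    enrolled = summary["enrolled"]
    questions = summary["questions"]
    flags = []

    # >15% of required students have not completed the lesson in time
    if summary["not_completed"] / enrolled > 0.15:
        flags.append("investigate reasons for non-completion")

    # >10% started but stopped part-way: locate the common drop-off point
    if summary["dropped_out"] / enrolled > 0.10:
        flags.append("locate common drop-off question or section")

    # questions averaging >2 attempts should not exceed 50% of all questions
    hard = [q for q in questions if q["mean_attempts"] > 2]
    if len(hard) / len(questions) > 0.50:
        flags.append("review difficulty: too many high-attempt questions")

    # any question where >50% of students were incorrect on all attempts
    for q in questions:
        if q["never_correct"] / enrolled > 0.50:
            flags.append(f"review question {q['id']}: high never-correct rate")

    # >25% of students followed the remedial pathway
    if summary.get("remedial", 0) / enrolled > 0.25:
        flags.append("review remedial pathway: lesson may be too difficult")

    return flags
```

A run over a made-up cohort of 100 students with two questions would then produce one flag per breached threshold, giving a quick checklist of which of the five guidelines need attention for that deployment.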

Summary of analytics to investigate


Dr Gibson has offered to make the Word document templates developed for the project available to anyone interested. These include the storyboard template and standard lesson template. Contact Justine by email for the templates or with any queries regarding the project.

This work was previously discussed on the BEST blog here and was presented at the 2018 Australian Veterinary Association (AVA) conference held in Brisbane, Australia on 13-18 May (see the Abstract for further details).