After creating and deploying a Smart Sparrow online lesson, the next step is to review the learning analytics. How you interpret the analytics will depend on the type of lesson you’ve created; for example, the analytics appropriate for a formative assessment differ from those for an online exam. Previously, BEST member Dr Justine Gibson shared her guidelines for analysing learning analytics to help our members identify what to look for. To complement those guidelines, we’ve created a video walkthrough of the Analytics tab that guides you through exploring your lesson’s learning analytics and identifying potential issues with your lesson. A second, shorter video covers analytics for image-based questions created using the Annotate Plugin.


Reviewing Smart Sparrow Learning Analytics

The video covers:

  1. What each Filter option is used for
  2. How to judge lesson completion rates
  3. Interpreting the “adaptive feedback in use” value
  4. Judging whether too many of your students need multiple attempts to answer a question correctly
  5. An overview of the Solution Trace Graph
  6. Identifying misconceptions that may warrant more teaching time
  7. Spotting screens that may have problems
  8. Exporting data for detailed review

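The exported data (point 8) can also be analysed outside the platform. As a minimal sketch, assuming the export is a CSV file, here is one way to flag questions where students averaged an unusually high number of attempts; the column names (`student_id`, `question_id`, `attempts`) and the sample data are hypothetical, so adapt them to the layout of your actual export:

```python
import csv
import io
from collections import defaultdict

# Hypothetical sample of an exported analytics file; a real export
# will have its own column names and layout.
SAMPLE_CSV = """student_id,question_id,attempts
s1,q1,1
s2,q1,3
s3,q1,4
s1,q2,1
s2,q2,2
s3,q2,1
"""

def attempt_summary(csv_text, threshold=2.0):
    """Return per-question average attempts, flagging questions whose
    average exceeds `threshold` (a possible misconception to revisit
    in class, or a question that may need clearer wording)."""
    attempts = defaultdict(list)
    for row in csv.DictReader(io.StringIO(csv_text)):
        attempts[row["question_id"]].append(int(row["attempts"]))
    summary = {}
    for qid, vals in attempts.items():
        avg = sum(vals) / len(vals)
        summary[qid] = {"avg_attempts": round(avg, 2), "flagged": avg > threshold}
    return summary

print(attempt_summary(SAMPLE_CSV))
# q1 averages 2.67 attempts and is flagged; q2 averages 1.33 and is not.
```

The threshold of 2.0 attempts is an arbitrary starting point; set it to whatever suits your cohort and question difficulty.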

The Annotate plugin allows image-based questions to be created on Slice images (see the video for question set-up). These questions have their own learning analytics. This shorter companion video walks through interpreting the image heat map that accompanies the other platform analytics.


Learning analytics for image-based questions


Watch the other help videos we’ve shared on our YouTube channel. If you have any questions about the tips shared here for reviewing learning analytics, you can contact me at s.dowdell@best.edu.au; for more detailed support tailored to your lesson, contact support@smartsparrow.com.