Best Practices
Recommended prior reading material: Course Activity Trends.
Taking a closer look at the Activity Trends graph can shed light on how users learn over time.
In the above graph we can focus on the active actions users made in the course, such as comments and replies (collaboration), and see how these actions increased engagement - not only on the days the discussion took place, but also in the following days (as users came back to the video, read the comments and replies, clicked on the time tags, added notes, etc.).
In the User View we can use the Activity Trends graph to learn about a specific user - their activity throughout the course and their learning habits. We can even see how their learning advanced in the course over time.
The Activity Trends graph helps us identify learning patterns by highlighting dates with extremely high activity (before, during, or right after a lecture? before an exam?) and dates with low activity (semester break?). If we spot low activity when it was not expected, a comment on some videos, an assignment, or a task may re-inspire learners to become active.
Let's take this user's activity for example:
From this user's activity trends graph, we can see that about two weeks after the course began (around Aug 15th), he increased his level of participation - adding comments (green) and taking notes (orange) on a regular basis. During the course break (Oct 1st - Oct 15th), Hen's activity decreased a little, but did not stop completely. Towards the end of the semester (starting in the second half of November), his level of collaboration decreased, yet he remained engaged until the end of the semester (end of November).
Recommended prior reading material: How Engagement is measured.
A user's engagement indicates active learning, meaning the user did not only watch the video but actually had a meaningful interaction with it - either by creating content and/or by consuming it.
In the above example of User View (Gili Cohen's activity in "Demo" course):
The course includes 22 videos
Gili watched 4 of them – 123 times (for all 4 videos combined)
She completed 12% of the total course content (all 22 videos combined).
In fact, each time she played a video she watched only a small portion of it - an Average Completion Rate of 10% (see the sketch after this list).
Gili received 3 Educator's Thumbs Up for the 10 comments and replies she wrote (30% is not bad).
She has an Engagement score of 102, indicating highly meaningful activity in the course.
3 of her comments were replies to others, which indicates collaboration and knowledge sharing.
She liked 9 comments written by others.
Finally, she wrote 12 personal notes, which can be characteristic of learning (summarising, bookmarking, etc.).
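To make the completion numbers above concrete, here is a minimal sketch of how the two completion metrics could be derived from per-view data. This is our own illustration, not Annoto's actual formula; the sample data and variable names are made up.

```python
# A minimal sketch (not Annoto's actual formula) of the two completion metrics.
# Assumption: each view records the fraction of the video watched during that play.
views = [
    # (video_id, fraction_of_the_video_watched_in_this_view)
    ("v1", 0.15), ("v1", 0.05), ("v2", 0.10), ("v3", 0.08), ("v4", 0.12),
]
videos_in_course = 22

# Average Completion Rate: the mean portion watched per play
avg_completion_rate = sum(frac for _, frac in views) / len(views)

# Total Completion (rough approximation): the best coverage reached per video,
# averaged over ALL course videos, including the ones never played
best_per_video = {}
for vid, frac in views:
    best_per_video[vid] = max(best_per_video.get(vid, 0.0), frac)
total_completion = sum(best_per_video.values()) / videos_in_course

print(f"Average Completion Rate: {avg_completion_rate:.0%}")  # ~10%
print(f"Total Completion: {total_completion:.1%}")            # ~2%
```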
Recommended prior reading material: Course Videos.
One of the greatest features of the tables in the dashboard is the fact that the columns can be sorted.
This way we can get a better understanding of “the top...” well.. everything!
Sorting the Course Videos table by the Collaboration column will place the videos with the most collaboration at the top. A high level of collaboration might be an indicator that the video requires additional attention:
It might be extremely interesting and have sparked a vigorous discussion among the learners. In this case you might be interested in taking part in the discussion as well.
There might be a part in which users (all of them, or specific learners) are struggling, and it may be helpful to offer your help, provide a different explanation, or share examples.
Sort by Engagement to see which videos got the most “attention” from learners. Note that the most engaging video is not necessarily the one with the most collaboration, meaning: the content of the discussion is what increased the engagement (and not the number of comments) - it might be interesting to take a deeper look at this video.
Sorting the table by the “Users” column will order the videos by the number of users who watched them. It may be interesting to see which videos were not viewed by many users - is it because the video was added at the end of the semester? It's worth investigating further; maybe the video has a high drop rate (see Sort by Average Completion Rate below).
Videos with a low Avg. Completion Rate mean users watched only a little of the video each time they viewed it. It might be worthwhile to understand why - was the video hard to understand? Was it not engaging enough? Was it too long?
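The same kind of sorting can also be done offline on the exported Videos CSV (see Exporting Course Data below). A hedged sketch, assuming pandas and columns named roughly like the dashboard ones ("Collaboration", "Avg. Completion Rate"); check the real headers in your own export.

```python
import pandas as pd

# Load the exported Videos CSV (file and column names are assumptions).
videos = pd.read_csv("Videos.csv")

# Videos with the most collaboration first - candidates for joining the discussion.
by_collaboration = videos.sort_values("Collaboration", ascending=False)

# Videos with the lowest average completion rate first - candidates for review.
by_completion = videos.sort_values("Avg. Completion Rate", ascending=True)

print(by_collaboration.head())
print(by_completion.head())
```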
Recommended prior reading material: Course Users.
Sorting within the users table also means filtering. Use this feature to get a better idea of how many users participated in a specific type of activity.
Let’s take a demo course as an example:
In the above example we see there are 19 users in the course and 47% of them collaborated. If we scroll down to the users table and sort by “Collaboration”, we will see exactly how many users collaborated and who they are (the sketch after this list shows the same filtering on the exported CSV):
Presence: Sort by Views to see only users that played at least one video. If someone is not on the list, they have not watched any of the course videos yet.
Attention: Sort by Views; if a user has a radically different number of views (much more or much less than others), it might indicate that they are struggling with the content. Diving into the User View can shed more light by looking at the Completion metrics, especially the Average Completion Rate.
Learning: the users that collaborate are not the only ones that are learning - they are simply the ones that are “heard”. Note all the users that have high Engagement but low Collaboration, and the ones that have written many personal Notes - they also had meaningful interactions; they just might be shy. Diving into the User View can shed more light by looking at the Completion metrics, especially the ratio of Average Completion Rate vs Total Completion.
Sharing Knowledge:
The Thumbs Up column indicates how many Thumbs Up (= Instructor’s “likes”) each learner received. This can be used for extra credit or bonus points for the learners.
The Replies column can also serve as an indicator of learners who not only take an active part in the discussion, but are actually helping others (replying to other learners' questions). This can also be used for extra credit or bonus points.
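For larger courses, the same sort-and-filter workflow can be applied to the exported Users CSV. A rough sketch, assuming pandas and column names like "Collaboration", "Engagement", and "Notes" (check your export's actual headers); the thresholds are arbitrary examples, not Annoto recommendations.

```python
import pandas as pd

users = pd.read_csv("Users.csv")  # file name is an assumption

# Who actually collaborated (wrote at least one comment or reply)?
collaborators = users[users["Collaboration"] > 0].sort_values(
    "Collaboration", ascending=False
)
print(f"{len(collaborators)} of {len(users)} users collaborated")

# "Quiet" learners: little collaboration, but high engagement or many notes.
# The thresholds below are arbitrary - tune them to your course.
quiet = users[
    (users["Collaboration"] == 0)
    & ((users["Engagement"] >= 50) | (users["Notes"] >= 5))
]
print(quiet)
```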
Once exported, you will receive a zip file that contains the following CSVs:
Overview – the highlights of the course.
Users – Course Users activity (with additional fields such as user email, votes, and last login date).
Videos – Course Videos.
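If you want to process the export programmatically, a small sketch of reading the archive is shown below. The archive and file names ("course_export.zip", "Users.csv", etc.) are assumptions based on the list above; print the archive contents to see the actual names in your export.

```python
import zipfile
import pandas as pd

with zipfile.ZipFile("course_export.zip") as archive:
    print(archive.namelist())             # the actual file names in your export
    with archive.open("Users.csv") as f:  # assumed file name
        users = pd.read_csv(f)

print(users.head())
```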
If you need any additional information or have any questions, please contact us at Annoto Support
Recommended prior reading material: What can we learn from Users table.
The exported Users table provides comprehensive analytics that can be used for grading, for example:
Participation - learners can be graded for the comments, questions, ideas, answers & thoughts they are sharing:
Collaboration - for example: writing over 5 comments throughout the course grants extra credit
Replies - for example: writing over 3 meaningful replies throughout the course grants bonus points
Quality
Thumbs Up badges - for example: a learner that got more than 5 Educator's Thumbs Up gets extra credit
Course Completion - learners can be graded based on Views, Videos Watched, and Completion ratings (available in the User View).
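As an illustration, the example rules above could be turned into a simple script over the exported Users table. A hedged sketch only: the thresholds come from the examples, while the column names ("Collaboration", "Replies", "Thumbs Up") and the point values are assumptions to adapt to your own grading policy and export headers.

```python
import pandas as pd

users = pd.read_csv("Users.csv")  # assumed file name from the export

def bonus_points(row) -> int:
    points = 0
    if row["Collaboration"] > 5:  # over 5 comments throughout the course -> extra credit
        points += 2
    if row["Replies"] > 3:        # over 3 replies -> bonus points (review quality manually)
        points += 2
    if row["Thumbs Up"] > 5:      # more than 5 Educator's Thumbs Up -> extra credit
        points += 1
    return points

users["Bonus"] = users.apply(bonus_points, axis=1)
print(users[["Collaboration", "Replies", "Thumbs Up", "Bonus"]]
      .sort_values("Bonus", ascending=False))
```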
Course analytics data can be exported and downloaded in CSV format. For more details, please refer to Exporting Course Data.