Times of day, submission dates, learning statistics, interaction charts – is this what is meant by “learning analytics”?

Article first published on SeOppi 2/2018.

Text: Kaisa Honkonen & Leena Vainio, Association of Finnish eLearning Centre

Leena Vainio studied for the Adaptable Learning Paths project what learning analytics tools are available in current electronic learning environments and how they are used. She received survey responses from representatives of sixteen system suppliers, three learning material producers and six educational institutions.

For quite some time, learning environments have shown us statistics on the use of the system and materials. Such information has been essential for pure online courses, and particularly so for their development, whereas the same information has often been only nice-to-know for classroom teaching.

However, learning analytics comprise much more than learning statistics. We use learning analytics to combine bits of information in order to surface the critical issues that affect the learner’s progress – the learner may remain completely unchallenged, may be struggling with assignments that are much too difficult, or may be anywhere between these extremes.

At its best, learning analytics support the learner in a timely fashion and guide the learning process towards its goals. Learning analytics alone cannot support learning, however; the support of teachers, other learners, parents and workplace instructors is still very much needed. Analytics provide information that helps the parties think together and find the relevant strengths and points to develop. Properly used, analytics help teachers by giving them new methods to guide learning processes and to support individual, unique learning paths.

The true value of learning analytics is seen when we can use them to help students understand their own ways of learning. How do their own efforts show in the progress of their studies? How have various interactive situations influenced their learning? Active learners could independently select their next step, in line with their interests, from among materials offered by the teacher or by an artificial intelligence. Is a task best done alone, or would learning be easier through peer learning? Will knowledge grow best if the student works alone, or together with someone else?

Most often, electronic learning materials are put together by individual teachers, and the same teachers decide which materials they wish their students to turn to next. Adaptive collections require learning materials at very different levels in order to cater to the needs of different types of learners. Individual teachers working alone are far from being able to make their material collections adaptive, but by combining forces and working together, they could produce versatile materials and exercises more quickly and for varying situations.

The survey showed that there is no one-stop-shop application available. The systems all have their strengths, and combining them according to the situation might bring the best results. It is worth trying out different approaches and talking to one another about what we are actually looking for. Most importantly, we should consider what we intend to do when our analytics highlight a problem. What is our action plan? What resources do we have to tackle the problem? When can the teacher, working alone, provide support; when do we need counsellors and other support persons; and when do we need an artificial intelligence? Just as we need different learning materials and exercises for different learners, we need different guidance methods for different situations. We need multidisciplinary support teams, formed dynamically according to the circumstances, to help the individual teacher.

The novel feature in current learning analytics is the way they make possible problem spots visible to learners themselves. Learners obtain a better picture of the total situation. In pedagogy based on learning analytics, it is more important than before to agree on the goals with each individual learner. What will we practice next, and why? How will we apply this learning later? We start to build knowledge together and select the necessary tools together so that we may reach the goals.

The General Data Protection Regulation (GDPR) poses certain challenges for the use of learning analytics, but with the appropriate authorisations, we may collect and use data. We might even take sleep and activity data from smart watches and compare them to learning outcomes. Our smart watch might suggest that we take a nap in the early afternoon so that the Swedish class later in the day would go better. Would that be learning analytics or wellbeing analytics?

Our concluding statement after the survey is that the development of learning analytics will require a great deal of national-level discussion. Similarly to the debate on artificial intelligence, learning analytics as well require a unified understanding of the concept of human being which we wish to help formulate. What views do we adopt regarding diverse learners and how do we apply the different guidance resources available to us?

At the same time, we also need a shared understanding of the minimum which learning analytics should show us. What features should be available in all systems? What are the minimum requirements; what requirements do we have for data transfer among different systems; what interfaces and registration systems should we use?

  • Recognise the need. What are the problems to be solved through learning analytics?
  • Specify. What kinds of learning analytics do we already have in our school? What do we need?
  • How does the analytics system impact our work culture and management?
  • What conceptions of learning and human beings are supported by the analytics system?
  • Strategy and vision. What skills do we need? The staff’s competences and their training.
  • How do we share responsibility and adhere to all legal and ethical requirements?
  • What do we measure?
  • How do we collect information?
  • What technologies do we need?
  • How do we process data, in which forms, and to whom do we disseminate the information?
  • How do we carry out development actions?
  • How do we monitor the outcomes?
  • How do we maintain and enhance the system? Do we, without fail, hold the learner’s best interests as our highest priority?

Learning analytics helping study advisors

Article first published on SeOppi 2/2017.

Text: Maija Kerkola, HAMK University of Applied Sciences

I remember how, as a study advisor, I walked the hallways of my university of applied sciences as the semester break approached, looking for certain students, my lost sheep. There were few tools with which I could have found the possible drop-outs. In practice, I relied on my eyesight, hoping to encounter students whom I did not think I had seen at school recently. I mingled and asked teachers and students if they had seen this or that student. Universities of applied sciences do not enforce compulsory attendance; I was simply worried about certain students, sensed that everything was not in order, and feared that they might be at risk of dropping out. A holistic model underlies this sort of caring counselling culture: we are genuinely interested in every single student. I find it particularly important that we detect potential drop-outs as early as possible.

Study advisors’ work at universities of applied sciences

Study advisors monitor the progress of students’ studies. In earlier times, students, teachers and advisors met face to face in classrooms and hallways, but now that teaching takes place online, advisory work takes place online as well, and learning analytics is the advisors’ new tool.

The learning analytics system collects data and provides the study advisors with weekly reports on students who did not log in to the Moodle learning platform, or logged in worryingly few times. We know that goal-oriented studies require several logins per week. Therefore, alarm bells ring weekly, and the study advisors are notified that certain students may be in danger of dropping out. In this way, the study advisors have the chance to act pre-emptively.

Research has shown that study advisors’ guidance work produces long-range outcomes (Helander and Kemppi 2006, 24–25). The importance of guidance is also championed in the writings of Raimo Vuorinen and Maarit Virolainen of the Finnish Institute for Educational Research at the University of Jyväskylä. They refer to recent research and note that guidance reduces drop-out rates and accelerates students’ studies. In addition, in their view, guidance is one of the indicators of the efficiency of a school system. Guidance increases students’ commitment to their studies and helps them clarify their personal study paths. (Vuorinen and Virolainen 2017, 7.)

Now that guidance has gone online, study advisors’ tools include e.g. email, Skype, WebEx, chat, WhatsApp and Snapchat. These are what an advisor uses to reach a particular student. A study advisor’s work involves a great deal of counselling and passing on sufficient and correct information. Advisors introduce various options and assess the impact of those options on the matter at hand. The idea is for such guidance to help students in their decision making; however, study advisors do not provide services that solve problems on behalf of the students themselves (Onnismaa 2007, 23–25).

Sources

Helander, J. & Kemppi, J. 2006. Jos meijät on istutettu tänne jotaki tarkotusta varte: puheenvuoroja hämeenlinnalaisten nuorten osallisuudesta ja hyvästä [If we’ve been made to sit in here for some purpose: contributions to a discussion concerning the possibilities for participation and good life for Hämeenlinna youth]. HAMK University of Applied Sciences.

Onnismaa, J. 2007. Ohjaus- ja neuvontatyö: aikaa, huomiota ja kunnioitusta [Counselling and guidance work: time, attention and respect]. Gaudeamus.

Vuorinen, R. & Virolainen, M. 2017. Editorial. Opinto- ja HOPS-ohjauksesta urasuunnittelutaitojen vahvistamiseen ja ohjauspalveluiden laadun arviointiin [From advising about studies and study planning to improving career planning skills and assessing the quality of counselling services]. Journal of Professional and Vocational Education. 19 (2), 7.

Challenges for learning analytics

Article first published on SeOppi 2/2017.

Text: Lasse Seppänen, Häme University of Applied Sciences (HAMK)

Learning analytics is becoming very popular in student administration. It can be used to monitor study performance and to identify students in danger of dropping out. If a student’s activity level decreases at some point during the school year, monitoring grades and completed courses alone would reveal this, in the worst case, only the following year.

Our study monitored weekly logins to the learning platform. In the fall of 2016, the Adaptable Learning Paths project at HAMK University of Applied Sciences constructed a system that monitors students’ weekly logins and sends the study advisor weekly messages about students with reduced login rates. The target group was initially the students of computer sciences, but other students were included later.

In computer studies, if a student drops out, it most often happens during the first year; alternatively, the student fails to complete his or her final thesis at the end of the studies. Early detection of potential drop-outs among first-year students is therefore very important. The thesis process was restructured in 2017 so that it would not be easy to leave incomplete.

In computer sciences, the learning platform Moodle is in active use. Moodle is in frequent use in all studies during the first two years, and in practice, students must log in daily or almost daily. The algorithms we developed are based on this fact.

The standard working day at HAMK consists of two parts separated by the lunch break. It is natural to think that students would log in to Moodle at least twice each day, achieving a minimum of 10 logins per week. However, students are required to complete a great deal of group work and one group member may submit the work of all others as well. This may lower the login frequency of group members even if they are active in their studies.

Monitoring weekly logins, we first set the threshold at four logins. We reasoned that if a student logs in to Moodle only 0–3 times a week, all cannot be well. Later, we tentatively raised the threshold to eight logins and noticed that we obtained a great deal of data concerning students who were doing just fine. We then lowered the threshold to six. That gave us a smaller, more manageable quantity of data in the emails. It is likely that we will study the appropriateness of this threshold further.
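The weekly flagging rule described above can be sketched in a few lines. This is a hypothetical illustration, not HAMK’s actual system; the data layout, function name and example values are assumptions. Note that students with zero logins never appear in log data at all, so a class roster is needed to catch them.

```python
# Hypothetical sketch of the weekly alert rule, not HAMK's actual code.
# `logins` maps (student, week number) -> Moodle logins that week;
# the roster is needed so students with zero logins are not missed.

def weekly_alerts(roster, logins, week, threshold=6):
    """Return (student, login count) pairs below the weekly threshold."""
    counts = {student: logins.get((student, week), 0) for student in roster}
    return sorted((s, n) for s, n in counts.items() if n < threshold)

# Made-up example: week 39, threshold of six logins.
logins = {
    ("anna", 39): 12,  # active daily user, no alarm
    ("ben", 39): 2,    # worryingly few logins
    ("carla", 39): 5,  # just under the threshold
}
report = weekly_alerts(["anna", "ben", "carla", "dave"], logins, week=39)
print(report)  # [('ben', 2), ('carla', 5), ('dave', 0)]
```

Keeping the threshold as a single parameter makes the tuning described above (four, then eight, then six) a one-line change, and would also allow a different value for evening students.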

We applied the same threshold in our monitoring of the activity of online evening students and observed that the same value does not function properly. For example, we set the threshold at six for the last week of September, and the system generated alarms concerning 24 students. Of these, only five had zero logins and eleven had 3–5 logins. We saw a clear difference between evening students and daytime students: evening students do more during a single online session.

Challenges due to the upcoming Data Protection Regulation

The upcoming Data Protection Regulation will bring challenges to the use of learning analytics. Any list containing students’ names can be classified as a personal data register – and this is in essence what the weekly email from the learning analytics system is. The view has been proposed that analytics could be used if they do not impact any single student. However, the whole point of this kind of analytics is that the study advisor can contact a single student, and do so easily. In addition, the system profiles students. We must give thought to how we make the system comply with the regulation so that we may continue to support students who are encountering difficulties.

Sources

Oppimisanalytiikan keskus [Learning Analytics Centre]. Mitä on Oppimisanalytiikka? [What is learning analytics?] www.learninganalytics.fi/fi/oppimisanalytiikka

Seppänen, L. Learning analytics call out for action, SeOppi 02/2016

OEB16: Learning analytics call out for action

Article first published on SeOppi 2/2016

Text: Lasse Seppänen, Häme University of Applied Sciences (HAMK)

Learning analytics is becoming a popular function in learning management. It targets a multitude of matters, such as tracking course evaluations, following students’ progress in their courses, and even monitoring potential drop-outs. This presentation deals with the monitoring of potential drop-outs. Can we detect them from their LMS data before they actually leave?

The study we report in this presentation is being performed at Häme University of Applied Sciences (HAMK). The target group is the first-year students in the Degree Programme in Business Information Technology (Business IT). The annual intake is tens of students in both daytime and online studies. We estimate that about 10 students will leave during or after their first year, or later on when they find it difficult to complete their theses. It would be important to learn whether there are ways of helping the first-year students and thus preventing the interruption of their studies.

In Business IT, the Moodle LMS is in heavy use. That fact forms the background for the algorithms we created during this study. The use of Moodle should be constant during the first two years of studies: Moodle is used in every course, and students cannot perform well if they do not use it every day.

An academic year at HAMK is divided into four periods of eight weeks. Each school day is divided into two sections, 8:45–12:00 and 12:45–16:00, with lunch in between. It would be natural to think that students would log in to Moodle at least twice a day, making the total number of their logins 10 per week per person. But the students do quite a lot of group work, and it is possible that they follow their peers’ work in Moodle. It is the school custom that one student submits the group tasks on behalf of the whole group. This could lower the login frequency even if the students are appropriately active in their schoolwork.

For the purposes of this study, we selected a threshold value of four weekly logins. We consider this selection a logical one, as we know that if a student logs in to Moodle only 0–3 times a week, there is something wrong: such a student would reach at most 30% of the expected ten weekly logins.

The reduction of a student’s activity level would be detected much later in a traditional environment. In the worst case, a student who had dropped out might be detected only the following year, when the student failed to enroll. In our study, we monitor the first-year daytime students’ logins to Moodle on a weekly basis.

We started the study by analyzing the data of the first-year students of 2014. We were able to construct an algorithm and methods that gave us reliable results: at the end of November, we could pinpoint all the students who would later interrupt their studies. Some continued their studies past Christmas but dropped out eventually. The algorithm also detected a student who had taken an unauthorized one-week holiday in Greece.

We built a login follow-up system during autumn 2016. It sends the study counsellor weekly reports of students with a low login rate. The system is being introduced as this article is being written. We will make more information available later.
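The article does not spell out the 2014 algorithm itself, so the following is only an illustrative stand-in in the same spirit: flag a student whose weekly Moodle logins stay below the threshold for several consecutive weeks rather than in a single isolated week. The function name and both parameter values are assumptions, not HAMK’s published method.

```python
# Illustrative stand-in only; the article does not detail the actual
# 2014 algorithm. A student is flagged when weekly logins stay below
# `threshold` for `run_length` consecutive weeks.

def at_risk(weekly_counts, threshold=4, run_length=3):
    """True if logins fall below the threshold for enough weeks in a row."""
    run = 0
    for n in weekly_counts:
        run = run + 1 if n < threshold else 0
        if run >= run_length:
            return True
    return False

print(at_risk([10, 9, 2, 1, 0, 11]))  # True: three consecutive low weeks
print(at_risk([10, 2, 9, 1, 8, 0]))   # False: the low weeks are isolated
```

A consecutive-weeks rule of this kind would distinguish a genuinely fading student from one whose logins dip for a single week of group work, which matches the group-submission caveat noted above.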

Articles in SeOppi Magazine

The Adaptable Learning Paths project has published three articles in the latest SeOppi Magazine:

  • Multimodality and learning: Increasing understandability and accessibility, Merja Saarela
  • Personal learning paths as visual roadmaps, Kari A. Hintikka
  • Learning analytics call out for action, Lasse Seppänen