Future works

Future development plans for Koala LMS

The first version of Koala LMS aimed to lay the basic building blocks and show our ambition: collaboration through educational resources and courses. We want to go further and have planned several major projects. These works are designed to get students to collaborate and to integrate them into the pedagogical process defined by the teachers.

Implement a free discussion system

With current learning platforms, students cannot exchange views or debate about the educational resources available to them. We think this hurts learning, because their questions and arguments remain confined to each individual.

Language and exchange are excellent tools that help improve individual knowledge.

We want Koala LMS to have a dedicated space, an agora where all students with access to learning resources will be able to chat with each other and ask and answer questions in free form. This agora should not be limited to educational resources, but should also cover activities and courses.

Let’s take a case study: a student is enrolled in a history class about World War II. While searching for an educational resource related to the subject of the course, the student spots a negationist video challenging the crimes committed by Nazi Germany. Although objectionable, this element may be of interest to the course as evidence that there are individuals who deny historical facts for racist or political purposes. However, introducing it into the course must be done with caution. While students can exchange views and debate about this resource, the role of the teacher is to provide a critical, objective, detailed and contextualized analysis to the students.

For resources such as the one mentioned above, it may be necessary to define a reporting mechanism that can result in the transfer of ownership of the course to a validated teacher or an instance administrator.
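
As an illustration only, here is a minimal sketch of what the underlying data model could look like, assuming the Django stack Koala LMS builds on; the model names, the fields and the referenced learning.Resource and learning.Course models are hypothetical, not an existing API.

```python
# Hypothetical sketch of an agora and reporting data model (Django ORM).
# Model and field names are illustrative, not part of the current code base.
from django.conf import settings
from django.db import models


class AgoraMessage(models.Model):
    """A free-form message attached to a resource or a course."""
    author = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
    # Kept simple here: one optional reference per target type.
    resource = models.ForeignKey("learning.Resource", null=True, blank=True,
                                 on_delete=models.CASCADE)
    course = models.ForeignKey("learning.Course", null=True, blank=True,
                               on_delete=models.CASCADE)
    body = models.TextField()
    created = models.DateTimeField(auto_now_add=True)


class ResourceReport(models.Model):
    """A report that may trigger transferring the course to a validated teacher."""
    resource = models.ForeignKey("learning.Resource", on_delete=models.CASCADE)
    reporter = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
    reason = models.TextField()
    handled_by = models.ForeignKey(settings.AUTH_USER_MODEL, null=True, blank=True,
                                   related_name="handled_reports",
                                   on_delete=models.SET_NULL)
```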

Create a federation of educational resources

Known and identified resources

Koala LMS is not intended to be a monolithic whole that functions as a data silo. We want to be the opposite of current centralization practices on the Web.

Koala LMS must adopt a decentralized paradigm. Below is a naive example of decentralization as we envision it.

Koala LMS instance communication scheme

The process of accessing educational resources is then modified compared to the basic centralized operation:

  1. The client located at University A sends a request to the Koala LMS platform to which it is connected.
  2. Koala LMS searches its database for educational resources matching the query.
  3. Koala LMS searches its cache for other educational resources that match the query.
  4. Educational resources are usually provided with additional files. They are stored on the file system.

The process continues by querying other instances that may contain educational resources matching the needs of the University A client:

  5. University A finds a resource corresponding to the client’s need, located on another instance, Secondary School 2.
  6. The University A instance sends a request to this other instance to get the requested resource.
  7. The Secondary School 2 instance searches its database for the requested resource.
  8. The Secondary School 2 instance retrieves the files attached to the learning resources.
Finally, the University A instance retrieves the data from Secondary School 2 and stores it in its cache until the expiry date defined by Secondary School 2.
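
A minimal sketch of this lookup order is given below, assuming a hypothetical REST endpoint (/api/resources/?q=...) on each instance and simple expiry-based caching; none of the endpoint, cache or database helper names come from the actual Koala LMS code base.

```python
# Hypothetical sketch of the federated lookup described above.
# The endpoint path, the cache helpers and local_db.search() are assumptions.
import requests


def find_resource(query, local_db, cache, known_instances, timeout=5):
    # Steps 1-2: search the local database first.
    results = local_db.search(query)

    # Step 3: look for previously fetched resources in the local cache.
    results += [entry for entry in cache.search(query) if not entry.expired()]

    # Steps 5-8: ask the other known instances (e.g. Secondary School 2).
    for instance_url in known_instances:
        try:
            response = requests.get(f"{instance_url}/api/resources/",
                                    params={"q": query}, timeout=timeout)
            response.raise_for_status()
        except requests.RequestException:
            continue  # The remote instance is unreachable: skip it.
        for remote in response.json():
            # Store the remote resource until the expiry date it declares.
            cache.store(remote, expires=remote.get("expires"))
            results.append(remote)
    return results
```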

Information

A trainee has been working on these decentralization aspects for eight weeks. The technologies that seem appropriate for this use case are a RESTful API for communication between instances and LOM (Learning Object Metadata), an XML schema describing educational resources.
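
To give an idea of what LOM records look like, here is a small sketch that extracts the title and keywords from the "general" category of a LOM document using Python's standard library; the record itself is a made-up example.

```python
# Sketch: reading the "general" category of a LOM (Learning Object Metadata) record.
import xml.etree.ElementTree as ET

LOM_NS = {"lom": "http://ltsc.ieee.org/xsd/LOM"}

# Invented example record, reduced to a title and two keywords.
record = """<lom xmlns="http://ltsc.ieee.org/xsd/LOM">
  <general>
    <title><string language="en">World War II overview</string></title>
    <keyword><string language="en">history</string></keyword>
    <keyword><string language="en">20th century</string></keyword>
  </general>
</lom>"""

root = ET.fromstring(record)
title = root.findtext("lom:general/lom:title/lom:string", namespaces=LOM_NS)
keywords = [k.text for k in root.findall("lom:general/lom:keyword/lom:string", LOM_NS)]
print(title, keywords)
```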

Unknown resources

The previous proposal imposes a strong dependency constraint between all instances: to retrieve a learning resource from a Koala LMS instance IB, the instance IA that originated the query must already know IB. This is not feasible in a context like ours, where instances can be installed and removed at any time.

We propose a solution based on a central index replicated on each running instance. This index stores attributes of educational resources (name, keywords, etc.) that are useful for searching, along with the URL used to access them.

Schema of the organization of resource sharing with an index

Here, each instance comes with an index. It is synchronized with one or more reference indexes that are addressed through the domain name system, for example index1.koala-lms.org, index2.koala-lms.org, and so on.

Adding a new instance

When adding a new instance to the federation, the operation might be:

  1. The new index connects to the reference indexes to report its existence.
  2. Reference indexes give instructions on how to synchronize in the future (synchronization frequency, etc.).
  3. The new index synchronizes with the reference index (it might be useful to spread this synchronization load with other existing indexes already up to date).
  4. The index updates (new instructions and data) according to the guidelines given by the reference indexes.

In this context, an instance that has not synchronized for a given amount of time will be treated as offline and will no longer be considered by the other instances as a viable learning resource repository.
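
A naive sketch of this bootstrap is shown below, assuming hypothetical /register and /sync endpoints on the reference indexes and an invented instruction payload; nothing here reflects an existing protocol.

```python
# Hypothetical sketch of a new instance joining the federation.
# The /register and /sync endpoints and the payload format are assumptions.
import time
import requests

REFERENCE_INDEXES = ["https://index1.koala-lms.org", "https://index2.koala-lms.org"]


def join_federation(instance_url, local_index, reference=REFERENCE_INDEXES[0]):
    # 1. Report the existence of the new instance to a reference index.
    reply = requests.post(f"{reference}/register",
                          json={"instance": instance_url}, timeout=5)
    reply.raise_for_status()

    # 2. The reference index answers with synchronization instructions,
    #    e.g. {"sync_every": 3600, "peers": ["https://some-instance.example"]}.
    instructions = reply.json()

    # 3-4. Perform the initial synchronization (possibly from a peer that is
    #      already up to date), then keep following the given frequency.
    while True:
        source = instructions.get("peers", [reference])[0]
        entries = requests.get(f"{source}/sync", timeout=5).json()
        local_index.update(entries)
        time.sleep(instructions.get("sync_every", 3600))
```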

Resource location

Now, what happens when a user of the University A instance from the previous diagram runs a query for a resource that is unknown to it?

  1. The Koala LMS instance on which the user is logged in queries its local index for a learning resource that matches the user’s needs.
  2. After identifying the URL of the learning resource, it can apply the same method as described in Known and identified resources.
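
A possible sketch of that two-step lookup follows, reusing the HTTP conventions of the earlier examples; the index API shown here is an assumption.

```python
# Hypothetical sketch: locating an unknown resource through the local index replica.
import requests


def locate_resource(query, local_index):
    # 1. Query the replicated index for entries matching the user's needs.
    matches = local_index.search(query)

    # 2. Fetch each matching resource from the URL stored in the index; the
    #    "known and identified resources" procedure (caching, expiry) then applies.
    resources = []
    for entry in matches:
        response = requests.get(entry["url"], timeout=5)
        if response.ok:
            resources.append(response.json())
    return resources
```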

Communication between users

User discussions about learning resources are processed and stored by the instance that hosts the resource. The ActivityPub protocol can partially address this problem.
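
For illustration only, such a discussion message could be represented as an ActivityStreams Note wrapped in a Create activity, which is the kind of payload ActivityPub exchanges between servers; all URLs and identifiers below are invented.

```python
# Illustrative ActivityPub payload (ActivityStreams 2.0) for a discussion message.
# All URLs and identifiers are invented for the example.
import json

create_note = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Create",
    "actor": "https://university-a.example/users/alice",
    "to": ["https://university-a.example/resources/42/followers"],
    "object": {
        "type": "Note",
        "attributedTo": "https://university-a.example/users/alice",
        "content": "Does anyone have a source contextualizing this video?",
        # The discussion stays attached to the resource hosted by this instance.
        "inReplyTo": "https://university-a.example/resources/42",
    },
}

print(json.dumps(create_note, indent=2))
```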

Recommend adapted teaching resources

Koala LMS is meant to be a tool for recommending educational resources. We want to be able to identify the best moments for learning in the learner’s path and offer him or her appropriate resources.

Analyzing the digital traces of learning makes it possible to understand users’ habits and their level of progress in the courses, in order to offer them adapted teaching resources.

A simple use case is that of a course recommendation that supplements the knowledge already acquired, or fills the gaps identified by the tests carried out. The proposed educational resources are therefore specific to the student: it is through their interaction with the tool that they receive adapted recommendations, derived from their profile and the traces left during learning.

Two types of recommendations exist: recommendations of teaching resources within courses, and recommendations of courses. If a student begins a learning process with a thematic course, the assessment results may indicate gaps in some areas. The recommendation system then offers courses and content to address these difficulties. Recommendations are also made to teachers when they build a new course: public educational resources are indexed and can be reused to build new courses.
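
As a very naive illustration of the first type of recommendation, the sketch below assumes that assessment results expose per-topic scores and that resources are tagged with keywords; both data shapes are assumptions made for the example.

```python
# Naive sketch: recommend resources covering the gaps revealed by assessments.
# The shape of `assessment_results` and `resources` is assumed for the example.

def recommend(assessment_results, resources, threshold=0.5, limit=5):
    # Topics where the score is below the threshold are treated as gaps.
    gaps = {topic for topic, score in assessment_results.items() if score < threshold}

    # Rank resources by how many gap topics their keywords cover.
    scored = [(len(gaps & set(res["keywords"])), res) for res in resources]
    scored = [(overlap, res) for overlap, res in scored if overlap > 0]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [res for _, res in scored[:limit]]


results = {"ww2-dates": 0.3, "causes": 0.9}
catalogue = [{"title": "Timeline of WWII", "keywords": ["ww2-dates", "history"]},
             {"title": "Treaty of Versailles", "keywords": ["causes"]}]
print([r["title"] for r in recommend(results, catalogue)])
```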

To make learning more effective, students must be available, meaning that they are in a position to acquire new knowledge. One of the goals of Koala LMS is to identify this availability. Resources are proposed to the user according to his or her actions at a given moment. The ambient sensors of mobile devices can be used to determine the context and propose appropriate resources.