User-Centred Design in Content Management System Development: The Case of EMasters

Including users in the design and development of an interactive product is crucial for achieving a high level of usability. Content management systems have two categories of users, content creators and content consumers, and designers of these systems have to consider the needs of both groups. In the design of interactive learning systems, special attention has to be given to the process of learning: a functional, accessible and usable interface has to serve the purpose of knowledge acquisition. Designing for mobile learning brings additional challenges due to the small screens of mobile devices. The paper describes the application of user-centred design in the development of a simple content management system for learning called EMasters. The aim of EMasters is to enable teachers to easily create and organize courses that are delivered to students to facilitate web-based and mobile learning. Following the user-centred design approach, teachers and students are involved in an iterative process of design, implementation and evaluation of EMasters. The evaluation study used complementary methods and provided quantitative and qualitative feedback. The usability score reached a good level and guidelines for the redesign of the system interface are drawn. According to the obtained results, the proposed framework is confirmed to be applicable to user-centred design of content management systems in general. In addition, directions for adjusting the framework to specific cases are provided.

Keywords—Mobile learning, user-centred design, user interface, content management system, usability, rapid prototyping, user testing


1	Introduction
The concept of learning anywhere, anytime is rather old. If we consider textbooks as the first mobile learning devices, as suggested by Searson [1], then learning anywhere and anytime begins with students reading textbooks on the bus on their way to school. Contemporary digital learning environments support this concept since they are typically web-based and therefore accessible from any device connected to the Internet. In addition to learning, they bring new affordances such as online testing, communication with teachers and collaboration with peers. The pervasive ownership of smartphones, tablets and other mobile devices allows learners to move freely while learning and to communicate with peers faster than ever before. Furthermore, the emerging paradigm of ubiquitous learning adds the dimension of "anyhow" to the plethora of benefits of mobile learning [2], [3], [4]. Recent trends in digital education also include adaptive learning systems, serious games and virtual environments. Research in intelligent and adaptive user interfaces has enabled the development of personalized learning environments that address individual differences among users and deliver individually tailored content and learning paths through a course [4], [5], [6]. Introducing gamification elements into digital learning resources keeps students more focused and engaged in learning [7]. In simulated game-like environments such as 3D virtual worlds, serious games and virtual reality environments, learners can face realistic situations and learn directly from these first-person experiences [8], [9], [10], [11].
The common goal of all technological interventions in learning, including the abovementioned trends, is to increase the ease of use of learning applications and to improve learners' learning outcomes. These applications are usually highly interactive, which keeps focus and motivation on learning. All these desirable features result from applying the principles of interaction design and learner-centred design [12].
Considering mobile learning specifically, learning applications today compete with the prevalent use of mobile phones for chatting, music, videos and social networks. Learning is no longer a solitary process confined to a controlled environment such as a virtual world or virtual laboratory [9], [11]. Instead, learning activities are frequently interrupted by notifications from various applications that distract the student while learning. Thus, besides the fact that uncontrolled use of mobile phones decreases students' learning performance in general [13], in mobile learning the process of learning is directly affected by events on the very screen on which the learning occurs. In these circumstances, good design of learning applications for mobile devices becomes crucial. If students are frustrated by the learning application they are currently using, the chances that they will return to learning after responding to a chat message are probably lower than after a pleasant and meaningful learning experience. This raises the need for further research into more efficient ways of using mobile phones for learning.
The paper presents the process of design and development of a simple content management system for learning called EMasters. The system is responsive, i.e. adapted to the screens of devices of different sizes, from laptops to tablets and smartphones. The development cycle is an iterative process guided by the principles of user-centred design.
The paper is structured as follows. After the rationale for the study given in this introductory section, Section 2 provides a theoretical framework with a short review of online educational systems, their strengths and weaknesses, and continues with definitions of the terms and methods used in the study. Section 3 briefly presents the key points in the design and development of EMasters. Section 4 describes the procedure of the evaluation study and Section 5 presents its results. Section 6 discusses the obtained results and the implications of the findings. Section 7 provides closing remarks.

2	Theoretical Background

2.1	Online educational systems
Many academic institutions and private organizations make continuous efforts and invest significant resources in providing online learning opportunities for their students or employees, while network technologies continuously open new possibilities for the successful use of online learning for personal growth [14]. The pool of learners has therefore expanded from "traditional" students to a much broader scope that includes employees in a company or self-motivated learners who attend courses in their free time for individual development. The most common forms of online education in institutional settings are Massive Open Online Courses (MOOCs) [15] and Learning Management Systems (LMSs), e.g. Moodle [16], a category of Content Management Systems (CMSs) specifically intended for learning. Despite the different definitions and basic functions of these systems, as explained in [17], they all aim to provide high-quality education to cohorts of students in a cost-effective manner. They are widely implemented in higher education institutions and in public and private organizations, either to support blended learning or to provide fully online education. These solutions are convenient for large institutions because teachers and management have control over the learning achievements of the participants. In addition, they can obtain numerous learning-analytics reports, which form the basis for developing new policies and strategies and for the progress of the institution.
However, researchers suggest that these institutional systems still fail to fully meet user requirements. Some frequently reported weak points are: high costs of development and continuous maintenance; a user interface that is often non-responsive, i.e. not adapted to the small screens typical of mobile devices; students who sometimes oppose mandatory software; and teachers who, although expected to design learning content, are sometimes not ready to take on the new role of course creators [18], [19], [20]. As a partial answer to these challenges, small e-learning or microlearning applications have lately appeared as an alternative to the learning content in typically large e-learning systems [21]. Microlearning, as a novel type of digital instruction, can contribute to the interactivity of the learning experience thanks to its very simple form and its easy integration into online educational systems and virtual environments [21].
The common feature of all these forms of educational applications is that the teacher or instructional designer does not need any programming skills to successfully create learning content. According to the definition of CMSs, which are the focus of this paper, teachers are provided with tools and features for creating the content of online courses and for moderating interaction with their students [17].

2.2	User-centred design
User-centred design (UCD) is covered by ISO standards related to the broader scope of human-centred design and usability [22]. ISO 9241-210:2010 describes requirements for human-centred design principles and activities related to the use of computer systems. The standard concerns ways to enhance human-system interaction through the use of both hardware and software components of interactive computer systems, and it is intended for professionals who manage the design and development of interactive systems. Figure 1 shows the typical stages of UCD according to the ISO standard:
• Plan the UCD process
• Understand and specify the context of use
• Specify user requirements
• Produce design solutions
• Evaluate designs against requirements
• Design solution meets user requirements
In the iterative process of design, implementation and evaluation, several phases recur as many times as needed to reach the final stage, in which the design solution meets user requirements to a great extent. For software designers building interactive learning applications, Quintana, Krajcik, and Soloway [23] extended the traditional definition of the UCD approach and proposed a definition of learner-centred design. They considered three dimensions of interaction in learning systems: the audience (users vs. learners), the addressed problem (using tools vs. learning work) and the underlying approach (supporting action vs. supporting learning). When designed for learners, educational software must address several unique needs of learners as users: the concept of learning by doing, individual differences and different levels of motivation [24].
Specifically, when designing CMSs, two categories of users have to be considered: content creators and content consumers. In interactive learning systems, content creators are teachers or instructional designers, while content consumers are learners who use the delivered content for knowledge acquisition. To ensure that the system will be adopted by its users, both categories must be engaged in the process of system design. Users expect to be able to control the interaction, and their involvement in the design process is crucial to achieving this goal. This is especially important in informal learning, where there is no institutional incentive or requirement to use specific software. To be adopted and actually used, an application for both institutional and informal learning has to provide an appealing user experience for teachers and learners.
The UCD approach focuses on usability [25]. According to ISO [22], usability is the "extent to which a system, product or service can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use". The recently published ISO standard for usability, usability reports and usability measures, ISO 9241-11:2018, extends this definition and explains that usability should be considered not as a property of a system but as an outcome of use [26].
Usability evaluation is regularly conducted during the evaluation phase of the UCD process described in Figure 1. Moreover, evaluation and design are closely integrated in this process, and some of the same methods are used both in evaluation and in the phase of specifying user requirements [12]. Methods of usability evaluation include [27]:
• Usability testing – an evaluation process in which users perform real tasks on a prototype or real system while their behaviour and emotional responses during the interaction are carefully observed; also called user testing;
• Usability inspection – a method in which usability experts evaluate the interface against a set of usability guidelines, with no user involvement;
• Usability inquiry – a method of gathering user feedback after interaction with the system, usually through surveys, interviews or focus groups.
The methods differ in nature and in the number of participants. Expert inspection requires a significantly smaller number of participants than methods that involve end users. Since it may be very hard to find five usability experts to do the evaluation [28], user testing and usability inquiry are more frequently applied. However, to get different perspectives on the interface, research suggests triangulation of methods where possible [12].
Qualitative methods are usually part of formative evaluation, checking compliance with user requirements and providing guidelines for interface redesign [29]. Quantitative methods, on the other hand, are more often used in summative evaluation to confirm that the design has reached a high level of perceived usability [12], [29]. In the rest of the paper, we explain in detail the methods chosen for the usability evaluation of EMasters: the thinking aloud protocol as a user testing technique, followed by several questionnaires and an interview used in the post-session usability inquiry (see Section 4).

3	Design and Development of the EMasters
The aim of EMasters is to enable teachers to easily create and organize courses that are delivered to students to facilitate web-based and mobile learning. Teachers register in the system, create courses and organize their content. As part of a course, they may write text and import other objects such as pictures, videos and microlearning applications. Besides learning content, teachers can create quizzes for evaluation or self-evaluation of the knowledge acquisition process, and they can update or delete any piece of the course they have created. The application also has a forum section in which both teachers and students can start a discussion on a topic and reply to previous comments. In addition to these two user groups, the application has an administrator who is in charge of the system and controls the proper entry of all data.
The first design of EMasters was made as a wireframe, i.e. a paper prototype, for desktop, web-based and mobile interfaces. In a simplified UCD process involving iterations of design, implementation and evaluation, pilot testing of the first design was carried out with five students of the Faculty of Science, University of Split, Croatia. Three participants tested the wireframe in the teacher's role and two in the student's role. Based on the feedback from pilot testing, a second wireframe was made. The redesign included several changes in the teacher's and student's interfaces. For example, buttons for updating and deleting a chapter in the teacher's interface were replaced with hyperlinks; this feature is shown in Figure 2, as developed later in the implementation phase. In addition, the forum page that shows a topic was redesigned: the field for entering a reply was enlarged and the button for submitting the entry was changed from green to red. Figure 3 shows the forum page with the theme "Regular expressions", a topic in the "Unix" course.

The EMasters application was created in the Python programming language using the Django framework [30]. Django uses the Model-Template-View (MTV) application architecture, as presented in Figure 4. The Model is linked to a database where data are stored and retrieved. This layer contains all the information regarding data storage, data connections, entry constraints, mandatory or non-mandatory data, etc. The Template is a presentation of the model, i.e. an HTML webpage that contains data together with instructions on how the page should be displayed. The View serves as a bridge between the other two layers. The user accesses the server through a web browser, after which Django opens a specific view at the user's request. Finally, the database is accessed and the data are used to display the webpage in the browser according to the request.
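The MTV flow described above can be sketched in plain Python. This is an illustration of the pattern only, not the actual EMasters source: the Course data and all names are hypothetical, a dict stands in for a Django model row, and string.Template stands in for a Django template file.

```python
# Illustrative sketch of the Model-Template-View flow (hypothetical names;
# a plain dict and string.Template stand in for Django's ORM and template
# engine).
from string import Template

# Model layer: in Django this would be a models.Model subclass backed by
# the database; here a dict represents one stored Course row.
course = {"title": "Unix", "chapters": ["Shell basics", "Regular expressions"]}

# Template layer: an HTML fragment with placeholders, as in a template file.
page = Template("<h1>$title</h1><ul>$items</ul>")

def course_view(course):
    """View layer: fetches model data and returns the rendered template."""
    items = "".join(f"<li>{c}</li>" for c in course["chapters"])
    return page.substitute(title=course["title"], items=items)

html = course_view(course)
print(html)
```

In the real framework, a URL pattern would route the browser's request to the view, and the view would query the model and render the template in one step.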
To adapt the interface to devices with different screen sizes, Bootstrap is used as a front-end framework. Figure 5 shows the responsive design of a typical content page for a teacher, displayed on a laptop and on an iPhone 6/7/8. The layout of the same page for students has neither the hyperlinks for updating and deleting existing objects nor the hyperlinks at the bottom of the page for adding new content and creating a quiz.

4	Usability Evaluation
The evaluation study was conducted in several individual user sessions, with slightly different procedures for participants who accepted the role of teachers and those who took the role of students. The study began with the teachers' sessions. The students' sessions were conducted after all teachers had finished creating their courses in EMasters, so the students were free to choose a course from the pool of courses delivered by the system. All participants followed the same method in the evaluation study, but with different instruments according to their role. The evaluation procedure for each participant began with the user testing technique called the thinking aloud protocol, which was conducted with teachers during course creation and later again with students while learning in EMasters. The following steps involved usability inquiry methods (see Section 2.2), namely a post-session survey and a semi-structured interview. Figure 6 briefly presents the procedure of the usability study with the steps of the individual sessions carried out with teachers and students.

Users who accepted the role of teachers were asked to register in EMasters and to create an online course on a topic of their choice. They were instructed to make learning content with at least two chapters and a quiz. They were allowed, but not explicitly told, to import different types of external files, such as images and videos. The structure of the course was not precisely set; however, the teachers were advised to follow the ADDIE model of instructional design [31] in the phases of course design and development. The teachers were able to review the developed course by taking the role of a student.
In the learning session, students were invited to register in the system and to select one of the available courses. They were free to explore the content and take quizzes at their own pace and in the order they preferred. They were also asked to find the forum and write a comment on one of the topics.
Both teachers and students were encouraged to think aloud while doing their tasks. The supervisor of the study observed user behaviour in real time, noting the users' comments as well as key aspects of the interaction, such as unexpected steps or mistakes made in the process of course creation or learning. The supervisor did not help the users achieve their goals. After all user sessions were completed, the evaluator's notebook became a valuable record of potential usability problems in both the teacher's and the student's interface.

4.1	Post-session survey
Following the procedure presented in Figure 6, after the individual hands-on session in EMasters, each participant was invited to fill in a survey. The post-session survey was hosted on Google Forms and included three sections: the SUS questionnaire, a qualitative feedback section and a background section. Questions in the first two sections were the same for both user roles, while the background sections were designed differently for teachers and students.
The SUS [32], [33] is a standardized questionnaire for measuring the perceived usability of a system and is used worldwide for overall usability evaluation of different systems [34], [35]. The reliability, validity and sensitivity of the SUS have been confirmed in numerous studies, as reported by Lewis and Sauro [36]. In the same paper they provided a curved grading scale for interpreting the overall SUS score, and this scale is used in our usability study. Although Brooke initially limited the interpretation of the SUS to the overall score [32], Lewis and Sauro conducted a comprehensive psychometric analysis of the SUS and suggested that SUS results can also be interpreted by individual items [36]. Based on regression analysis, they provided item benchmarks related to overall SUS scores. Table 1 in the Results section shows all the SUS questions.
The second section of the post-session survey was designed to obtain qualitative feedback from the users. It included the open-ended questions: What did you like most in the EMasters system? What are its flaws? What improvements do you suggest? The users' subjective opinions about their experience of using the EMasters application can reveal major flaws in the design and help us develop guidelines for redesigning the system interface.
Finally, in the background section of the survey, we collected demographic data about the users who participated in the evaluation study, through several multiple-choice questions related to the participants' role. The background section of the teacher's survey collected age, affiliation and the level of previous experience in using e-learning systems and in creating online courses. The background section of the student's survey asked for age, time spent in online learning and specifically in mobile learning, and previous experience in using e-learning systems.

4.2	Semi-structured interview
The semi-structured interview was carried out in addition to the qualitative feedback obtained from the thinking aloud protocol and the post-session survey. The interview was conducted individually with each participant, immediately after they filled in the survey. It usually began with several questions related to the answers the user had provided in the survey and then continued as a free dialogue. The users had the opportunity to review their experience of using the system and to explain what improvements of the interface they would suggest and why.
By using complementary methods of usability evaluation, both quantitative and qualitative, we can gain deep insight into the usability issues of the system being developed. The SUS gives a score that reveals the severity of usability problems, while the qualitative methods provide initiatives and concrete guidelines for an interface redesign that should reduce the recognized interaction problems and increase the overall usability of the system.

5	Results
The study was conducted in the autumn semester of 2018 at the Faculty of Science, University of Split, Croatia. To capture a potentially wide range of usability issues, the participants were recruited from students, pupils, teachers and the general population. The results are presented separately for the teacher's and the student's interface.

5.1	Teacher's interface
Seven participants volunteered to use the application in the role of teacher/course creator. Their age ranged from 24 to 45, with a mean of 37.17 and a standard deviation of 7.97. One participant was a student of educational vocations, two were high school teachers and one was a high school pedagogue. Since the application is intended to be used in informal learning, three participants were recruited from the general population. Previous experience in creating online learning content ranged from novice (3 participants) to expert (1 participant).
All participants successfully created a course on a topic of their interest. The SUS items, along with the mean scores and standard deviations for the teacher's interface, are shown in Table 1. The scores range from 0 to 4. Note that even-numbered items have reversed polarity and their scores are calculated accordingly, as described in [32], [33]. This means that for odd-numbered items the score in Table 1 represents the level of users' agreement with the statement, while for even-numbered items it represents the level of users' disagreement. Thus, for example, the score 2.71 on item 2, "I found the system unnecessarily complex", means that users rated the simplicity of the interface at 2.71 out of 4, or 67.75%. As a result, a higher score in Table 1 always stands for a higher level of perceived user satisfaction with EMasters.

Table 1. SUS items with mean scores and standard deviations for the teacher's interface

  SUS question                                                          Mean   SD
   1. I think that I would like to use this system frequently.          3.00   0.82
   2. I found the system unnecessarily complex.                         2.71   1.38
   3. I thought the system was easy to use.                             3.14   0.38
   4. I think that I would need the support of a technical person
      to be able to use this system.                                    1.86   1.07
   5. I found the various functions in this system were well
      integrated.                                                       3.00   0.82
   6. I thought there was too much inconsistency in this system.        2.86   1.35
   7. I would imagine that most people would learn to use this
      system very quickly.                                              2.71   0.49
   8. I found the system very awkward to use.                           3.14   0.69
   9. I felt very confident using the system.                           2.86   0.69
  10. I needed to learn a lot of things before I could get going
      with this system.                                                 3.14   0.90

The overall SUS score for the teacher's interface is given in Table 2. The obtained value means that the satisfaction of participants who took the role of course creators is 71.1%.
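The SUS scoring described above can be sketched as follows. The scoring rule is the standard one from Brooke [32]; the single participant's raw responses in the example are hypothetical, while the item means are those reported in Table 1.

```python
# SUS scoring sketch (Brooke [32]): raw responses are on a 1-5 scale;
# odd items score (response - 1), even (reversed-polarity) items score
# (5 - response); the sum of the ten 0-4 item scores times 2.5 gives
# the overall 0-100 score.

def sus_score(responses):
    """responses: the ten raw 1-5 ratings of one participant, in order."""
    assert len(responses) == 10
    item_scores = [(r - 1) if i % 2 == 1 else (5 - r)
                   for i, r in enumerate(responses, start=1)]
    return sum(item_scores) * 2.5

# A hypothetical participant answering 4 on every odd item and 2 on every
# even item (each item then scores 3 on the 0-4 scale):
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0

# The per-item means in Table 1 are already on the converted 0-4 scale,
# so the overall teacher-interface score is simply their sum times 2.5:
item_means = [3.00, 2.71, 3.14, 1.86, 3.00, 2.86, 2.71, 3.14, 2.86, 3.14]
overall = sum(item_means) * 2.5
print(overall)  # ~71.05, i.e. the reported 71.1%
```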
Considering individual ratings, the minimum individual grade was 2.2 (55%) and the maximum 3.4 (85%). In the qualitative feedback section of the post-session survey, six of seven participants wrote positive impressions. Most of them appreciated the simplicity of the interface; the comments also mentioned the "ease of use", the "nice and intuitive design" and the highly functional interaction. Five users reported that they had experienced some flaws of the EMasters system. They were not satisfied with the restricted possibilities in designing page content, i.e. the positioning of objects on a page; the absence of data on students' usage and achievements; and the fact that the quiz allows only multiple-choice questions. These issues were clarified in individual interviews with each participant. In addition, one user suggested that a teacher should be able to edit only his or her own courses.

5.2	Student's interface
Fourteen volunteers took part in the study in the role of students. Their mean age was 28.28, with a standard deviation of 10.15, a minimum of 12 and a maximum of 43. Two participants were primary school pupils, two were high school students and one was a university student; the rest were adults with various backgrounds. Prior experience in using computers and mobile devices for learning was almost equally distributed across the sample (rare, sometimes, often, very often, regularly).
All participants registered in the system and took time to learn and take the quiz in the selected lesson. All of them succeeded in finding the forum and leaving a comment. The SUS results obtained from students are presented in Table 3 and the overall SUS score is given in Table 4. The overall students' satisfaction is 68.9%. The lowest individual student's grade is 1.6 (40%) and the highest 3.8 (95%). In the post-session survey, we obtained varied feedback from users in the student's role. Four participants did not write comments on positive aspects of the application; the same users wrote that they had not experienced difficulties in interaction and had no suggestions for improvement. Six participants liked the ease of use, and several students liked aspects related more to the learning content than to the user interface. Concerning the shortcomings of the application, most of the students referred to weaknesses in the learning content (too much text, not enough images, videos, etc.), while two students had difficulty reading because of the small letter size. Reported usability issues and suggestions for improvement from the student's perspective were discussed in post-session interviews with each participant. As opposed to the content pages, the users had no objections to the quizzes and the forum.

6	Discussion
According to the interpretation scale for the SUS [36], where an average grade (C) stands for SUS scores from 65.0 to 71.0 and a slightly above average grade (C+) for scores from 71.1 to 72.5, we can conclude that users rated the EMasters application as good for course creation (71.1%) and for learning (68.9%). The qualitative analysis also confirmed that the application is generally perceived as easy to use by both course creators and learners. Participants appreciated the simplicity of the interface, the clear design, the availability of the content and the possibility of communication through the forum. Discussing the weaknesses of the application, we collected a significant number of comments. The obtained feedback was used to decide which comments are relevant for the interaction and can provide guidelines for a redesign that could improve the usability of the system. For example, a larger letter size can be used for the text in content pages, or users can be given an option to set the letter size to a desired value. It is important to discuss all reported issues with the users before reaching a final decision on the redesign. Thus, the inability to manipulate the position of objects on a content page is considered less relevant when compared to the risk of a possibly bad layout on the small screens of mobile devices.
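The interpretation applied above can be sketched as a simple lookup over the two grade bands quoted from the curved grading scale. Note that Lewis and Sauro [36] define a full F-to-A+ scale; only the C and C+ bands cited in this paper are reproduced here.

```python
# Mapping overall SUS scores to the two curved-grading-scale bands
# quoted in the text (the full scale in Lewis and Sauro [36] spans F
# to A+; only the bands needed for this study are sketched here).

def curved_grade(score):
    if 71.1 <= score <= 72.5:
        return "C+"  # slightly above average
    if 65.0 <= score <= 71.0:
        return "C"   # average
    return "outside the quoted bands"

print(curved_grade(71.1))  # teacher's interface -> C+
print(curved_grade(68.9))  # student's interface -> C
```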
The evaluation study also revealed some concerns that are not usability problems but nevertheless demand consideration. They are particularly related to handling the quizzes and managing the course and its participants. For quizzes, a feature displaying scores immediately after taking the quiz can be added. Major improvements of the system could include features for teachers such as administering the students, tracking their progress through the course and analysing their achievements on quizzes. Still, the initial incentive for EMasters was to facilitate informal learning, so these major improvements should be considered only if the application is to be used in institutional settings.
Considering the procedure of the study, we have to note that some of the results regarding the student's interface may be a consequence of poorly designed learning content in some courses. Several users who took the role of course creators were not familiar with the principles of instructional design and did not apply any of them. Although these users can contribute to the usability evaluation of the teacher's interface, especially when developing applications for informal learning, their courses may have major flaws related to content quality. On the other hand, the study shows that students who evaluate the interface in the learning process sometimes have trouble identifying which problems are related to usability and which to content quality. This limitation can be overcome by excluding poorly designed courses from the second part of the study, which requires evaluation of the developed courses by instructional design experts prior to the usability study of the student's interface. Another solution is to engage only teachers or instructional designers to create the online courses in the first phase of the study. Both options support the initial idea of EMasters being available to all, meaning that everyone can create a course for an interested audience on a topic of their own choice. Both solutions also ensure that the high usability level of the system will reinforce the processes of content creation and learning.

7	Conclusion
The paper presents an iterative process of user-centred design through the case of the design and development of a simple learning CMS called EMasters. The process began with rapid prototyping and pilot testing of the first design. The second design was made according to the users' feedback, and implementation followed. To evaluate the developed system, a comprehensive usability study was carried out. The evaluation method combined several well-known and reliable techniques of usability testing and usability inquiry. The quantitative results show that the usability score reached a good level, while qualitative user feedback revealed several usability problems as well as other issues related to content quality. The outcomes of the study provide guidelines for improvements that can be implemented in the next iteration of UCD for EMasters.
According to the obtained results, the evaluation framework was confirmed to be suitable for the iterative process of user-centred design of CMSs. The study shows that usability evaluation can be successfully applied in the design process in a quick and cost-effective manner. In addition, the framework can be further adjusted and refined to fulfil the requirements of specific CMSs developed for the purpose of learning, thus contributing to the work of researchers and practitioners in the field of design and development of learning CMSs.