Automated System Testing for a Learning Management System

Lukas Krisper, Markus Ebner, Martin Ebner


Over the last years, software development life-cycles have been continuously shortened and new releases are deployed at an increasingly frequent pace. To ensure the quality of those releases, a strong shift towards automated testing at all testing levels has become noticeable throughout the software development industry. At the system testing level, the scope of testing is the developed product as a whole, tested in a test environment that closely resembles the production system. Because of this system-wide scope and the many potential sources of failure, implementing automated tests at this level is challenging. Exhaustive testing is neither feasible nor maintainable; properly designed test cases that cover important functionality are therefore essential. Due to increasingly strict laws and regulations on data protection and data privacy, proper management of the test data used in automated testing is equally important. This paper discusses how automated system tests for TeachCenter 3.0, Graz University of Technology's learning management system, were implemented.
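The abstract's point about managing test data under data-protection regulations can be illustrated with a minimal sketch: before seeding a system-test environment, personal fields in records are replaced with deterministic pseudonyms, so fixtures stay stable across test runs while containing no real personal data. All names and fields below are hypothetical; the paper's actual test-data tooling is not described in this abstract.

```python
import hashlib

def pseudonymize(record, fields=("name", "email")):
    """Replace personal fields with deterministic pseudonyms.

    Hashing (rather than random values) keeps the test data stable
    across runs, so automated system tests remain reproducible.
    """
    out = dict(record)
    for field in fields:
        if field in out:
            digest = hashlib.sha256(out[field].encode("utf-8")).hexdigest()[:8]
            out[field] = f"user_{digest}"
    return out

# Hypothetical student record used to seed the test environment
student = {"name": "Jane Doe", "email": "jane@example.org", "course": "SE101"}
print(pseudonymize(student))
```

Non-personal fields (here, `course`) are left untouched, so test cases that depend on them keep working against the pseudonymized fixtures.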


automation; system testing; regression; learning management system; test data; test cases

Copyright (c) 2020 Lukas Krisper, Markus Ebner, Martin Ebner

International Journal of Emerging Technologies in Learning (iJET) – eISSN: 1863-0383