Moodle is widely recognised as a leading virtual learning environment (VLE), with a substantial user base comprising 42% of UK universities. By default, Moodle employs the legacy LAMP stack, typically hosted on a 2-tier, large-CPU architecture, though there are moves afoot to bring more Cloud-native features to the hosting environment. With the recent release of Moodle 4.0, which offers significant improvements to the user experience, and the continuous growth of the student base, organisations must ensure that their Moodle system remains up to date and performs well, both independently and in conjunction with various custom integrations like Teams, Mahara and Turnitin.
In this blog, we’ll look at why you should performance test Moodle, the risks to mitigate, and the best practices for testing.
Why should you performance test Moodle?
- Moodle plays a crucial role in delivering learning and assessments for approximately 42% of Higher Education Institutions (HEIs) in the UK.
- The release of Moodle 4.0 introduces a major overhaul of the user experience, bringing significant improvements to the platform.
- Moodle 4.0’s user experience overhaul and LAMP stack architecture will impact performance and integrations; a baseline load test should be run to understand performance against the organisation’s load requirements.
- Risks that need to be mitigated include potential application slowness, security vulnerabilities, connectivity issues, and challenges related to legacy systems during peak periods, such as enrolment, exams, and coursework submissions.
Considering the above factors, it is imperative to thoroughly test Moodle from a performance perspective (in addition to conducting functional testing) to ensure optimal performance and reliability for your students.
What are the typical risks to mitigate with performance testing?
- Connectivity and integrations: Can the system handle increasing demand and maintain seamless connectivity with downstream systems as user load escalates?
- Course feature performance: Concurrent usage scenarios such as course enrolments, assignment submissions and document downloads have the potential to strain Moodle. Are you confident that your customised instance can handle these scenarios effectively?
- On-premise vs. Cloud performance: Are you certain that migrating to a cloud solution will enhance performance? By establishing baselines and conducting performance comparisons, you can substantiate any performance improvements with metrics you can share with your students and business sponsors.
- Customised content configurations: While a hosted solution promises scalability, can you truly relax without concrete evidence that this is the case, rather than finding out from the students? Performance testing provides assurance by validating the system’s capability to handle customised content configurations.
- Unknown legacy endpoint configurations: System migrations and integrations can leave behind unused legacy integration points and APIs, which may introduce bottlenecks and hinder usage. Performance testing enables root cause analysis and helps identify and eliminate these potential bottlenecks.
- Login authentication scheme: The Moodle framework supports various authentication methods, such as Shibboleth, OAuth2, and manual accounts. The implementation depends on the university’s security needs and should be incorporated into scripting and load tests, since the size of the student base will affect the user experience. We have found varying behaviour in Moodle when authentication is integrated; a minimal login sketch follows this list.
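For teams scripting the manual-account flow, the sketch below shows the kind of login handling and response checking involved. It assumes a standard Moodle login form with a hidden logintoken field and uses a placeholder hostname; Shibboleth and OAuth2 flows differ and should be modelled against your own configuration.

```python
# Minimal sketch of a scripted Moodle manual-account login, used to check that
# the authentication flow is represented realistically in load scripts.
# Assumes a standard /login/index.php form with a hidden "logintoken" field;
# confirm paths and field names against your own instance and auth scheme.
import re
import requests

BASE_URL = "https://moodle.example.ac.uk"  # placeholder host


def login(username: str, password: str) -> requests.Session:
    session = requests.Session()
    # Fetch the login page to obtain the token Moodle embeds in the form.
    page = session.get(f"{BASE_URL}/login/index.php", timeout=30)
    page.raise_for_status()
    match = re.search(r'name="logintoken" value="([^"]+)"', page.text)
    token = match.group(1) if match else ""
    resp = session.post(
        f"{BASE_URL}/login/index.php",
        data={"username": username, "password": password, "logintoken": token},
        timeout=30,
    )
    resp.raise_for_status()
    # Simple response check: a successful login should not bounce back to the login form.
    if "login/index.php" in resp.url:
        raise RuntimeError(f"Login appears to have failed for {username}")
    return session
```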
Why is test data setup important?
Data setup plays a vital role in the planning of load tests as it significantly impacts the test outcomes. It is essential to ensure realistic data setup both within the application and the scripts used. Here are a few examples:
- Login accounts: The input data used to drive the scripts should closely resemble real-world scenarios. For instance, if the goal is to perform a load test with 5000 users, the scripts should ideally include 5000 student logins.
- Course Enrolments and Assessments: When setting up the environment, course enrolments and assessments per student should reflect realistic numbers. An unrealistically large number of allocations per student can generate heavy load on the database due to time-consuming SQL queries.
- Files for download: Depending on the specific use case, the data setup may require unique file names for downloads to replicate real-world scenarios. For example, generating feedback files from teachers for download can be a time-consuming task, and this should be factored into the planning process. Automation tools like PowerShell can be utilised to dynamically generate files of varying size (a sketch follows this list), and Moodle’s bulk upload functionality can help expedite the process.
- Logins and Logouts: The scripts should be configured to mimic real user behaviour around login and logout operations. Authentication operations, depending on the implemented scheme, can be resource intensive. For example, a typical student will maintain their logged-in session and rarely log out; during load tests, it is crucial to replicate this behaviour in the scripts.
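As a starting point for the data setup above, the following sketch illustrates generating a credentials file for the scripts to read and a set of uniquely named files of varying size. The account naming convention, password and file sizes are illustrative placeholders; align them with your environment and the format Moodle’s bulk upload expects.

```python
# Illustrative sketch for preparing load-test data: a credentials CSV and a
# set of upload/download files of varying sizes. All names and sizes are
# placeholders to adapt to your own environment.
import csv
import os
import random


def write_login_csv(path: str, count: int = 5000) -> None:
    # One row per virtual user, matching the parameterisation the scripts consume.
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["username", "password"])
        for i in range(1, count + 1):
            writer.writerow([f"loadtest.student{i:05d}", "ChangeMe123!"])


def generate_files(directory: str, count: int = 50, sizes_mb=(1, 5, 10, 50)) -> None:
    # Uniquely named files of mixed sizes, so downloads and uploads are not all
    # served from a single cached artefact during the test.
    os.makedirs(directory, exist_ok=True)
    for i in range(count):
        size = random.choice(sizes_mb) * 1024 * 1024
        with open(os.path.join(directory, f"submission_{i:04d}.bin"), "wb") as f:
            f.write(os.urandom(size))


if __name__ == "__main__":
    write_login_csv("students.csv")
    generate_files("test_files")
```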
By ensuring realistic data setup, you create a more accurate representation of the actual system usage, enabling you to uncover performance issues and optimise the system effectively.
Why is quality scripting important?
Ensuring that scripts accurately simulate real-world students during replay is a crucial foundational step before executing load tests. To achieve realistic user playback via scripts, consider the following common generic and Moodle-specific standards:
- Server Response: Verify that all server requests (except static content) have checks to ensure the response aligns with expectations.
- Debug Messages: Include log messages in the script to validate correct behaviour during replay, while maintaining a format that allows for post-test log parsing, for example messages with a fixed number of columns and a delimiter. This framework can help identify both server and scripting issues due to unhandled scenarios, especially when the system is under load.
- Custom code: Ensure the script includes custom code handling to replicate all possible branching scenarios. For instance, when simulating a user flow involving student submission uploads, consider adding code to handle file deletion before uploading it again during a subsequent attempt.
- File Downloads: Simulating student file downloads is a common user flow in Moodle. Implement appropriate checks to validate the correct file size is downloaded and handle HTTP 304 responses effectively. Failing to check file sizes during download increases the risk of measuring inaccurate response times, as the script may fail to detect server-side errors under load.
- File Uploads: Student assignment submissions are another important flow to measure. Include necessary checks in the script to validate successful file uploads. Typically, the server responds with details of the uploaded file, which can be incorporated as checks within the script.
- User Quiz: Measuring student quiz submissions is a popular user flow in load tests. Consider scenarios where an error leaves a quiz in an incomplete, unsubmitted state, so that a subsequent attempt resumes from where the student left off. Plan to handle such branching with custom code during the scripting phase itself to avoid errors during load tests.
- User Wait Times: Use dynamic ranges of think times between steps to simulate realistic student behaviour. The appropriate wait times should be agreed upon with the business. Avoid hardcoding wait times, as this increases the likelihood of multiple users hitting the same action simultaneously under load, resulting in unrealistic load generation.
- External calls: Remove any third-party static content calls from the script that are not hosted within the environment being tested.
- Client Caching: During replay, ensure that scripts simulate browser-like playback behaviour for each student. Common performance tools like LoadRunner and JMeter offer features to support this. Incorrectly configured settings may lead to unrealistic load on the server, so it’s important to verify these settings in the scripts.
- Sub Transactions: In typical web-based flows, there are multiple HTTP GET/POST calls that occur within a logical UI flow. While these calls are grouped under a logical transaction name in scripting, it can be beneficial to include sub transactions under the parent transaction to differentiate the response times of different underlying calls. For instance, you can have static content grouped as one sub transaction, while other API calls (non-HTML) that retrieve student-specific info are grouped under another sub transaction. This approach helps isolate time-consuming steps during load tests when the parent transaction exhibits high response times.
- Dynamic Transactions: For scenarios involving assignment submissions and course content downloads, it can be useful to incorporate a mechanism that measures response times based on the file size. This approach makes sense since the response time for a 200MB file submission will be higher than that of a submission involving a 10MB file. By including dynamic transaction names, you can gain more accurate insights into the performance of these actions during load testing (illustrated, together with the download checks and think times, in the sketch after this list).
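To make a few of these standards concrete, the sketch below shows randomised think times, a download check that validates the file size and handles HTTP 304 responses, and dynamic transaction naming based on payload size. It uses plain Python with the requests library purely for illustration; equivalent constructs exist in LoadRunner and JMeter, and the size buckets are assumptions to adjust for your own content.

```python
# Sketch of three scripting practices discussed above: randomised think times,
# download-size validation with HTTP 304 handling, and dynamic transaction
# names based on file size. Thresholds and timings are illustrative only.
import random
import time
import requests


def think_time(low: float = 5.0, high: float = 20.0) -> None:
    # A dynamic range rather than a hardcoded pause, so virtual users do not
    # hit the same step in lockstep.
    time.sleep(random.uniform(low, high))


def size_bucket(num_bytes: int) -> str:
    # Group response-time measurements by payload size (buckets are illustrative).
    mb = num_bytes / (1024 * 1024)
    if mb < 1:
        return "lt_1MB"
    if mb < 10:
        return "1_to_10MB"
    return "gt_10MB"


def download_and_check(session: requests.Session, url: str, expected_bytes: int) -> str:
    start = time.monotonic()
    resp = session.get(url, timeout=120)
    elapsed = time.monotonic() - start
    if resp.status_code == 304:
        # Content served from cache: record separately rather than as a full download.
        return f"Download_NotModified ({elapsed:.2f}s)"
    resp.raise_for_status()
    if len(resp.content) != expected_bytes:
        # A wrong size often means an error page came back instead of the file.
        raise RuntimeError(
            f"Unexpected size for {url}: got {len(resp.content)}, expected {expected_bytes}")
    # Dynamic transaction name, so a 200MB download is not compared with a 1MB one.
    return f"Download_{size_bucket(len(resp.content))} ({elapsed:.2f}s)"
```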
By adhering to these standards, you can enhance the realism and accuracy of your script-based user playback, leading to more reliable and insightful load testing results for your Moodle system.
What tests should you run and why?
Below are some recommended standard load tests:
- Benchmarking and comparison: To understand ROI and validate the success of your Cloud migration or upgrade, as well as to benchmark performance for future upgrades.
- Expected load for a business-normal day: To measure typical daily traffic and understand what can cause your instance’s performance to degrade.
- Load at peak times: Simulate scenarios such as login bursts, time-bound quizzes, and deadline-driven submissions to ensure optimal user experience during peak usage periods.
- Soak tests for 24-48 hours: Perform extended duration tests to identify and rule out any memory leak-related issues that may occur during sustained system usage.
Note that each university will have a unique load profile influenced by factors such as student enrolment, the start of the semester, and curriculum demands. Therefore, you should create specific load profiles based on anticipated load requirements and historical usage patterns.
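As an illustration, a peak-period load profile might be expressed as a simple staged ramp like the sketch below. The durations and user counts are placeholders only; real figures should come from your enrolment numbers and historical usage data.

```python
# Illustrative load profile for a peak-period test, e.g. a login burst at the
# start of a timed quiz. Stage durations and user counts are placeholders.
PEAK_QUIZ_PROFILE = [
    # (duration_seconds, concurrent_users)
    (300, 500),     # ramp-up: students logging in ahead of the quiz window
    (600, 3000),    # burst: quiz opens, the majority of the cohort arrives
    (1800, 3000),   # steady state: quiz in progress
    (600, 1500),    # submissions and drop-off as the deadline passes
    (300, 0),       # ramp-down
]


def total_duration_minutes(profile) -> float:
    return sum(duration for duration, _ in profile) / 60


if __name__ == "__main__":
    for duration, users in PEAK_QUIZ_PROFILE:
        print(f"{duration:>5}s @ {users} users")
    print(f"Total: {total_duration_minutes(PEAK_QUIZ_PROFILE):.0f} minutes")
```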
By conducting these load tests, you can gain valuable insights into system performance, identify potential issues, and ensure a seamless user experience for students and other stakeholders.
What typical issues have our engineers found with Moodle?
Based on our extensive experience working with leading UK universities, we have encountered several common issues. Here are some general examples:
AUTHENTICATION
- Authentication issues related to active thread management that required tuning.
- High login latency in secure authentication schemes such as Shibboleth, where incorrect configuration resulted in access issues.
EXTERNAL CALLS
- Slow calls to external services for retrieving assets, leading to performance impacts.
- Discovery of unknown legacy endpoints still being referenced, introducing unnecessary overhead in the system and causing strain on the third-party system.
WEB SERVER
- Non-performant customised PHP code contributing to high latency and an unexpected performance impact on students.
- The web server (Apache) struggling to handle the workload at its max-workers configuration, with Moodle/Apache failing when attempting to connect to the database.
DATABASE
- Database server overload and exceeding maximum connection thresholds.
- Identification of a significant number of slow queries causing connection pool and CPU saturation, highlighting the need for optimisation.
- Differences in how the application calls the database having an unexpected impact, along with other platform-related issues.
OTHER AREAS
- Inflexible architecture that hampers adaptability and requires significant effort for configuration changes.
- Incorrect quiz configuration leading to unexpected errors and high response times, impacting assessments. For instance, determining whether it is better to place all questions on the same page or across separate pages, based on the number of questions per quiz and the number of enrolled students.
These examples demonstrate the importance of addressing various aspects, such as authentication, external calls, web server performance, database optimisation, and other areas, to ensure a smooth and efficient Moodle system for universities.
What does a typical Moodle test project look like in terms of timing and manpower?
- Plan around 6 to 8 weeks to deliver the project, with 1 week for planning/discovery, up to 2 weeks for scripting and 2 to 3 weeks for testing/retesting.
- Ensure that load, stress, and soak tests are conducted as part of the testing and retesting process to comprehensively evaluate system performance.
- Engage with infrastructure platform providers (Co Sector, Catalyst and others) to secure their co-operation and buy-in to the independent test process.
- Engage a performance engineering expert who can effectively model load requirements, script user flows, execute and analyse test outputs, and collaborate closely with the IaaS vendor on application monitoring and tuning during performance runs.
By adhering to this approach, you can ensure the project timeline is adequately planned, necessary manpower is allocated, comprehensive testing is conducted, and expert guidance is available to optimise the performance of your Moodle system.
Why engage Infuse for Moodle testing?
- Infuse brings a wealth of experience in successfully delivering projects for some of the UK’s top universities, ensuring that your Moodle testing is in capable hands.
- We have identified many of the risks and have pre-built configurations and tests which enable us to deliver Moodle load testing in 3 weeks, ensuring that your students get the best experience from your VLE. Typically, this would take you twice as long.
- Benefit from competitive licensing prices through our valued partnership with Microfocus, allowing you to optimise your budget while obtaining top-quality testing services. This enables us to execute tests on scalable global infrastructure at a similar price to setting up your own open-source tooling.
- To learn how we helped King’s College London, Queen Mary, University of London, and University of Exeter with their Moodle integration to the Cloud, please click here to view our webinar.
In conclusion, performance testing for Moodle is of utmost importance to ensure optimal system performance and a positive user experience. By understanding the risks and adhering to best practices in performance testing, organisations can effectively mitigate them, identify and resolve issues, and ensure that their Moodle system is finely tuned for optimal performance.
To further assist you, we have also produced a simple and informative infographic on Moodle Performance Testing which you can click here to view or download.
If you would like to discuss your specific needs and requirements with one of our knowledgeable performance architects, we offer a complimentary 15-30 minute consultation. Please click here and book a quick call with us and explore how Infuse can assist you in achieving your performance testing goals for Moodle.