Creating a Test Plan for Elearning Development

“If you fail to plan, you are planning to fail” is an old adage that applies to quality assurance (QA) in elearning development. For many elearning projects, QA amounts to testing the courseware after it’s developed, with little forethought given to how the testing will be conducted. Consequently, these projects often release courseware replete with quality problems. Last week, I presented an approach to conducting QA throughout the courseware development project. However, the complexity of this approach means it won’t be effective without planning. This week, I’ll discuss creating a test plan for this approach.

Because QA begins early in the project—usually in the design phase—the test plan should be created as early as possible. The development team is the target audience for the plan, with the project manager and the QA team being its most critical readers. The plan prepares the project manager to line up resources with the required competencies to do testing at the right time. It documents the following elements (a minimal sketch follows the list):

  • Quality controls (QC) to be performed during the project
  • Resources to be applied in QA
  • Schedule for each QC
  • Test scripts
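
For illustration, here is a minimal sketch of how these elements might be captured in a structured form. The field names, dates, and QC details are placeholders I’ve assumed for the example, not a prescribed format.

    # Illustrative test plan skeleton (all names and dates are placeholders).
    from dataclasses import dataclass, field
    from datetime import date
    from typing import List, Optional

    @dataclass
    class QualityControl:
        name: str                          # e.g., "Unit testing"
        resources: List[str]               # people and tools assigned to this QC
        scheduled_start: date              # when this QC is planned to begin
        test_script: Optional[str] = None  # reference to a test script, if one is needed

    @dataclass
    class TestPlan:
        project: str
        quality_controls: List[QualityControl] = field(default_factory=list)

    plan = TestPlan(project="Example course")
    plan.quality_controls.append(
        QualityControl(
            name="Proofreading and editing",
            resources=["Editor", "SME (terminology review)"],
            scheduled_start=date(2024, 3, 1),
            test_script=None,  # no script needed for this QC
        )
    )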

QCs

Let’s take a close look at how to document each of these elements for the various QCs the elearning will undergo.

Proofreading and editing

I discussed six QCs to perform during development in last week’s article. The first was proofreading and editing. For this QC, members of the typical QA team are unlikely to be the best resources to apply to the task. You’ll instead need people who are very competent with spelling and grammar and have strong attention to detail. For highly technical courseware, you might also need a subject matter expert (SME) for their knowledge of industry-specific terminology.

Although it’s not proofreading and editing in the strictest sense, this is a good point at which to have SMEs review the storyboards to validate the accuracy of the content. Ideally, most of the proofreading and editing can be performed near the end of the design phase so it can be done in Microsoft Word. An editor will be challenged to edit within a development tool because they are unlikely to be proficient with it. However, some minimal editing will usually still be needed in later stages of the project. In the test plan, document at what point the bulk of the proofreading and editing will occur and who will do it.

Prototyping

Prototyping will require developers to create the prototype. They need to be proficient with a development tool to produce partially functional courseware and have experience with user interface design. Don’t forget to include the customer as a resource in the test plan because they will need to review the prototype and provide feedback for this QC to be effective. Prototyping usually occurs near the end of the design phase and the beginning of development. At this point in the project, test scripts are not yet needed.

Unit testing

This QC is the point at which what is traditionally thought of as the QA team gets involved in the project. Although some technical proficiency is typically required for effective unit testing, the QA team probably doesn’t need to be developers. Be sure to also document the non-human resources required for unit testing in the test plan. For example, if the unit is a sharable content object, this is the first QC that will apply a SCORM Conformance Test Suite. The test plan also specifies any other software (e.g., a screen reader for Section 508-compliant courseware) or hardware resources that will be used. The courseware’s system requirements spell out which browsers and devices will be supported, so unit testing should be performed on all of those platforms. This could include hardware such as a Macintosh and/or PC, a tablet and/or smartphone (specify whether they’re iOS or Android), a sound card for audio, and any lab equipment with which the unit interfaces.
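
As a rough sketch, the platform coverage for unit testing could be recorded in the test plan as a simple matrix. The operating systems, browsers, and tools below are placeholders; the real list should come from the courseware’s documented system requirements.

    # Illustrative unit-testing coverage matrix (platform names are placeholders).
    from itertools import product

    operating_systems = ["Windows", "macOS", "iOS", "Android"]
    browsers = ["Chrome", "Firefox", "Safari", "Edge"]
    other_resources = ["SCORM Conformance Test Suite", "Screen reader (Section 508)"]

    # Each unit should pass on every supported operating system/browser pairing.
    for os_name, browser in product(operating_systems, browsers):
        print(f"Unit test pass required: {os_name} / {browser}")

    print("Additional software resources:", ", ".join(other_resources))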

Unit testing usually occurs throughout the later portion of the development phase as each unit is completed. In some cases, this is a good point to make another editorial review of the content to ensure no misspellings or grammar errors were introduced by the courseware developers. This is also the first point at which test scripts are required. However, because test scripts are so critical to effective QA, I’ll elaborate on them at length at the end of this article.

Alpha testing

Alpha testing can usually be performed by the same QA team that did the unit testing because it requires similar competencies. However, SMEs might again get involved to make one last review of the content and ensure it has retained its accuracy through development since they last reviewed it at the storyboard stage. For SCORM-conformant courseware, a SCORM Conformance Test Suite will again be needed. Be sure to document any lab equipment required if the curriculum includes labs performed outside the courseware. Alpha testing is performed at the very end of development or, if the courseware must be published to an LMS to perform the testing, at the beginning of the implementation phase. This QC also requires a test script.

System testing

This is the final QC in which the QA team is involved. An LMS is another resource that will probably be needed, so the test plan would also need to identify an LMS administrator to participate in testing. Document any lab equipment required for curricula that include labs. You might also need an online assessment testing engine or a certificate generator if these features in the LMS do not meet your requirements. System testing is performed at the end of the implementation phase. This is the final QC that needs a test script.

Beta testing

You should use people from the curriculum’s target audience for a pilot or beta test. Beta testing also requires a questionnaire, so an instructional designer will be needed to create it. Beta testing occurs at the beginning of what is thought of as the evaluation phase, when that is a distinct development phase (as opposed to evaluation occurring throughout development, as in the ADDIE model I presented last week), or when the curriculum is first put into production. However, if the beta test finds that significant modifications are required, you will need to iterate back through many of the aforementioned QCs as regression testing, focusing on the modified components, before releasing the curriculum to actual learners.

Test scripts

In most elearning development projects, the QA team is directed to the courseware and asked to review it and report back their findings. However, this review is typically unstructured: the QA team is given vague and ambiguous instructions on how to review the courseware and what to report back. Consequently, their efforts are often ineffective at improving the quality of the courseware and duplicate the work of other members of the team. The best way to avoid this is to document specific test scripts as part of the test plan.

The test script includes the following elements (a sketch of one possible structure follows the list):

  • Detailed step-by-step instructions on how to perform each QC
  • A comprehensive list of all components to be tested in each QC
  • Specific criteria that constitute a success or a fail for each component
  • A form to report the findings of each test in detail
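
To make the structure concrete, here is one possible way to represent a single test-script entry. The fields mirror the four elements above; the names and sample content are assumptions for illustration only.

    # Illustrative test-script entry (field names and content are placeholders).
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class TestScriptItem:
        component: str          # what is being tested (e.g., a video or hyperlink)
        steps: List[str]        # step-by-step instructions for the tester
        success_criteria: str   # what constitutes a success for this component
        result: str = ""        # "success" or "fail", recorded by the tester
        notes: str = ""         # brief tester notes for this component

    video_item = TestScriptItem(
        component="Introduction video, page 1.2",
        steps=[
            "Open page 1.2 in the browser under test.",
            "Confirm the video begins playing automatically.",
            "Confirm the audio is audible and in sync with the video.",
        ],
        success_criteria="Video plays on page load with synchronized audio.",
    )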

The step-by-step instructions are what provide the control in “quality control.” Absent unambiguous instructions, you can’t be sure what the testers are testing and how they are testing it. The QA team often has little involvement in the instructional design (ID) and development, so they are sometimes unaware of what is important to test. Furthermore, the instructions (as well as the other elements of the test script) reduce the frustration of the QA team.

In Validating Curriculum Documentation for Quality Assurance, I alluded to the value of the ID documentation to the test plan. Refer to the ID documentation when creating the test script to ensure you test every functional specification. Some components might be in the content, such as a video, interactivity, image, or hyperlink, and others could be external to the content, such as navigation controls, a glossary, or supplementary PDF resources. For SCORM-conformant courseware, include the elements to review in the SCORM Conformance Test Suite. When conducting system testing, some components to test are not even part of the courseware, such as learning management services and lab components.

The SCORM Conformance Test Suite reports success or failure based on the SCORM specification. For all other components, the test script specifies the criteria against which to test them, based on the courseware’s ID documentation and requirements. For example, a video playing when its page is displayed qualifies as a success, while a broken hyperlink qualifies as a fail. Every component listed should have a corresponding success/fail criterion documented in the test script.

The form for the tester to report their findings comes directly from the previous two elements. It lists every component identified in the script. Adjacent to each component, place a success box and a fail box for the tester to check according to their findings. Also place a field for brief notes next to each component. Include a large text box at the end of the form for general comments not specific to a single component. Then distribute the form, along with the step-by-step instructions for each QC they will perform, to the QA team so they can test with consistency.
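
Continuing the sketch above, the reporting form can be generated directly from the script’s component list so every tester receives the same checklist. The components and layout here are illustrative only.

    # Render a plain-text findings form from a component list (illustrative only).
    components = [
        "Introduction video, page 1.2",
        "Glossary link in the course menu",
        "Knowledge check, module 3",
    ]

    lines = ["QA FINDINGS FORM", ""]
    for component in components:
        lines.append(f"Component: {component}")
        lines.append("  [ ] Success    [ ] Fail")
        lines.append("  Notes: ________________________________________")
        lines.append("")
    lines.append("General comments (not specific to a single component):")
    lines.append("_" * 60)

    print("\n".join(lines))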

This article only covered how to document the test plan. Refer to End-to-End Courseware Quality Assurance for the details of the specific activities to document in the plan. It’s a significant effort to create a test plan but your hard work will pay off in high-quality elearning.
