Presently I am wrapping up eLearning programs we developed for both NASA and the United States Citizenship and Immigration Services. NASA's "No FEAR" eLearning program is mandatory, agency-wide training; NASA anticipates that up to 18,000 employees will take it. The program informs employees of the various civil rights and protections they have while working for NASA.
While we have extensive experience with SCORM, 508 compliance, and Learning Management System (LMS) integration, we always seem to learn something new once the eLearning program we develop is actually tested inside the client's LMS to ensure interoperability.
In this case, our eLearning program was developed using Adobe Presenter instead of Flash. Adobe Presenter basically lets non-programmers, such as one of our instructional designers, build the content. Presenter does a good job of pulling in video, audio narration, Flash animation, and the various pre-built quiz questions we design for the program. By simply checking a few boxes, Presenter will automatically SCORM-package the program and create all the necessary files, including a manifest, so that the program will conveniently handshake with any LMS. Well . . . not really. Anybody who has done a lot of LMS integration soon comes to realize that the promise of SCORM to provide a one-time, seamless integration is pie in the sky. Across all the LMS integrations in which we have followed the proper SCORM tagging conventions, I cannot recall a single time the client's IT guy sent back an email saying, "No problem, your eLearning passed our SCORM testing on the first try with flying colors!"
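For readers who haven't looked under the hood, the "handshake" SCORM promises boils down to the content finding a JavaScript object named API that the LMS exposes on an ancestor frame. A minimal sketch of that discovery routine follows; the simulated frame objects here stand in for the browser's real window tree, and the details of any particular authoring tool's generated wrapper will differ:

```javascript
// Sketch of how a SCORM 1.2 SCO locates the LMS-provided runtime:
// the LMS exposes an object named "API" on some ancestor frame, and
// the content walks up the parent chain until it finds it.
function findAPI(win) {
  let attempts = 0;
  while (win.API == null && win.parent != null && win.parent !== win) {
    if (++attempts > 7) return null; // common convention: give up after a few hops
    win = win.parent;
  }
  return win.API || null;
}

// Simulated frame hierarchy standing in for the browser's window tree.
const lmsFrame = { API: { LMSInitialize: () => "true" } };
lmsFrame.parent = lmsFrame;               // the top frame points to itself
const courseFrame = { parent: lmsFrame }; // the course runs in a child frame

const api = findAPI(courseFrame);
api.LMSInitialize(""); // returns "true" — the session is open
```

Once the API object is found, everything else (completion status, scores, bookmarks) is just LMSSetValue/LMSGetValue calls against it, which is why a broken handshake takes the whole course down with it.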
What we have come to realize is that government agencies tend to have older legacy LMSs in which the original code has been altered and customized, and this can cause integration problems. In NASA's case, their LMS is called SATERN and is a derivative of a long-standing enterprise LMS called Plateau. NASA's IT tests revealed that Firefox was not bookmarking or showing course completion records properly, while Internet Explorer worked fine.
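The bookmarking and completion records that misbehaved in Firefox ride on two SCORM 1.2 data-model fields, cmi.core.lesson_location and cmi.core.lesson_status. A rough sketch of how a course typically reads and writes them, using a stub in place of the LMS-provided API object (the real one is injected by SATERN/Plateau):

```javascript
// Stub standing in for the LMS-provided SCORM 1.2 runtime object.
const API = {
  data: { "cmi.core.lesson_location": "" },
  LMSSetValue(key, value) { this.data[key] = value; return "true"; },
  LMSGetValue(key) { return this.data[key] ?? ""; },
};

// On every slide change, the course saves a bookmark...
function saveBookmark(slideId) {
  API.LMSSetValue("cmi.core.lesson_location", String(slideId));
}

// ...and on relaunch, it resumes from that bookmark (or slide 1).
function resumeSlide() {
  const loc = API.LMSGetValue("cmi.core.lesson_location");
  return loc === "" ? 1 : Number(loc);
}

saveBookmark(12);
resumeSlide(); // returns 12

// Completion is reported the same way:
API.LMSSetValue("cmi.core.lesson_status", "completed");
```

When a browser "isn't bookmarking," it is usually these calls, or the commit that persists them, silently failing somewhere in the chain between content, wrapper, and LMS.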
To fix the problem, NASA provided us with custom JavaScript code we had to insert into the program's HTML wrapper. Meanwhile, a 508 accessibility test revealed that the JAWS screen reader, when used in conjunction with Internet Explorer, was collapsing the default area in Presenter reserved for screen-readable transcripts. Firefox, however, did not encounter the same problem. As a workaround, we programmed a special screen-readable prompt at the beginning of the program that directs the 508 user to press the F key on their keyboard to re-launch the text transcript.
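Our F-key workaround amounted to a small keyboard handler. The sketch below is illustrative only: makeTranscriptHotkey and the showTranscript callback are hypothetical names, and Presenter's real internals differ.

```javascript
// Hypothetical sketch of the keyboard workaround: returns a keydown
// handler that re-launches the transcript panel when the user presses "F".
// The showTranscript callback is an assumption, not Presenter's actual API.
function makeTranscriptHotkey(showTranscript) {
  return function onKeyDown(event) {
    if (event.key === "f" || event.key === "F") {
      showTranscript();
      return true;  // handled
    }
    return false;   // let other keys pass through
  };
}

// In the browser this would be wired up roughly as:
//   document.addEventListener("keydown", makeTranscriptHotkey(openPanel));
let opened = false;
const handler = makeTranscriptHotkey(() => { opened = true; });
handler({ key: "F" }); // opened is now true
```

The key accessibility detail is the audible prompt at the start of the course: a hotkey is useless to a JAWS user who was never told it exists.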
To avoid a lot of headaches and delays down the road, we strongly recommend that, if you are responsible for SCORM-packaging your eLearning program for acceptance into your client's LMS, you always perform an LMS integration test at the start of the project. Waiting until the content has been approved at the very end to start LMS testing can severely delay getting the program deployed, and it will come back to bite you.
So, if you are using tools like Articulate Presenter, Adobe Presenter, Captivate, or Flash, initiate a SCO test module even before the content is developed. Basically, what you are trying to ascertain in advance is the full list of bugs and fixes required for your eLearning program to pass a SCORM LMS test. You may have to go back and forth with the client's IT department a few hundred times (okay, I exaggerate a little) before the program passes, so beware: testing can drag on. It may take weeks of sending new code back and forth, responding to emails, and holding multiple phone conferences with your client's IT department before your issues are firmly resolved.
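A SCO test module doesn't need any real content: something that merely initializes, reports completion, and finishes is enough to exercise the full LMS round trip. A minimal sketch, again with a stub standing in for the LMS runtime that the real LMS would inject:

```javascript
// Stub LMS runtime; in a deployed test SCO these methods are provided
// by the LMS and the calls below are the entire "course."
const API = {
  data: {},
  LMSInitialize() { this.started = true; return "true"; },
  LMSSetValue(key, value) { this.data[key] = value; return "true"; },
  LMSCommit() { return "true"; },
  LMSFinish() { this.finished = true; return "true"; },
};

// The complete smoke-test sequence:
API.LMSInitialize("");                                   // open the session
API.LMSSetValue("cmi.core.lesson_status", "completed");  // report completion
API.LMSCommit("");                                       // ask the LMS to persist
API.LMSFinish("");                                       // close the session
```

If this bare-bones sequence fails to register a completion in the client's LMS, you have found your integration bugs months before anyone is waiting on the finished course.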
By performing your LMS tests early on, you will know how to properly code the final program once the course is approved. Otherwise, if you leave testing to the very end (which feels like the logical step), the likelihood is that your client will be taking a lot of heat from their stakeholders who bellow, “Why isn’t this course out to the field yet? What’s the hold up? This is mandatory training and we are already three weeks behind delivery!”
Even though it may not directly be your fault, your stakeholders' anxious demands for a quick fix to these integration snafus will fall right on your shoulders. Conducting an LMS test well in advance will avoid these problems entirely and will help increase your client's satisfaction with your services, process, technical know-how, and ability to deliver.