Thursday, April 3, 2014

Lessons Learned: A Compliance Training Game


A LITTLE HISTORY
For the last two years I’ve been the consultant designing a game to deliver compliance training for county-government employees. This is the second year of using a game design delivered through a SCORM-based tracking system used by 28 states. We benefited greatly from lessons learned between year one and year two.

WHY COMPLIANCE TRAINING

Compliance training is the required training around topics such as safety, workplace violence, harassment, and diversity. In Learning and Development vernacular, we sometimes refer to it as “cover your organization” training, because state or federal law may define the need for it. Beyond the mandate to deliver such training, there is also liability coverage: if employees are provided with the training, a record exists in cases where employees violate those policies.

YEAR ONE

After our initial conversation with the client in year one, I proposed either a scenario-based online design or a game design. We combined the two approaches by providing scenarios that asked game participants to apply the policy to decisions about those scenarios. As the designer, I bought a game-template subscription. Though not yet fully familiar with the subscription, we approached the vendor with some changes to the features of the game we’d selected, and they provided a programmer (for a fee) to make the changes.

The next step was to ensure that participants could register, be tracked, and have completions transcripted in a tracking system. This required making the content SCORM compliant so it could connect to the tracking system used by the state of Colorado, known as CO.Train, which is used by a total of 28 states. It also required hosting the content on a server that allowed the connection to the tracking system.
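For readers curious what “SCORM compliant” means in practice, here is a minimal sketch, in TypeScript, of the SCORM 1.2 runtime calls a course like this typically makes to report status back to an LMS such as CO.Train. This is illustrative only, not our template’s actual code; the frame-search depth and the 80-percent passing threshold are assumptions for the example.

// Minimal SCORM 1.2 reporting sketch (illustrative; not the project's actual code).
interface Scorm12Api {
  LMSInitialize(arg: string): string;
  LMSSetValue(element: string, value: string): string;
  LMSCommit(arg: string): string;
  LMSFinish(arg: string): string;
}

// The LMS exposes the API object on a parent window; content searches up the frame chain.
function findApi(win: Window): Scorm12Api | null {
  let current: Window | null = win;
  for (let i = 0; current && i < 10; i++) {
    const api = (current as any).API as Scorm12Api | undefined;
    if (api) return api;
    current = current.parent !== current ? current.parent : null;
  }
  return null;
}

// Called when the game ends: record the score and mark the lesson passed or failed.
function reportCompletion(score: number): void {
  const api = findApi(window);
  if (!api) return; // running outside the LMS, e.g. during local testing

  api.LMSInitialize("");
  api.LMSSetValue("cmi.core.score.raw", String(score));
  api.LMSSetValue("cmi.core.lesson_status", score >= 80 ? "passed" : "failed");
  api.LMSCommit("");
  api.LMSFinish("");
}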


The Registration and Tracking System

In designing the content, we paid attention to how game answers and prompts reinforced the correct policies and practices for the compliance topics while the participant played. The intention was to support learner success in the game and to have participants learn the policy-defined behaviors during game play rather than separately from it.


The IT division not only hosted the content and managed the connection to the tracking system; they were also pilot participants for the game. They needed to take the compliance training themselves, and they were able to articulate potential barriers to delivery. After completing the course, they served as the Help Desk for the rest of the organization’s participants.


Over two hundred employees completed the online sessions. One finding was that almost all employees now had accounts in the course tracking system and would be familiar with logging in for future courses. Additionally, we had figured out how to use the system’s reports for other types of training transcripts.


YEAR TWO

In the initial planning session for year two, we could see how much we had learned about our process in year one, and how that learning could inform project efficiencies, both in the delivery itself and in the cost of offering it. This was evident in how easily we drafted the project tasks: at the year-two planning meeting, we were quickly able to assign names and dates to the task timeline. The meeting also covered what type of game template we might want to use.

The customers asked for deliveries that could be done both online and in face-to-face sessions, since some locations don’t have internet connections or access to many computers. We also discussed the expected level of knowledge: rather than having participants apply the knowledge, the intent of this training was to confirm that participants knew the policies.


With fuller knowledge of what was available in the game subscription, we chose similar templates to serve both populations.


The game format reduced the offering from four courses to one. In the first year, with the many programming changes and edits we had to make, we ultimately launched the project three weeks behind the projected schedule. In year two we launched on schedule, with a final test run a week before the anticipated launch.


While we still had a few bugs to work out from the test, our debrief surfaced only a few additional issues: questions about browser versions, and a launch email that some recipients ignored because it did not come from the expected sender (IT sent it rather than Training).


That said, the completion numbers were good, the satisfaction level was good, and the efficiencies improved.


LOOKING FORWARD

I’d chosen an HTML5 template for this year’s game with the idea of testing its performance across devices. I was hesitant to test it during this run, but I personally tried it on a tablet with success. When I shared this in the debrief, there was talk of using it with smartphones at a future date. Of course, the visual display would need to be tested first.