Corporate Learning - Methodology

ADDIE

When do we use it?

This is the method we follow when requirements are clear (or can be clarified quickly through a fast front-end analysis). Here:

  • performance on the job closely mimics the training situation (near transfer)
  • the processes involve simple rather than complex coordination
  • the on-the-job performance relies mainly on learned facts and concepts

Analysis

At KQ, we break the Analysis phase into Performance Analysis (PA) and Training Needs Assessment (TNA) stages. Front-end analysis of the data obtained is an important, concurrent part of this phase and continues into the next phase, Design.

Performance Analysis

Performance Analysis answers how best business goals can be met and whether training is part of the solution. It can therefore address both performance opportunities and performance problems. For example, it may be carried out by a team that has redesigned a process and now wants to align people, systems and technology with the new process.

Frequently, consultants and implementers forget what Thomas Gilbert showed in his Behaviour Engineering Model (Gilbert, 1978): process, team and individual performance problems (and opportunities) arise for many reasons. These include environmental causes (information, resources and incentives) and individual causes (capacity, motivation and knowledge). Only the problems (or opportunities) caused by the last factor, knowledge, can be solved by training (L&D). When Performance Analysis is carried out by a cross-functional team, the opportunities afforded by both environmental and individual factors emerge clearly.
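
As an illustration only (the type and field names below are ours, not Gilbert's), a Performance Analysis team might tag each finding against the model's six cells and filter out the ones training can actually address:

```typescript
// Illustrative sketch: tagging Performance Analysis findings against
// Gilbert's Behaviour Engineering Model (names are ours, not Gilbert's).

type EnvironmentalCause = "information" | "resources" | "incentives";
type IndividualCause = "capacity" | "motivation" | "knowledge";
type Cause = EnvironmentalCause | IndividualCause;

interface Finding {
  description: string;
  cause: Cause;
}

// Only knowledge gaps are candidates for a training (L&D) solution.
function isTrainingSolvable(finding: Finding): boolean {
  return finding.cause === "knowledge";
}

const findings: Finding[] = [
  { description: "No job aids at the check-in counter", cause: "resources" },
  { description: "Agents unaware of the new refund rules", cause: "knowledge" },
];

const trainingScope = findings.filter(isTrainingSolvable);
console.log(trainingScope.map(f => f.description));
```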

Performance Analysis is often skipped because there is almost never enough time: everyone (client, consultants, implementing teams, etc.) wants to get straight into preparing training material. That may satisfy legal and regulatory requirements, but it may not solve the performance problems or opportunities that prompted the training in the first place. This is a mistake. The largest opportunity for focusing resources in a well-targeted manner occurs only at this stage, and what makes skipping it worse is that clients do not even realise an opportunity was lost until later, when the anticipated results fail to show up. Performance Analysis provides the only opportunity for doing the right things.

Training Needs Assessment (TNA)

From the TNA stage onwards, we dive deeper. TNA focuses on determining (a) the current state, (b) the desired state, and (c) the type of business issue the need arises from.

The most important answers from TNA include:

  • Who needs to be trained?
  • On what content/ domain areas is the training required?
  • How is the training to be delivered?
  • What would constitute success from the training perspective?

Here we sacrifice breadth for depth, since we now move into the realm of training and instruction and focus on the details required to produce a training program. The program will need to produce trained personnel who achieve individual, team and organisational goals, so significant detailing is required at this stage. The outputs from this phase define the performer objectives, the learning architecture (and associated methodologies), and the tools of training appropriate to the goals, based on learner profiling, instructional approaches, methods and media. The artifacts from this phase form the basis for the stages that follow.

We focus considerable effort on this front-end work (the two stages of Performance Analysis followed by Training Needs Assessment) because it is often the critical component that decides whether business needs will be met. If it is done poorly, designers and developers create material that trains personnel on things which make no difference and is divorced from the business itself. At times, this front-end phase is done internally on the client side by very competent performance analysts/ technologists, who then hand off the results of their study (artifacts) to the designers and developers who build the training system.

We are comfortable with either approach – i.e. doing the front-end analysis work ourselves, or using the work products of an internal team appointed by the client.

Front End Analysis - Artifacts

The analysis of the data produced during the Analysis stage constitutes Front End Analysis. The artifacts it produces include information on the learners, the environmental considerations, the tasks and jobs involved, the delivery technology, the framework for an objectives hierarchy, the cross-functional solution set, the delivery and evaluation plans, the framework for ROI calculations, and so on.
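
Purely as a sketch (the field names below are ours, not a KQ standard), the artifact set can be thought of as a checklist record handed from analysts to designers:

```typescript
// Illustrative checklist of front-end analysis artifacts; each field
// would reference the corresponding document. Names are assumptions.

interface FrontEndArtifacts {
  learnerProfile: string;            // who the learners are
  environment: string;               // workplace considerations
  tasksAndJobs: string;              // tasks and jobs involved
  deliveryTechnology: string;        // LMS, classroom, blended, ...
  objectivesHierarchy: string;       // framework for the objectives hierarchy
  crossFunctionalSolutions: string;  // non-training fixes identified
  evaluationPlan: string;            // how results will be measured
  roiFramework: string;              // basis for ROI calculations
}
```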

Design

We follow the Analysis phase with a Design phase, which starts with the Task Analysis stage and proceeds through the further stages of

  • deriving the learning outcomes,
  • developing the content outlines, and finally
  • assessing the outcomes.

Once the Training Needs Assessment from the Analysis phase is complete, there is considerable clarity on the type of instruction (modes, methods, architecture) to be used. Often there will be a mix of methods and architectures, since different performance objectives may call for different methodologies. Each type of instruction calls for specific types of learning material and varying treatments, because different learner behaviours are required.

At this stage we use interviews and questionnaires, read the existing documentation, analyse flagged quality issues, carry out Pareto analyses, and so on. The end result is the knowledge and skill requirements for the job: taken directly if the job already exists, or derived from a job specification built for a new role. Learner profiling gives us the present knowledge and skills. The gap between the required and the present knowledge and skills then determines what needs to be taught (the content).
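
A minimal sketch of that gap step, assuming skills can be listed as simple labels (the skill names and the set-difference approach are ours, purely for illustration):

```typescript
// Minimal sketch of the gap step: content = required skills that the
// learner group does not yet have. All skill names are made up.

type Skill = string;

function contentGap(required: Set<Skill>, present: Set<Skill>): Skill[] {
  return [...required].filter(skill => !present.has(skill));
}

const required = new Set(["fare rules", "refund workflow", "GDS basics"]);
const present = new Set(["GDS basics"]);

console.log(contentGap(required, present));
// -> ["fare rules", "refund workflow"]
```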

Once this is done, we draw up (using an exemplary performer as a standard, if one is available) how the material is to be taught (the sequence outline), the methods (case study, simulation, drill, practice, etc.) and, most importantly, the level of performance required from our learners (the objectives). Agreement is reached with the client on evaluation and assessment (metrics derivation): we plan how the learners are to be tested and what constitutes competent performance. We then draw up the assessment schema, which gives us a clear picture of the what, when and how of the instruction that will produce the required learning outcomes.

Design Analysis - Artifacts

The analysis of the data obtained provides information on the schedule, the project team, the media specifications, the lesson structure, the program flow, the lesson flow, versioning, risk mitigation, and the detailed evaluation plan, corresponding to the agreed metrics.

Depending on whether SCORM/ AICC/ IMS/ IEEE compliance/ compatibility is required, additional elements are prepared. Once the documents are approved, we move to the next stage, viz. Development and Implementation.

Development and Implementation

The Development phase consists of developing the actual instructional material that instructors will use and learners will interact with; the courseware is now actually built. E-learning storyboards are prepared (incorporating the various scenarios/ case studies/ situations/ linear content); video (if required) is shot, edited and logged; audio is recorded, edited and logged; graphics are created, edited and logged. Initial versions of web pages are developed, tested and reviewed. Practice exercises are developed, content is formatted, interactivities are programmed, and instructor and student workbooks are written. Assessments are incorporated, and the assessment engine is linked to deliver results through an appropriate API to the client's LMS.
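
To make the last step concrete: when the client's LMS is SCORM 1.2 compliant, results travel through the standard runtime calls sketched below. LMSInitialize, LMSSetValue, LMSCommit and LMSFinish are part of the SCORM 1.2 specification; the findAPI helper and the score values are assumptions for this sketch.

```typescript
// Sketch of reporting an assessment result to a SCORM 1.2 LMS.
// The four LMS* calls are standard SCORM 1.2 runtime functions;
// everything else here is illustrative.

interface Scorm12API {
  LMSInitialize(param: ""): string;
  LMSSetValue(element: string, value: string): string;
  LMSCommit(param: ""): string;
  LMSFinish(param: ""): string;
}

declare function findAPI(): Scorm12API; // assumed helper: locates the LMS API adapter

const api = findAPI();
api.LMSInitialize("");
api.LMSSetValue("cmi.core.score.raw", "85");         // learner's score (example value)
api.LMSSetValue("cmi.core.lesson_status", "passed"); // completion status
api.LMSCommit("");                                   // persist the data
api.LMSFinish("");                                   // end the session
```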

The outputs can be fully digital or a combination of print and digital media. All the artifacts of this phase are reviewed by the client, and the changes sought are addressed. This may be followed by a pilot program in which the courseware is tested on a live course and revisions are carried out again.

An important aspect of development at KQ is the implementation of review cycles. We follow a laid-down procedure for this, in which each reviewer clearly understands what to review and how to review it. Records of approvals, decisions and changes are scrupulously maintained. Each media element is integrated and coordinated, and the project's unique production and implementation requirements are managed.

Evaluation

When we speak of evaluation, we mean two different things. One is the evaluation of program results, which corresponds to what is popularly known as Level 1 to Level 5 evaluation. The other is the evaluation of the Analysis, Design, Development and Implementation phases themselves. The Evaluation phase is linked to all stages and determines the quality of the training program. Though it is traditionally shown as a separate stage/ phase, our ISD methodology incorporates it throughout the ISD process: the artifacts produced at every stage (whether deliverables or not) are evaluated to see whether they meet program requirements and quality requirements (implicit or explicit) and can serve as the basis for subsequent activity.

At KnowledgeQ, we place value on the performance of our products in the workplace. We always encourage our clients to give us objective metrics against which our performance is evaluated, and we encourage them to use Kirkpatrick's (and Phillips') levels of evaluation to determine courseware efficacy. As is well known, these levels are:

  • Level 1 – Reaction: Measures learners' response to the training program activity
  • Level 2 – Knowledge: Measures changes in knowledge, skills and attitudes as a result of the training program; normally measured through formative and summative assessments
  • Level 3 – Performance: Measures change in behaviour and/ or attitude, based primarily or wholly on the ability to transfer the skills learned to the job situation
  • Level 4 – Impact: Measures the impact on the business
  • Level 5 – ROI: Measures the return on investment (ROI) from the activity, expressed as a ratio of the value of the returns to the costs of the program
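
A worked Level 5 example with figures invented purely for illustration (Phillips expresses ROI as net program benefits over fully loaded program costs, as a percentage):

```typescript
// Worked Level 5 (ROI) example with made-up figures.
// ROI (%) = (net program benefits / program costs) x 100

const programBenefits = 330_000; // monetary value of the returns (assumed)
const programCosts = 150_000;    // fully loaded program costs (assumed)

const roiPercent = ((programBenefits - programCosts) / programCosts) * 100;
console.log(`ROI = ${roiPercent}%`); // ROI = 120%
```
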
Using Spiral/ Rapid Prototyping Methodology

We use Spiral/ Rapid Prototyping as the development model when requirements and therefore the courseware treatment methodology are not clear. While many of the documents and specifications used in the ADDIE methodology are still valid in Spiral/ Rapid Prototyping, the philosophical underpinnings and mode of development are very different. In what follows, we will try to bring out this essence, as we practice it at KnowledgeQ.

The Spiral/ Rapid Prototyping methodology consists of the following phases:

  • basic analysis
  • building a rough prototype
  • building shared understanding through learning designs and models
  • building the learning units
  • integrating the learning units
  • versioning and quality control
  • gold delivery

The Analysis stage in the ADDIE methodology uses a very thorough front-end analysis. In Spiral/ Rapid Prototyping we do not carry out the Analysis stage in full, but only enough to gain a basic understanding. This is because the problem itself may not be well defined, nor the nature of the solution, and the client project team may have minimal facts in hand; carrying out a full analysis would not be time-effective. Instead, a spiral approach (with two to three turns of the spiral) is taken to building the prototype. The spiral methodology we use at KQ is philosophically different from Boehm's, where risk analysis was the prime motivator. Here the emphasis is on building prototypes to create a shared understanding ('envisioning' in agile terminology); the effort lies in moving the prototype closer to the user's mental picture with each turn of the spiral.

Building and refining prototypes

Based on an understanding of the performance problem as stated by the client, prototyping commences from the information obtained through this 'basic analysis' (we do not do a rigorous analysis). It is important to note that 'basic analysis' does not correspond to the Analysis stage of the ADDIE model: at this stage we neither fix the requirements nor define the scope of the project. The spiral model makes a very clear distinction between actual models and documents. Its strength is flexibility: it accepts that the user's vision of the courseware's capabilities and possibilities will change as the product develops from a prototype. Based on this analysis, and with inputs from recent learners, design commences.

The design phase of the prototype now begins. Say, for example, fifty objectives have been defined for the course and divided into five learning units. Alternative design solutions for all five learning units are proposed, and prototypes are built from these designs (see the sketch below). The design here is not deep; it stays at a shallow level that merely targets the objectives and proposes a treatment, and is nowhere close to the end product. Alternative learning designs are made so that users may choose between them. Sound learning paradigms back good modelling, and the effect is to produce instructionally sound but rapidly produced prototypes. Crucial to producing rapid prototypes is the development of new models at our R&D Centre.
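
A sketch of the structure described above: objectives grouped into learning units, each carrying alternative shallow treatments for the user to choose between. All names, counts and treatments here are made up for illustration.

```typescript
// Illustrative structure only: one learning unit out of five, holding
// a slice of the course objectives and candidate treatments.

interface LearningUnit {
  title: string;
  objectives: string[];          // e.g. ten of the fifty course objectives
  candidateTreatments: string[]; // alternative shallow designs to choose from
}

const unit1: LearningUnit = {
  title: "Refund workflow",
  objectives: ["state the refund rules", "process a standard refund"],
  candidateTreatments: ["branching scenario", "guided drill", "case study"],
};
```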

Development of the prototype commences once design is complete. This is a quick-and-dirty job involving rough graphics, text, simple HTML5 animations and audio.

The essence of Spiral at KQ is to get these phases done quickly without worrying about completeness or visual polish. The functional prototype is evaluated by the QC teams against preliminary specifications and is then delivered to the user.

User evaluation takes the form of feedback on various aspects: the user comments on the accuracy of the learning sequences and content, on the realisation of the scenarios, and chooses amongst the various learning designs proposed for each prototype.

Sample learners also give feedback on how well they understood the material taught, the aptness of the designs, the level of difficulty of each approach, the UI, navigation, etc. Once this feedback is in, the client and KQ project managers hold a video conference to review the findings. If there is disagreement on an issue – the UI, for example – further interaction takes place to resolve it.

Based on the client's feedback, the process continues into the second turn of the spiral, whose activities are similar to the first. The scoping and requirements are updated based on the feedback and additional inputs; brainstorming by the teams commences and ends in a design closer to what the users need. Development and implementation are again quick, the modified prototype goes back to the user, and the process continues. At KQ we have found that at least two rounds of the spiral are required before a prototype is finalised. Approval of the final prototype is a major step forward for the courseware.

It is important to note that these stages make use of concepts central to the agile methodologies in software development (especially Crystal Clear): reflective improvement, close or osmotic communication, and focus. The seating of the team at KnowledgeQ aids the same: team members sit in a closed formation, and a large, high-definition screen is available which each member of the team can instantly access via a switcher. For example, if a visual artist wants to know whether a graphic would work, he can simply ask his team, who all sit in the same bay; the team members hardly have to turn their heads to convey their opinion and then continue with their work. Such simple ergonomics have produced large productivity gains at KQ.

Building the Learning Units

The prototype, having gone through a series of iterative spirals, is signed off. Based on this sign-off and the iterative requirements analysis, a detailed requirements document is prepared. This sets the stage for the second part, viz. the construction of the courseware. Construction proceeds linearly, with the difficult parts taken up first; we normally like doing the last unit first, since this alerts the ID team to gaps in the learning objectives. Depending on feedback, changes are incorporated in the learning units, which go through QC checks and are sent for client review. This procedure is followed for the alpha and beta deliveries. The feedback is integrated, learning-unit construction is signed off, and this leads to Gold delivery. The last part is the integration of the various learning units and testing in the client LMS. Once user feedback is integrated, sign-off occurs. The KQ Spiral continues the process by providing support throughout the lifetime of the product, until maintenance is discontinued by the client.

Process Flow and Estimation

We have placed a flow chart below to illustrate the various stages in the spiral model. Based on the rough scoping, requirements are defined, and the development of the learning units begins. The defined requirements, coupled with historical productivity data, now enable more accurate effort estimation - normally much more accurate than in an initial proposal (see the sketch below). In any case, for a fixed-bid contract the effort will stay within that quoted in the proposal, though the estimate remains subject to further requirement changes, which the spiral model allows through a Change Control Board (CCB). Continuing with the flow chart: after approval of the prototype, we move to brainstorming, analysis and design solutions.
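
A minimal sketch of the estimation step just described; the figures, the hours-per-unit rate and the contingency factor are all assumptions for illustration:

```typescript
// Effort estimate from defined requirements plus historical data:
// effort = units remaining x historical hours per unit x contingency.

const unitsRemaining = 5;          // learning units still to build (assumed)
const historicalHoursPerUnit = 80; // from past productivity data (assumed)
const contingency = 1.15;          // buffer for CCB-approved changes (assumed)

const estimatedHours = unitsRemaining * historicalHoursPerUnit * contingency;
console.log(`Estimated effort: ${estimatedHours} hours`); // 460 hours
```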

Team Brainstorming

Team brainstorming is very important in the spiral content-development process. It is used extensively in both phases: prototyping, and the subsequent analysis, design, construction and integration of the learning units. Brainstorming helps in the quick generation of many creative ideas. In content development, even more than in coding, this is indispensable, because a variety of methodologies, designs and ideas is necessary to produce interesting courseware. Without creativity, the courseware quickly degenerates into dull page-turners.

Normally, the responsibility for producing the learning method, scenarios, interactivities, etc. rests with the Instructional Design Architect. This often puts a premium on ideas and yields little variety in the scenarios. In Spiral development at KQ, brainstorming is therefore mandatory: it generates many ideas and a variety of scenarios, of which at least one can be developed fully.

Copyright © 2022 KnowledgeQ Interactive Consultancy Services Private Limited. All Rights Reserved.