Saturday, February 6, 2010

Measuring Training Effectiveness


Training is a critical component of any organization's strategy, but organizations don't always evaluate the business impact of a training program. Given the large expenditures for training in many organizations, it is important to develop business intelligence tools that help companies improve the measurement of training effectiveness. These tools need to provide a methodology to measure, evaluate, and continuously improve training, as well as the organizational and technical infrastructure (systems) to implement that methodology. Cross-functional reporting and learning analytics provide important connections between the measures of learning effectiveness offered by a learning management system (LMS) and the larger enterprise metrics that indicate whether learning is transferred and positively affects business results.

Business Performance Impact

Unless a training program exists simply for the sake of training, results should be measured, and the measurements should include business performance data, not just training data. Incorporating selected metrics, such as sales, customer satisfaction, workplace safety, and productivity, into a reporting strategy can help demonstrate where training has increased revenue or decreased costs. Measurements that consider performance improvements can provide a benchmark for training effectiveness: after implementing a training initiative or changing an existing program, an organization can observe and record the change in performance. To evaluate retention, there should be a lag between the training and these behavior measurements.
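As a simple illustration of measuring a lagged change in performance, the sketch below compares a business metric for each trainee before training and after a waiting period. It assumes hypothetical LMS and business-system extracts with placeholder column names (employee_id, completed_on, metric_date, metric_value); the lag and window lengths are arbitrary choices, not prescriptions.

# Minimal sketch: compare a business metric before training and after a lag,
# so that post-training behavior has time to stabilize before it is measured.
# Column names are hypothetical placeholders, not a specific LMS or CRM schema.
import pandas as pd

LAG = pd.Timedelta(days=30)      # wait 30 days before measuring retention
WINDOW = pd.Timedelta(days=90)   # compare 90-day windows before and after

def pre_post_delta(training: pd.DataFrame, metrics: pd.DataFrame) -> pd.DataFrame:
    """Return each trainee's average metric before training and after the lag."""
    merged = metrics.merge(training[["employee_id", "completed_on"]], on="employee_id")
    pre = merged[(merged.metric_date >= merged.completed_on - WINDOW)
                 & (merged.metric_date < merged.completed_on)]
    post = merged[(merged.metric_date >= merged.completed_on + LAG)
                  & (merged.metric_date < merged.completed_on + LAG + WINDOW)]
    out = pd.DataFrame({
        "pre_avg": pre.groupby("employee_id").metric_value.mean(),
        "post_avg": post.groupby("employee_id").metric_value.mean(),
    })
    out["delta"] = out.post_avg - out.pre_avg
    return out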

Many organizations are unable to evaluate their programs beyond the first two Kirkpatrick levels because they lack the tools to collect the data needed for higher-level evaluations. This is partly because LMSs, the most common repositories for training data and the most common mechanism for delivering it, make lower-level evaluations easy but provide no tools for higher-level evaluation. Most LMSs will automatically track and report the information required for Level One and Level Two analyses.

Likewise, training programs can inexpensively and easily administer pre- and post-tests that evaluate learning results (Level Two). When evaluating changes in student behavior (Level Three) and training's influence on business results (Level Four), however, data collection requirements extend beyond course delivery.

Much of the data needed to bridge the gap between training and performance already exists in many organizations. Individual performance data resides in performance management systems; organizational data resides in marketing, sales, and financial systems. Bridging the gap requires a technical infrastructure and reporting strategy that minimize the administrative effort needed to collect and analyze the training and performance data together.
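As a rough sketch of what bridging that gap can look like in practice, the function below joins hypothetical extracts from an LMS, a performance management system, and a sales system into one per-employee view. The table and column names are illustrative assumptions, not any specific vendor's schema.

# Minimal sketch of combining training and performance data, assuming
# hypothetical per-employee extracts from three corporate systems.
import pandas as pd

def build_training_performance_view(lms: pd.DataFrame,
                                    reviews: pd.DataFrame,
                                    sales: pd.DataFrame) -> pd.DataFrame:
    """Join course completions, review ratings, and sales revenue per employee."""
    completions = (lms.groupby("employee_id")
                      .agg(courses_completed=("course_id", "nunique"),
                           avg_assessment=("assessment_score", "mean")))
    revenue = sales.groupby("employee_id").agg(revenue=("amount", "sum"))
    view = (reviews.set_index("employee_id")[["review_rating"]]
                   .join([completions, revenue], how="left"))
    return view.reset_index()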

Why, then, are organizations still unlikely to evaluate training at Kirkpatrick's Level Three? One common point of failure is system integration. Many LMS vendors with a history as product companies have limited integration expertise beyond learning systems and databases. Successfully managing performance-based training evaluation, however, requires expertise in data management and warehousing, a variety of corporate systems and databases, analytics, and Web-based application development.

Cross-Functional Reporting

A reporting and data management strategy that treats the LMS as its foundation only compounds the system integration challenges that make performance-based training evaluation unmanageable. Instead, the organization should adopt a cross-functional corporate reporting and data management strategy. The features of a cross-functional reporting system that can also provide learning and training analytics include:

• Independence from the LMS
• Integration with business systems across the enterprise
• Alignment with individual and organizational performance

LMS Independence

A cross-functional reporting system for training should not be locked into a single LMS platform. Instead, a generic framework should map common LMS data to variables in the learning intelligence system. Typically, an organization will feed performance, job code, certification, and other corporate data into the LMS reporting system. With a cross-functional system sitting between the LMS and other corporate systems, the organization only needs to update one data connection if the LMS changes.
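One way to picture this LMS independence is a small mapping layer that translates vendor-specific field names into a canonical schema, so that swapping the LMS only means writing a new mapping rather than reworking downstream reports. The vendor field names below are hypothetical.

# Minimal sketch of LMS independence via a generic mapping layer.
# The vendor-specific field names are made up for illustration.
CANONICAL_FIELDS = ["learner_id", "course_id", "completion_date", "score"]

VENDOR_MAPPINGS = {
    "vendor_a": {"UserID": "learner_id", "CourseCode": "course_id",
                 "CompletedOn": "completion_date", "FinalScore": "score"},
    "vendor_b": {"learner": "learner_id", "item": "course_id",
                 "finish_ts": "completion_date", "grade": "score"},
}

def to_canonical(record: dict, vendor: str) -> dict:
    """Translate one vendor-specific LMS record into the canonical schema."""
    mapping = VENDOR_MAPPINGS[vendor]
    row = {mapping[k]: v for k, v in record.items() if k in mapping}
    missing = [f for f in CANONICAL_FIELDS if f not in row]
    if missing:
        raise ValueError(f"record is missing canonical fields: {missing}")
    return row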

Business System Integration

As a broker for business intelligence throughout the organization, the reporting system needs to aggregate data from multiple corporate systems. If assembling information is too cumbersome and time-consuming, or the data is outdated or simply incorrect, the system cannot enhance evaluations by combining training data with other business data.

If the organization has a corporate data warehouse, the LMS can push the learning management data into this consolidated data source. Any corporate reporting system then can access this learning data, combine it with other business data, and make advanced ROI calculations. Although integrating multiple data sources can require significant system integration effort, the organization gains greater control over its learning and business data. Different data owners maintain data integrity in the consolidated data source, which provides a unified data access point.
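For example, once cost and benefit figures are available from the consolidated data source, a basic ROI calculation is straightforward. The sketch below uses the common net-benefit-over-cost formula with assumed cost categories and an assumed benefit figure; how benefits are attributed to training is the harder analytical question and is not addressed here.

# Minimal sketch of an ROI calculation over consolidated warehouse data.
# The benefit figure (e.g., incremental revenue or cost savings attributed
# to training) and the cost components are assumed inputs.
from dataclasses import dataclass

@dataclass
class TrainingProgramCosts:
    development: float
    delivery: float
    learner_time: float   # loaded labor cost of time spent in training

    @property
    def total(self) -> float:
        return self.development + self.delivery + self.learner_time

def training_roi(benefit: float, costs: TrainingProgramCosts) -> float:
    """Net benefit divided by total cost, expressed as a percentage."""
    return (benefit - costs.total) / costs.total * 100

# Example: $180k attributed benefit against $120k of total program cost.
print(training_roi(180_000, TrainingProgramCosts(60_000, 35_000, 25_000)))  # 50.0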

Business system integration allows the organization to leverage training and business data together in a context-sensitive manner. The cross-functional system can push data to an enterprise or departmental portal or a reporting tool used by a particular decision maker. Portal applications and other reporting tools allow training professionals to make more informed decisions when designing and implementing a training program. Some examples of how data can be combined for different decision-makers and purposes include:

• A training scorecard that evaluates training programs on ROI and other performance metrics. The scorecard becomes a much more powerful tool for managing interdependent activities and performance if it offers an easy-to-use "drill-down" capability with supporting data, so training professionals can identify how cost and performance results contribute to a training program's score.

• Sales, manufacturing, distribution, customer service, and other scorecards that provide metrics specific to that domain, including training and other relevant operational and performance metrics.
• Predictive analytical tools that allow organizations to perform what-if scenarios to make resource allocation decisions that maximize desirable organizational performance.
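A minimal what-if sketch along the lines of the last item might fit a simple model on historical data and then project outcomes under alternative training allocations. The data below is synthetic and the single-predictor linear model is purely illustrative, not a recommendation for how to specify the real analysis.

# What-if sketch: fit sales ~ training hours on (synthetic) historical data,
# then compare projected sales under alternative training-hour allocations.
import numpy as np

rng = np.random.default_rng(0)
training_hours = rng.uniform(0, 40, size=200)
sales = 50_000 + 600 * training_hours + rng.normal(0, 5_000, size=200)

# Ordinary least squares fit: slope first, then intercept.
b, a = np.polyfit(training_hours, sales, deg=1)

def projected_sales(hours_per_rep: float, reps: int = 100) -> float:
    """Project total sales for a what-if allocation of training hours."""
    return reps * (a + b * hours_per_rep)

for scenario in (10, 20, 30):
    print(f"{scenario:>2} hrs/rep -> projected sales ${projected_sales(scenario):,.0f}")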

Performance Alignment

What differentiates a cross-functional reporting strategy from most LMS-based reporting approaches is the ability to align training with performance objectives for the entire extended enterprise, including individuals, the organization, and its business partners. The cross-functional approach can combine course completions, certifications, and assessment scores from the LMS with the evaluation and competency data in a performance management system. Achieving this alignment depends on statistically validated learning analytics that help an organization understand how training, individual behavior, and organizational performance are linked.

A Training Model from Learning Analytics


When an organization measures without understanding the interdependent cause-and-effect relationships, it cannot accurately evaluate training effectiveness. A company may achieve better sales numbers following a sales training initiative even if the training itself was deficient. Tracking results alone does not reveal how the training modified sales staff behavior or ability.

Learning analytics based on statistical analysis provide the necessary, and often missing, basis to quantify how different training and non-training activities affect performance. For example, the National Association of Secondary School Principals used statistical methods to investigate the correlation between school culture and student achievement as measured by SAT scores. A task force collected survey data from 81 schools during the 2002-'03 school year. The statistical analysis allowed the task force not only to identify whether a factor affected SAT scores but also to determine to what degree it affected performance. It also captured the interdependencies among factors: an increase in administrative performance of x increases teacher climate by y, which increases scores by z, while at the same time a dress code reduces teacher climate by a, which reduces scores by b (see the figure at the end of the article).
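The sketch below illustrates the general idea of such a path analysis on synthetic data (it is not the task force's actual model or data): two chained regressions estimate how an upstream factor influences an outcome through a mediator, with the indirect effect given by the product of the two path coefficients.

# Illustrative path-analysis sketch on synthetic data: an upstream factor
# affects an outcome indirectly through a mediating factor, and the indirect
# effect is the product of the two estimated path coefficients.
import numpy as np

rng = np.random.default_rng(1)
n = 300
admin_perf = rng.normal(0, 1, n)                   # upstream factor
climate = 0.6 * admin_perf + rng.normal(0, 1, n)   # mediator (teacher climate)
scores = 0.5 * climate + rng.normal(0, 1, n)       # outcome (achievement)

def slope(x: np.ndarray, y: np.ndarray) -> float:
    """OLS slope of y on x (with intercept)."""
    return np.polyfit(x, y, deg=1)[0]

path_a = slope(admin_perf, climate)   # admin performance -> climate
path_b = slope(climate, scores)       # climate -> achievement
print(f"indirect effect of admin performance on scores: {path_a * path_b:.2f}")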

For an organization that wants to apply these techniques, historical data provides a good starting point to identify what aspects of different training programs had the greatest impact on individual and organizational performance. After developing this initial model, the organization can invest in training to achieve desired performance results.

Conclusion


A robust cross-functional reporting strategy and statistical methodologies can support continuous improvement of learning, just as they do in other activities such as manufacturing. By selecting measurements that support valid inferences about the effectiveness of programs, learning and training professionals can know where to improve and how to allocate resources and effort, essentially improving every training program's influence on business results.




A statistical analysis developed from structural equation models and secondary school data. Note how the map reveals relationships that may seem counterintuitive, such as a dress code negatively impacting student achievement.
