Tuesday, November 3, 2009

2009 Fall Report of the Auditor General of Canada.

Chapter 1—Evaluating the Effectiveness of Programs
Main Points
Introduction
Focus of the audit
Observations and Recommendations
Meeting needs for effectiveness evaluation
The need for effectiveness evaluations continues to grow
Departments have systematic processes to plan effectiveness evaluation
Expenditure management needs were not adequately considered
Departments complete most planned evaluations
Evaluation coverage of programs is limited
Evaluations do not adequately assess effectiveness
Some quality assurance processes are in place
Departments are not systematically identifying priorities for improvement
Capacity for effectiveness evaluation
Funding increases have enhanced departments’ capacity for evaluation
Shortage of experienced evaluators continues
Evaluator competencies are not adequately defined
Other responsibilities of evaluation units put pressure on capacity
Departments use contractors extensively
Oversight and support
The Secretariat has identified a number of improvements in evaluation
The Secretariat carried out extensive monitoring of the evaluation function
Sustained support for effectiveness evaluation is lacking
Oversight and support require experienced staff
Care is needed in the implementation of the new coverage requirements
Conclusion
About the Audit
Appendix—List of recommendations
Exhibits:
1.1—The history of federal program evaluation reveals many audit observations that are critical of government initiatives
1.2—Government-wide requirements for effectiveness evaluation appear in both statutes and policies
1.3—Most of the planned evaluations were completed
1.4—A low proportion of total program expenses were evaluated between the 2004–05 and 2007–08 fiscal years
1.5—Some effectiveness evaluations have insufficient performance information
1.6—Evaluation unit funding increased in most departments
1.7—The estimated number of evaluation unit professional staff increased in each department between the 2004–05 and 2008–09 fiscal years
1.8—Evaluations conducted by contractors in whole or in part in the audited departments
Main Points
What we examined
Effectiveness evaluation is an established tool that uses systematic research methods drawn from many different disciplines to assess how well a program is achieving its objectives. When done well, it allows departments to develop evidence of how well their programs meet expectations and whether they are cost-effective. Over the past 40 years, the federal government has worked to embed the evaluation of program effectiveness as an essential management practice, as part of its support for program evaluation.
The 2006 Federal Accountability Act enacted into law a requirement that all grant and contribution programs be evaluated every five years. The new Policy on Evaluation that came into effect in 2009 extends the requirement for evaluation to cover all direct program spending over a five-year cycle.
We examined how evaluation units in six departments identify and respond to the various needs for effectiveness evaluations. We also looked at whether they have built the required capacity to respond to those needs. In addition, we looked at the oversight and support role of the Treasury Board of Canada Secretariat in monitoring and improving the evaluation function in the government, specifically with respect to effectiveness evaluations. The period covered by our audit was 2004 to 2009.
Why it’s important
Governments are under continual pressure to spend money on a range of programs designed to serve particular needs of society. While many factors affect the decisions that governments must ultimately make about programs, effectiveness evaluations can aid their decision making by providing objective and reliable information that helps identify programs that are working as intended; those that are no longer needed; and those that are not accomplishing the desired objectives and could be replaced by programs that will achieve the objectives more cost-effectively. In addition, effectiveness evaluation is expected to serve the information needs of parliamentarians.
One of the most important benefits of effectiveness evaluation is to help departments and agencies improve the extent to which their programs achieve their objectives. Departments need to demonstrate to Parliament and taxpayers that they are delivering results for Canadians with the money entrusted to them.
What we found
The six departments we examined followed systematic processes to plan their effectiveness evaluations and completed most of the evaluations they had planned. However, over the audited period, each department’s evaluations covered a relatively low proportion of its total program expenses—between five and thirteen percent annually across the six departments.
The effective rate of coverage was even lower, because many of the effectiveness evaluations we reviewed did not adequately assess program effectiveness. Departments often had not gathered the performance information needed to evaluate whether programs are effective. Of the 23 evaluation reports we reviewed, 17 noted that the analysis was hampered by inadequate data, limiting the assessment of program effectiveness.
The departments we examined told us that it remains a challenge to find experienced evaluators, and they have made extensive use of contractors to meet requirements. Departments expressed concern about their capacity to start in 2013 to evaluate all direct program spending, as required by the 2009 Policy on Evaluation. To ensure full coverage (which includes grants and contributions), they will have to evaluate an average of 20 percent of their direct program spending each year of the five-year cycle.
The Treasury Board of Canada Secretariat introduced initiatives over the past few years to address the need for improvements in evaluation across the government. However, it did not provide sustained support for effectiveness evaluation. In particular, it made little progress on developing tools to assist departments with the long-standing problem of a lack of sufficient data for evaluating program effectiveness.
With the exception of Environment Canada, which has processes in place to identify needed improvements, the audited departments do not regularly identify and address weaknesses in effectiveness evaluation.
The Secretariat and the departments have responded. The Treasury Board of Canada Secretariat and the departments agree with our recommendations. Their detailed responses follow each recommendation throughout the chapter.
Introduction
1.1 The 2009 Treasury Board Policy on Evaluation defines evaluation as “the systematic collection and analysis of evidence on the outcomes of programs to make judgments about their relevance, performance and alternative ways to deliver programs or to achieve the same results.” The objective of this policy is to create a comprehensive and reliable base of evaluation evidence that is used to support policy and program improvement, expenditure management, Cabinet decision making, and public reporting.
1.2 Program evaluation is recognized as a key source of information on program effectiveness. This information is essential if senior officials are to base program and funding decisions on evidence of program effectiveness. Potential key users of evaluation findings include senior department officials, the managers of programs being evaluated, central agencies, and parliamentarians. The findings of program evaluation may also be of interest to program stakeholders and to the public.
1.3 Program evaluation has been practised in the federal government, in one form or another, for close to 40 years. The Office of the Auditor General examined program evaluation in 1978, 1983, 1986, 1993, 1996, and 2000. The history of federal program evaluation from 1970 to 2000 reveals repeated initiatives at the centre of government to establish and support the function, and often critical observations by the Office of the Auditor General on the success of these efforts (Exhibit 1.1).
Exhibit 1.1—The history of federal program evaluation reveals many audit observations that are critical of government initiatives
1970: The Treasury Board of Canada Secretariat established a Planning Branch to carry out interdepartmental studies of policy programs.
1977: The Treasury Board issued the first evaluation policy, which required departments to subject each program to effectiveness evaluation on a regular basis.
1978: The Auditor General’s Study of Procedures in Cost-Effectiveness reviewed 23 programs in 18 departments, and found few successful attempts to evaluate their effectiveness.
1981: The Office of the Comptroller General developed a policy framework to guide and structure departmental evaluation functions, and established a team of liaison officers to give guidance and advice to departments in its implementation.
1983: The Auditor General’s report found that although very real progress had been made in developing the program evaluation function since its 1978 study, few high-quality evaluations were being done.
1986: The Program Review carried out by the Nielsen Task Force found that the information received from program evaluations was “generally useless and inadequate.”
1986: The Auditor General’s follow-up to the 1983 audit concluded that there had been an improvement in the quality of methodology and reporting of program evaluation, but that problems regarding balance and full disclosure of limitations in methods still existed.
1993: The Auditor General found that “high expectations and great potential” had been only partly fulfilled: “The system’s results are often disappointing. Program evaluations frequently were not timely or relevant. Many large-expenditure programs had not been evaluated under the policy ...”
1996: The Auditor General’s follow-up to the 1993 audit found that little progress had been made to address effectiveness issues.
2000: The Auditor General found that the evaluation function had actually regressed, due in part to reductions in funding that undermined capacity.
1.4 The advent of results-based management. In 2000, the Treasury Board of Canada Secretariat issued Results for Canadians: A Management Framework for the Government of Canada. It outlined a results-based management approach, which states that
departments and agencies need to implement an information regime that measures, evaluates and reports on key aspects of programs and their performance in core areas; holds managers accountable for achieving results; and ensures unbiased analysis, showing both good and bad performance.
1.5 As we noted in our report from the same year (the December 2000 Report of the Auditor General of Canada, Chapter 20—Managing Departments for Results and Managing Horizontal Issues for Results), evaluation has an important role to play in managing for results. It can provide important information on program performance that is not gathered by ongoing monitoring systems, and can help managers understand why programs are working or not.
1.6 In 2001, the Treasury Board issued an evaluation policy, intended to reflect the results-based management philosophy expressed in Results for Canadians. The objective of this policy was to ensure that the government has information on the performance of its policies, programs, and initiatives that is timely, strategically focused, objective, and evidence-based. The Policy described evaluation as a management tool for the periodic assessment of a program’s effectiveness in achieving objectives, of its impact and relevance, and of alternative ways to achieve expected results (cost effectiveness).
1.7 In 2003, the Treasury Board of Canada Secretariat introduced the Management Accountability Framework, to assess management performance in departments, including program evaluation. Subsequent guidance that was issued in 2007 to support the 2005 Policy on Management, Resources and Results Structure (MRRS), encouraged department heads of evaluation to provide advice on the Performance Measurement Framework embedded in their organization’s MRRS. The Secretariat views the Framework and the policy requirements as an integrated approach to measuring management performance in departments.
1.8 Focus on expenditure management. More recent developments have placed renewed emphasis on the expenditure management role of program evaluation. As described in its 2007 and 2008 Budget documents, the federal government has introduced a new expenditure management system, a key pillar of which is the ongoing assessment of all direct program spending, known as strategic reviews. Strategic reviews are intended to ensure that programs are effective and efficient, meet the priorities of Canadians, and are aligned with core federal responsibilities. According to the Treasury Board of Canada Secretariat, program evaluation is a key source of information on program effectiveness in support of these reviews.
1.9 The 2009 Policy on Evaluation. The most recent development in federal program evaluation is a new policy that was approved in April 2009. In addition to supporting the renewal of the Expenditure Management System, by improving the information base for strategic reviews of departmental spending, the new policy requires that evaluations cover all direct program spending over a five-year cycle. Each year, departments will have to evaluate an average of 20 percent of their direct program spending to ensure full evaluation coverage, which includes grants and contributions programs. Larger departments and agencies must implement this requirement from 1 April 2013. Following a four-year transition period, the first cycle covers the 2013–14 to 2017–18 fiscal years. Smaller organizations, with fewer than 500 full-time equivalents and a reference level of less than $300 million, are subject to the lesser requirement of ensuring coverage as appropriate. Accompanying the new policy is a directive on evaluation that defines the roles and responsibilities of department officials involved in evaluation, and standards that set minimum requirements for the quality, neutrality, and usefulness of evaluations.
Focus of the audit
1.10 This was a government-wide audit that examined program evaluation in relation to the measurement of program effectiveness—effectiveness evaluation. The overall objective of this audit was to determine whether selected departments and the Treasury Board of Canada Secretariat are meeting the needs for effectiveness evaluation and are identifying and making improvements in effectiveness evaluation.
1.11 The audit covered a five-year period, from the 2004–05 to 2008–09 fiscal years. Focusing on this period allowed us to examine the impact of Secretariat efforts, through a series of its studies, to understand how well the 2001 evaluation policy was working up to the 2004–05 fiscal year. We also examined selected departments’ efforts to address any problems identified over the same period. Because the 2001 evaluation policy was replaced in April 2009, the audit did not focus directly on compliance with either policy.
1.12 Detailed examination work was carried out in the Treasury Board of Canada Secretariat and in six departments:
Agriculture and Agri-Food Canada,
Canadian Heritage,
Citizenship and Immigration Canada,
Environment Canada,
Fisheries and Oceans Canada, and
Human Resources and Skills Development Canada.
1.13 More details on the audit objectives, scope, approach, and criteria are in About the Audit at the end of this chapter.
Observations and Recommendations
Meeting needs for effectiveness evaluation
1.14 Well-managed organizations operate according to a management cycle for continuous improvement consisting of planning, doing, checking, and improving. We looked for evidence that the audited departments planned their effectiveness evaluations to meet identified needs, executed these plans, checked to see that needs were met, and made improvements where required. We followed a similar approach in relation to the Treasury Board of Canada Secretariat’s oversight and support role (paragraphs 1.63–1.93).
1.15 We found that, in light of their limited evaluation coverage and reliance on insufficient performance information, departments were not able to demonstrate that they are fully meeting needs for effectiveness evaluation.
The need for effectiveness evaluations continues to grow
1.16 When departments plan their effectiveness evaluation work, they must first identify the needs for these evaluations. Heads of evaluation in the six departments told us that they based those needs on both government-wide requirements and internal corporate risk-based needs.
1.17 The government views evaluation as the primary source of neutral and systematic information on the ongoing relevance and performance of policies and programs. It also expects evaluation to show alternative ways of achieving expected results and program design improvements. Government-wide requirements for effectiveness evaluations are defined both in statute and in policy (Exhibit 1.2).
Exhibit 1.2—Government-wide requirements for effectiveness evaluation appear in both statutes and policies
The 2000 Policy on Transfer Payments (revised in 2008) required departments to review and report on the effectiveness of transfer payments when requesting renewal of program terms and conditions. The Financial Administration Act (amended in 2006) requires departments to conduct a review every five years of the relevance and effectiveness of each ongoing program of grants or contributions for which they are responsible.
The 2001 Treasury Board Evaluation Policy advised that evaluations should consider the relevance, success, and cost-effectiveness of programs.
Approved program funding may be subject to specific conditions in Treasury Board submissions and memoranda to Cabinet requiring the department to evaluate the program and report back to the Treasury Board or Cabinet before a set deadline.
In the 2007 Budget, the Government announced its new Expenditure Management System (EMS), including enhanced requirements for evaluation.
As of April 2007, evaluation of effectiveness is required for selected new federal regulations.
The 2009 Treasury Board Policy on Evaluation requires that all evaluations that are intended to count toward coverage requirements address value for money by including clear and valid conclusions about the relevance and performance of programs.
1.18 According to the Treasury Board of Canada Secretariat, a critical need for effectiveness evaluation arises from the strategic review process in the Expenditure Management System. Organizations are required to conduct strategic reviews of direct program spending every four years to ensure that the government is directing its resources to the highest priority requirements.
1.19 In addition to these government-wide requirements, heads of evaluation identified internal corporate risk-based needs for effectiveness evaluations that respond to departmental priorities and to the priorities of government. These priorities are typically defined through consultation with senior departmental management.
1.20 Between 2004 and 2009, the need for effectiveness evaluations grew, largely because of measures the government brought forward under the Federal Accountability Act in 2006. This Act introduced an amendment to the Financial Administration Act requiring that, every five years, all ongoing non-statutory grant and contribution programs be evaluated or reviewed for relevance and effectiveness.
Departments have systematic processes to plan effectiveness evaluation
1.21 In light of the many needs for effectiveness evaluation and their potential importance in informing decision making, it is critical that departments consider these needs when making program evaluation plans.
1.22 We expected program evaluation plans in the six departments to consider both government-wide requirements and corporate risk-based needs for effectiveness evaluation. We found that all departments had developed risk-based plans during the audited period (some started in the 2004–05 fiscal year and others started in 2005–06). Most of these plans are produced annually, and most of the recent ones cover a five-year period. We found evidence that, to identify corporate priorities, each department consulted its senior management and its various program areas when developing its evaluation plans. We also found that all departments—except Agriculture and Agri-Food Canada and Citizenship and Immigration Canada—had processes in place to identify and track requirements for grant and contribution renewals in Treasury Board submissions.
Expenditure management needs were not adequately considered
1.23 Officials in the six departments told us that the need for strategic review in the Expenditure Management System had not been a key consideration during previous years’ evaluation planning. However, they told us that this need is now being considered when preparing their evaluation plans. To date, of the six departments audited, only Canadian Heritage (in 2007) and Agriculture and Agri-Food Canada (in 2008) have completed a strategic review. Officials at Fisheries and Oceans Canada and at Canadian Heritage informed us that they plan to undertake evaluations in preparation for the next strategic review.
Departments complete most planned evaluations
1.24 To meet the needs for effectiveness evaluations, we expected the six departments to have conducted the evaluations identified in their plans. We found that most of the evaluations planned by the six departments were carried out (Exhibit 1.3). Department officials told us that some evaluations were not completed for a variety of reasons, including program cancellations and redesign, limited data availability, and changes in the needs of internal clients (such as senior management and program managers) and in evaluation capacity. Officials at Human Resources and Skills Development Canada informed us that, because many of the evaluations not completed during the audit period were multi-year evaluations with planned completion dates beyond that period, they projected a higher completion rate.
Exhibit 1.3—Most of the planned evaluations were completed
Number of planned evaluations* and number and percentage completed,** by department:
Canadian Heritage: 48 planned; 42 (88%) completed
Fisheries and Oceans Canada: 30 planned; 24 (80%) completed
Environment Canada: 29 planned; 23 (79%) completed
Citizenship and Immigration Canada: 21 planned; 15 (71%) completed
Agriculture and Agri-Food Canada: 25 planned; 14 (56%) completed
Human Resources and Skills Development Canada: 64 planned; 34 (53%) completed
* Based on departmental evaluation plans between the 2004–05 and 2007–08 fiscal years.
** Based on listings provided by departments of evaluations in their 2004–05 to 2007–08 plans that were completed by 30 April 2009.
Evaluation coverage of programs is limited
1.25 While the 2009 Policy on Evaluation requires departments to evaluate all direct program spending, there was no such requirement during the period of our audit. The Treasury Board of Canada Secretariat required departments to conduct evaluations based on risk. We noted that the Secretariat encouraged departments to embed program evaluations into program management and ensure adequate evaluation coverage of programs. The six departments followed systematic processes when they planned evaluations, including effectiveness evaluations. However, within the departments we found that a low proportion of total program expenses were evaluated.
1.26 We examined the data that the six departments provided to the Treasury Board of Canada Secretariat as part of the Annual Capacity Assessment Survey, including information on the annual expenses of evaluated programs. We compared these program expenses to total departmental program expenses for that year. We found wide variation across the six departments in their evaluation coverage of their program expenses (Exhibit 1.4).
Exhibit 1.4—A low proportion of total program expenses were evaluated between the 2004–05 and 2007–08 fiscal years
Estimated average annual percentage of program expenses evaluated, by department:
Canadian Heritage: 13%
Agriculture and Agri-Food Canada*: 11%
Environment Canada: 9%
Fisheries and Oceans Canada: 8%
Citizenship and Immigration Canada: 6%
Human Resources and Skills Development Canada*: 5%
* Agriculture and Agri-Food Canada and Human Resources and Skills Development Canada each carried out evaluation work that included several programs. This has increased their coverage.
Sources: Treasury Board of Canada Secretariat Annual Capacity Assessment Survey, descriptive information provided by departments, and Public Accounts of Canada
1.27 Our calculations showed that Canadian Heritage had the highest percentage of program expenses evaluated among the audited departments. Notably, Canadian Heritage also has the highest proportion of spending on grant and contribution programs, which amounted to 80 percent of its total expenses in the 2007–08 fiscal year. Under the Financial Administration Act, departments have until the 2011–12 fiscal year to evaluate all of their ongoing, non-statutory grant and contribution programs that were in existence in December 2006.
1.28 As well, the requirements for renewals of grants and contributions occur in a concentrated period. For example, officials at Citizenship and Immigration Canada told us that it is challenging for them to meet their evaluation requirements because the majority of grant and contribution programs come up for renewal at the same time, creating a spike in the evaluation unit’s workload. However, the Department indicated that these requirements were met over the period of our audit. We noted that, in the 2009–10 fiscal year, its evaluation unit plans to evaluate about 90 percent of its grant and contribution expenses, leaving little capacity to respond to other needs.
Evaluations do not adequately assess effectiveness
1.29 The quality of evaluation methods is a long-standing concern in the federal government. In 2005, the Treasury Board of Canada Secretariat reviewed the quality of evaluation reports to determine whether they had improved. The review noted that, while evaluation reports had improved in quality since 2002, there remained a “pressing need for further improvement.”
1.30 Evaluation practitioners employ a range of methods. There are no universally accepted minimum standards for effectiveness evaluation. However, both the 2001 and 2009 Treasury Board policies on evaluation refer to the need to collect reliable data to support evaluation findings. The 2001 policy refers to objective data collection and analysis, while the 2009 Standard on Evaluation requires that evaluations be based on “multiple lines of evidence, including evidence produced from both quantitative and qualitative analysis.”
1.31 Both policies recognize that the level of methodological rigour should reflect the intended use of the findings. In addition, in programs with low materiality, less rigorous methods may be appropriate.
1.32 In 2005, the Treasury Board of Canada Secretariat found that evaluations tended to rely mainly on information from stakeholder interviews, file and document reviews, case studies, and surveys. The Secretariat also found that information from ongoing program performance measurement tended to be unavailable or insufficient to effectively support evaluations. According to the Secretariat, this was because few programs had reliable systems for collecting and reporting this information. The Secretariat’s review concluded that the lack of program performance data reduces the overall quality of evaluation reports.
1.33 When data is not available, evaluators may have to collect it themselves, or find alternative sources for the information, and their ability to apply the appropriate methods may be constrained. Using a variety of methods, both quantitative and qualitative, is important to ensure that evaluation methods generate enough reliable evidence. Unreliable data reduces confidence in the usefulness of evaluation measurements.
1.34 We reviewed a sample of 23 evaluations identified by the six departments as addressing effectiveness to determine the types of data collected to support evaluation findings. We found that, of the 23 evaluations, 17 explicitly stated that program performance information was lacking because data was unavailable or was not sufficiently reliable. As a result, in 9 of 17 cases, the evaluations indicated that they were limited in their assessment of program success and effectiveness. Furthermore, in 6 of the 17 cases, the assessment was primarily based on interviews with program staff and stakeholders (Exhibit 1.5).
Exhibit 1.5—Some effectiveness evaluations have insufficient performance information
Agriculture and Agri-Food Canada. The Department’s Prairie Grain Roads Program operated between April 2001 and March 2006 with an average annual budget of $35 million. The program was meant to assist Prairie provinces to address increased pressure on rural roads by improving roads, increasing truck tonnage capacity, and also increasing safety for road users.
Completed in 2006, this $103,600 evaluation was to determine the program’s results and impacts, adequacy of design, continued relevance, and cost-effectiveness. Achieving these objectives was hindered by the program’s lack of performance measures to determine its effectiveness. It was not possible to know, for example, whether road safety had improved as a result of the program.
Claims of improved safety were primarily based on interviews and satisfaction ratings from successful program applicants. If performance measures had been implemented from the outset of the program, more complete conclusions on the effectiveness of this program could have been reached.
Citizenship and Immigration Canada. The Department’s Private Sponsorship of Refugees Program, which began in 1978, is intended to assist refugees to settle and build new lives in Canada through sponsorship by Canadian citizens. The annual budget of the program is approximately $5 million.
The most recent evaluation of this program was done in 2007 at a cost of $268,000. The evaluation was intended to examine the program’s continued relevance, success in achieving outcomes as identified in its results-based management accountability framework, and its cost-effectiveness.
This evaluation used data originating from a wide range of sources, including statistical information from both the department and Statistics Canada. As a result, the evaluation was able to more objectively document the extent to which refugees were finding employment, accommodation, and essential services.
Fisheries and Oceans Canada. In 2006, the Department completed an evaluation of its Program for Sustainable Aquaculture. The program was launched in 2000; it had an annual budget in the 2008–09 fiscal year of $15 million. The objective of this program is to foster growth of a sustainable and competitive aquaculture industry and to increase public confidence in aquaculture.
The stated purpose of this $98,600 evaluation was to examine the relevance, success, and cost-effectiveness of the program. However, the evaluators acknowledged that there was no ongoing monitoring system to track program results. For example, because there was limited information on the impact of aquaculture on human health, it was not possible to determine whether related program objectives were achieved.
Without this information, many of the evaluation’s conclusions were based on interviews with federal officials and industry representatives, as well as on document and file reviews. As a result, this evaluation did not adequately address the effectiveness of the program.
1.35 The heads of evaluation in all six departments confirmed that performance information they needed to evaluate whether programs are cost-effective and are achieving expected results was often insufficient. The development and implementation of ongoing performance measures is the responsibility of program managers, not departmental evaluation units.
1.36 Due to the weaknesses in performance information and the need to apply appropriate evaluation methods, the actual coverage of departmental programs by effectiveness evaluations is even more limited than shown in this audit (Exhibit 1.4). Based on our sample, three quarters of the evaluations were hampered in their assessment of program effectiveness because of inadequate data.
1.37 Recommendation. Agriculture and Agri-Food Canada, Canadian Heritage, Citizenship and Immigration Canada, Environment Canada, Fisheries and Oceans Canada, and Human Resources and Skills Development Canada should develop and implement action plans to ensure that ongoing program performance information is collected to support effectiveness evaluation.
Agriculture and Agri-Food Canada’s response. Agreed. The Department agrees that the systematic collection of program performance data by managers is necessary to report on program performance and to support effectiveness evaluations.
As required by the Treasury Board Policy on Transfer Payments, a performance measurement strategy for ongoing management of transfer payment programs, including performance measures and indicators and a data collection strategy, is developed for each new transfer payment program.
The Department’s evaluation function reviews program performance measurement strategies as they are developed to ensure that outcomes are defined, measurable, and attributable and that the strategies, if implemented, are sufficient to support future evaluation work.
Beginning in the 2009–10 fiscal year, the Department will conduct annual state of performance measurement reviews. The first such review will assess performance measurement practices at the Department and the adequacy of data collected for programs soon to be evaluated. An action plan to address its recommendations will be developed and its implementation monitored with a view to strengthening the Department’s practices in this area.
Canadian Heritage’s response. Agreed. The Department is executing its Action Plan for Implementation of the Management, Resources and Results Structures (MRRS) Policy to ensure that program staff are able to fulfill their responsibility for developing and maintaining performance measurement strategies. Action plan measures include
the provision of information sessions and workshops,
the establishment of indicators and corresponding targets,
the development of robust methodologies to demonstrate outcomes,
the establishment of a centre of expertise on performance measurement within the Department,
the design and implementation of adequate tools and guidelines,
the establishment of relevant information technology and systems, and
the regular analysis and reporting of collected data.
This action plan is expected to be completed by the end of the 2011–12 fiscal year.
The Department’s Office of the Chief Audit and Evaluation Executive is continually providing advice and support to Department managers in their efforts to implement this action plan. In line with the Transfer Payment Policy, this office is also providing timely advice and support on program design and performance measurement strategies through the review of official approval documents for the creation and renewal of new or existing programs. Finally, as required under the new Evaluation Policy, the office will be submitting, in the 2010–11 fiscal year, its first annual report to the Departmental Evaluation Committee on the state of performance measurement in the Department.
Citizenship and Immigration Canada’s response. Agreed. The Department recognizes the value of ongoing performance information for evaluation and will continue to support related departmental activities. The Department will develop an action plan that includes the renewal of the comprehensive departmental performance measurement framework and program activity architecture, and furthers the integration of the Framework into the business planning process.
Environment Canada’s response. Agreed. The Department accepts this recommendation, which echoes the intent of the 2009 Evaluation Policy related to performance information. As such, actions are already under way within Environment Canada to implement this recommendation. These include ongoing monitoring of the implementation of management responses to previous evaluations that have identified concerns with performance information, and the development and implementation of a strategy to inform all department managers of the Evaluation Policy requirements pertaining to performance measurement.
In addition, the Department’s evaluation plan will be expanded to include a monitoring component to verify, within available resources, the status of performance data collected in the department and whether sufficient performance information will be available to support upcoming evaluations. This monitoring component will be included in the 2010–15 Evaluation Plan and will be updated annually thereafter.
Further, for the 2010–11 fiscal year, the Department’s performance measurement framework has been linked to the Department’s program activity architecture, in that performance measures have been identified for all programs.
Fisheries and Oceans Canada’s response. Agreed. The Department’s performance measurement framework links its core indicators to the departmental program activity architecture (PAA), thus identifying performance measures for all program activities and sub-activities. Each fiscal year, the Department conducts an analysis of the state of performance measurement in the Department and provides an annual report to the Departmental Evaluation Committee. In addition, the Department will develop and implement an action plan to ensure that ongoing program performance information is collected to support effectiveness evaluation by the end of August 2010.
Human Resources and Skills Development Canada’s response. Agreed. The Department accepts this recommendation, which echoes the intent of various Treasury Board of Canada Secretariat policies, including the 2009 Evaluation Policy. As such, the Department already gathers and monitors ongoing performance information to support effectiveness evaluation, including
monitoring implementation of management responses to previous evaluations that have identified data issues for current programs;
undertaking early evaluative work in advance of the formal initiation of effectiveness evaluations as part of the evaluation planning process, to review the state of performance data and logic models; and
monitoring the implementation of new programs (for example, Economic Action Plan initiatives) to ensure the necessary administrative and performance data are available to support future evaluation activities.
The Department is also undertaking a comprehensive review, refinement, and validation of its Performance Measurement Framework (PMF) to make it more robust and comprehensive and to support ongoing planning, monitoring, and managing for results; the Evaluation Directorate has been actively involved in this work. The progress made on the departmental PMF will support both performance monitoring and the evaluation of relevance and effectiveness.
Some quality assurance processes are in place
1.38 Quality assurance processes are designed to ensure that quality requirements are being met. In the case of effectiveness evaluation, quality assurance helps to ensure that reports meet defined standards and provide decision makers with reliable and useful evaluation findings.
1.39 We expected the six departments to demonstrate that they had developed systematic quality assurance processes for their effectiveness evaluations. We found that all six departments had quality assurance processes in place. While these processes were not identical in design, we identified a number of common elements: internal evaluation standards, internal review, and external expert review. All six departments conducted internal reviews, and all but two (Agriculture and Agri-Food Canada and Fisheries and Oceans Canada) used external experts, either routinely or selectively, to review evaluations for quality assurance. Despite these quality assurance processes, we found that three quarters of the evaluations in our sample failed to adequately address effectiveness.
1.40 Quality assurance processes may also support the independence of the evaluation unit from program management. Through its 2001 and 2009 evaluation policies, the Treasury Board acknowledged that having a departmental evaluation committee approve evaluation plans and reports supports independence, because these plans and reports are reviewed objectively by department officials who are not program managers.
1.41 We examined the processes that the six departments followed to review and approve evaluation plans and reports. In all six departments, both plans and reports are approved by senior evaluation committees.
1.42 We note that, unlike evaluation committees, the departments’ audit committees are required to have external members. In our view, this is a good practice, because the knowledge and perspectives of practitioners from outside government are being considered. This practice could also have merit for evaluation committees. Under the 2009 Policy on Evaluation, evaluation committees are required to review evaluation plans and reports and recommend their approval by the deputy head. Committees with external members could play a stronger role in the continuous improvement of effectiveness evaluation.
1.43 Recommendation. Agriculture and Agri-Food Canada, Canadian Heritage, Citizenship and Immigration Canada, Environment Canada, Fisheries and Oceans Canada, and Human Resources and Skills Development Canada should consider the merits of including external experts on their departmental evaluation committees. The Treasury Board of Canada Secretariat should provide guidance to departments in this regard.
Agriculture and Agri-Food Canada’s response. Agreed. The Department has recently introduced a number of practices to ensure production of strong evaluation reports. They include seeking input from external experts on evaluations in progress. The Department will also consider including external members on its Departmental Evaluation Committee, in the context of guidance provided by Treasury Board of Canada Secretariat.
Canadian Heritage’s response. Agreed. At the Department, oversight of the evaluation function is provided by the Strategic Policy, Planning and Evaluation Committee (SPPEC), chaired by the Deputy Head. This structure provides opportunities for enhanced integration between the policy, planning, and evaluation functions of the Department. As well, the Department already brings key evaluation reports when necessary to its department audit committee for review and discussion. The Department will collaborate with the Treasury Board of Canada Secretariat’s Centre of Excellence in Evaluation to assess the value added of integrating the advice of external evaluation experts to inform the work of departmental evaluation committees.
Citizenship and Immigration Canada’s response. Agreed. The Department will assess the benefits of including an external evaluation expert on the departmental evaluation committee.
Environment Canada’s response. Agreed. The Department accepts the recommendation and will await guidance from the Treasury Board of Canada Secretariat on the inclusion of external members on departmental evaluation committees.
Fisheries and Oceans Canada’s response. Agreed. The Department will consider the merit of including external members on its department evaluation committee, in the context of guidance provided by the Treasury Board of Canada Secretariat.
Human Resources and Skills Development Canada’s response. Agreed. The Department has, in the past, included members from outside the Department on its departmental evaluation committee. The Department will reconsider the formal inclusion of external experts on the current Departmental Evaluation Committee and will look to the Treasury Board of Canada Secretariat for guidance on this part of the Evaluation Policy.
The Treasury Board of Canada Secretariat’s response. Agreed. The Secretariat agrees that it should provide guidance to departments on the possible merits of including external experts on their departmental evaluation committees. These actions will be completed by 31 March 2010.
Departments are not systematically identifying priorities for improvement
1.44 The Treasury Board of Canada Secretariat carries out annual assessments of department management using indicators that measure each of the 10 elements of the Management Accountability Framework. Among these elements are assessments of evaluation coverage, quality, and use. According to the Secretariat, these assessments help deputy heads identify priorities for management improvement. In addition to these assessments, we expected the six departments to have their own internal processes for determining whether they are meeting needs for effectiveness evaluations.
1.45 As noted earlier (paragraph 1.22), we found that all departments have consultation processes aimed at ensuring that their evaluation plans reflect corporate priorities. In addition, we did see other improvements in some departments, although these were not systematic in nature. Only Environment Canada has a formal process in place to systematically identify aspects of its evaluation practice that require improvement. Environment Canada does a number of things to help ensure that its evaluation practice is oriented toward continuous improvement. For example, client feedback is solicited through post-evaluation surveys that provide ongoing feedback on the quality and value of evaluations.
1.46 In addition, when Environment Canada’s evaluations are completed, lessons-learned exercises are often developed and shared with managers of similar programs and initiatives to enhance the overall utility of evaluation findings. The evaluation unit also developed a self-assessment framework for quality assurance that is based on a self-assessment guide for internal audit. The unit adapted some elements in the guide to apply them to evaluation and it also considered existing standards for evaluation. The results of these activities are communicated to the evaluation committee.
1.47 With the exception of Environment Canada, the audited departments could not demonstrate that they had internal processes in place to systematically identify areas for improvement in effectiveness evaluation during the 2004–05 to 2008–09 period. Such a process enables departments to ensure that effectiveness evaluations follow the management cycle for continuous improvement and become more useful for making key decisions.
1.48 Recommendation. Agriculture and Agri-Food Canada, Canadian Heritage, Citizenship and Immigration Canada, Fisheries and Oceans Canada, and Human Resources and Skills Development Canada should implement systematic processes to determine whether their effectiveness evaluations are meeting government-wide requirements and internal corporate needs, and act on areas identified for improvement. The Treasury Board of Canada Secretariat should monitor and provide any additional support it considers necessary for the implementation of these processes.
Agriculture and Agri-Food Canada’s response. Agreed. The Department accepts this recommendation and notes that in the past year, it has introduced a number of systematic processes, which together will ensure effectiveness evaluations address senior management information needs in a timely manner. They include annual consultations on evaluation priorities, requests for feedback on completed evaluations, and annual Head of Evaluation reports on the performance of the evaluation function.
Canadian Heritage’s response. Agreed. The Department recognizes the need to establish systematic processes to assess whether effectiveness evaluations are addressing needs. The Evaluation Services Directorate is already developing a performance measurement framework and management strategy to identify clear performance expectations and standards for the Department’s evaluation function. A systematic process to collect, analyze, and report on performance data and client satisfaction will be implemented in order to identify areas of improvements. Based on data collected during the first year of implementation (the 2010–11 fiscal year), periodic reporting to the Departmental Evaluation Committee should begin by the 2011–12 fiscal year. This performance information will complement data already reported in the context of the annual management accountability framework assessments conducted by the Treasury Board of Canada Secretariat.
Citizenship and Immigration Canada’s response. Agreed. The Department will assess the need for systematic processes in addition to the Management Accountability Framework assessment process, the oversight of the Departmental Evaluation Committee, requirements for annual reporting, and the Department’s evaluation process, which includes several steps of consultation and feedback from the Department’s branches.
Fisheries and Oceans Canada’s response. Agreed. The recommendation echoes the intent of the 2009 Policy on Evaluation. As such, actions are already under way within the Department to implement a systematic process to determine whether effectiveness evaluations are meeting internal corporate needs and government-wide needs (i.e., a strategic review) and to act on areas identified for improvement. Further work in this area will be completed in the context of guidance provided by the Treasury Board of Canada Secretariat.
Human Resources and Skills Development Canada’s response. Agreed. The Department currently employs a variety of systematic processes to ensure the quality and relevance of its evaluations for both internal and government-wide needs. External peer reviewers and evaluation advisory committees are a mandatory element of the evaluation work to ensure that evaluations are meeting information needs. Further work in systematically determining whether evaluations are meeting government and senior management needs will be developed, building upon the current annual Management Accountability Framework assessment process led by the Treasury Board of Canada Secretariat. Areas for improvement will be identified and reported to the Departmental Evaluation Committee.
The Treasury Board of Canada Secretariat’s response. Agreed. The Secretariat agrees that it should assist departments as necessary in their implementation of processes to determine whether evaluations are meeting government-wide needs and should provide support to departments that it considers necessary.
The new Treasury Board Policy on Evaluation, which came into effect on 1 April 2009, includes specific requirements to enhance evaluation coverage and examination of program effectiveness as well as performance information to support evaluations. The policy calls on the Secretary of the Treasury Board to provide functional leadership for evaluation across government, including monitoring and reporting annually to the Treasury Board on the health of the evaluation function.
The Secretariat currently carries out a large portion of this work through the annual Management Accountability Framework assessment process, which the Office of the Auditor General acknowledges was not covered by the current audit. The Secretariat communicates recommended areas for improvement in evaluation through the assessment reports that it sends to the deputy heads of departments and agencies, who are responsible for the evaluation function in their respective organizations.
Capacity for effectiveness evaluation
1.49 We define capacity as sufficient qualified evaluation staff and funding to meet needs for effectiveness evaluation. We examined staffing and funding for program evaluation in the six departments over the five-year period audited to determine whether they were able to hire enough staff and to address areas for improvement.
1.50 We found that, despite having increased funding and staffing, the audited departments found it challenging to hire enough qualified, experienced evaluation staff to meet needs for effectiveness evaluation, and they had not been able to regularly address areas for improvement.
Funding increases have enhanced departments’ capacity for evaluation
1.51 Evaluation unit funding increased over the audit period in all departments audited except Fisheries and Oceans Canada (Exhibit 1.6). This includes funding from the Treasury Board that is intended to enable departments to implement the 2001 Evaluation Policy, and funding provided following the enactment of the Federal Accountability Act for the evaluation of all ongoing grant and contribution programs.
Exhibit 1.6—Evaluation unit funding increased in most departments

Department | Unit funding in 2004–05 | Unit funding in 2008–09
Agriculture and Agri-Food Canada | $1,029,000 | $1,894,000
Canadian Heritage | $2,499,000 | $3,123,000
Citizenship and Immigration Canada | $650,000 | $1,987,000
Environment Canada | $732,000 | $1,383,000
Fisheries and Oceans Canada | $1,248,000 | $1,162,000
Human Resources and Skills Development Canada | $10,750,000 | $13,924,000

Source: Data collection forms completed by departments
Shortage of experienced evaluators continues
1.52 The shortage of experienced program evaluators in the federal government is a long-standing concern. It has been noted in past Office of the Auditor General audits and in diagnostic studies by the Treasury Board of Canada Secretariat, and it was the subject of recent discussions within the federal evaluation community. A 2005 report by the Secretariat Centre of Excellence for Evaluation stated that “[t]he scarcity of evaluation personnel is probably the number one issue facing Heads of Evaluation.”
1.53 We found that during the period covered by this audit, the number of professional staff working in the evaluation units in the six departments had increased substantially (Exhibit 1.7). This trend was also evident in our analysis of descriptive data provided by the remaining large federal departments and agencies, which we did not audit.
Exhibit 1.7—The estimated number of evaluation unit professional staff increased in each department between the 2004–05 and 2008–09 fiscal years

Department | Number of professional staff in 2004–05* | Number of professional staff in 2008–09*
Agriculture and Agri-Food Canada | 6.0 | 11.0
Canadian Heritage | 8.1 | 13.0
Citizenship and Immigration Canada | 3.3 | 12.5
Environment Canada | 4.0 | 10.0
Fisheries and Oceans Canada | 4.0 | 7.0
Human Resources and Skills Development Canada | 44.5 | 54.0
Other large departments | 176.3 | 296.7

* Full-time equivalents
Source: Data collection forms completed by departments
1.54 According to officials in the six departments, despite these increases in both funding and staff, it remains a challenge to find experienced evaluators, particularly at the senior levels. In their view, the shortage of experienced evaluators has affected their ability to hire the people they need. For example, in one collective staffing process, the pool of experienced evaluators was depleted before the demand was met. They also indicated that the shortage of experienced evaluators has led to evaluators being hired away by other federal evaluation units.
Evaluator competencies are not adequately defined
1.55 In diagnostic studies carried out by the Secretariat in 2005, deputy heads of departments identified the shortage of qualified evaluators as contributing to the inconsistent quality of evaluations. Competency profiles can help address this gap, by identifying training and development needs and by informing staffing efforts. We examined whether the six departments had addressed the challenge of finding the right people to do the work, by developing competency profiles for their evaluation unit staff.
1.56 Officials told us that they had begun to develop competency profiles. However, these efforts were discontinued while the Secretariat undertook related work, which recently resulted in a draft competency profile. The development of these profiles has been hindered by a lack of agreement about the competencies that evaluators require.
Other responsibilities of evaluation units put pressure on capacity
1.57 According to the Secretariat, evaluation units have assumed a number of responsibilities in addition to the traditional evaluation studies, including
preparing evaluation planning reports and assessments,
developing results-based management accountability frameworks, and
providing advice and training to program managers on evaluation and performance measurement.
1.58 We interviewed officials in the six audited departments to find out how much time is spent conducting effectiveness evaluations. We found that Environment Canada is the only department that conducts formal time recording of evaluation unit tasks. It was also the only department that was able to provide us with detailed data, from which we determined that about 40 percent of its time was spent on tasks other than evaluation.
1.59 The estimates provided by the other departments indicated that they spent about the same amount of time on such tasks. While we recognize the potential value and importance of these other tasks, they nevertheless have an impact on the capacity of evaluation units to meet identified needs for effectiveness evaluation.
Departments use contractors extensively
1.60 Another diagnostic study reported that a sample of deputy heads thought that evaluation units could make a more significant contribution and could strengthen staff capacity by conducting more evaluations in-house. According to this study, deputy heads were concerned that contracting out evaluation studies prevents evaluation units from becoming their department’s subject matter experts, since much of what is learned remains with the contractor rather than with the unit.
1.61 We looked at the evaluations conducted by the six audited departments during the audited period, between the 2004–05 and 2008–09 fiscal years, to determine whether they were conducted by contractors, in-house employees, or both. While about 90 percent of the evaluations were wholly or partially conducted by contractors (Exhibit 1.8), this varied among the audited departments. In Fisheries and Oceans Canada, for example, the figure was 37 percent, while in other departments the figures were close to 100 percent. This pattern was also evident in our analysis of descriptive data provided by the remaining large federal departments and agencies, which we did not audit.
Exhibit 1.8—Evaluations conducted by contractors in whole or in part in the audited departments
*Contractor figures include evaluations conducted either in whole or in part by contractors.
1.62 Although officials recognized the value of developing in-house capacity, they also informed us that they required contractors for specific technical or subject matter expertise that was not feasible to maintain in-house.
Oversight and support
1.63 The Treasury Board of Canada Secretariat has described oversight as one of its central agency roles. Oversight includes policy development, monitoring, and reporting on management and budgetary performance within government. The Secretariat is responsible for the oversight of management policy development and the financial oversight of expenditure management. The Secretariat’s other central agency roles are leadership in setting the agenda for management, and helping departments and agencies improve their performance.
1.64 Under the 2001 Evaluation Policy, the Secretariat was required to provide central direction for the evaluation function by
establishing a Centre of Excellence for Evaluation to provide leadership, guidance, and support to the practice of evaluation;
using evaluation results, where appropriate, in decision making at the Centre;
setting standards; and
monitoring evaluation capacity in the government.
The Secretariat has identified a number of improvements in evaluation
1.65 We expected the Secretariat to support government-wide evaluation practices by identifying needed improvements and determining and carrying out actions required of the Secretariat, to help to ensure that departments and agencies have the tools they need to achieve the desired results.
1.66 In 2004, the Secretariat did an interim evaluation of the 2001 Evaluation Policy and found gaps in budget and human resources. This evaluation called for the Centre of Excellence for Evaluation to play a leadership role in helping the evaluation community, by
advocating the importance of evaluation to senior managers for decision making;
continuing to help with system-wide capacity building;
continuing to develop training, tools, and guides to support policy implementation; and
identifying best practices for the evaluation community.
1.67 Between the 2004–05 and 2006–07 fiscal years, the Secretariat also carried out several diagnostic studies that included interviews with deputy heads, clients, and stakeholders, as well as inquiries into the professionalism and the overall role of evaluation in the federal government. These studies identified necessary improvements.
1.68 We noted a number of specific initiatives related to evaluation and diagnostic studies, including new guidance and help given to departments for recruiting and training evaluators.
The Secretariat carried out extensive monitoring of the evaluation function
1.69 By the 2004–05 fiscal year, the Secretariat had developed several monitoring tools to collect information about the evaluation function across government. Secretariat officials identified similar monitoring activities in 2009:
annual capacity assessment survey;
Evaluation Information Resource Component—a database;
periodic in-depth review of the quality of evaluation reports;
ongoing reviews of evaluations, results-based management accountability frameworks, and departmental evaluation plans;
department and agency visits and interaction;
feedback to individual entities;
communication of best practices in evaluation methodology and reporting, and in managing the evaluation function;
review of the evaluation content of departmental Estimates documents (Report on Plans and Priorities and the Departmental Performance Report); and
Management Accountability Framework assessments.
1.70 The Secretariat describes the Management Accountability Framework as one of several tools it uses to assess management performance in departments. Although we did not audit the Framework, we noted that the Secretariat has used it in several ways, including changing the Framework itself, for example, by refining the assessment ratings. Program sector analysts and analysts at the Centre of Excellence for Evaluation also cited the knowledge gained from the Framework’s assessments as potentially helpful, as it could be used to improve guidance and tools, for example, in the development of the Standard on Evaluation that was issued with the 2009 Policy on Evaluation. However, we also noted that, because this knowledge was not always documented, its impact was not always clear.
1.71 The Secretariat’s first (and, to date, only) published report on the evaluation function—The Health of the Evaluation Function in the Government of Canada Report for Fiscal Year 2004–05—was based on the first of the annual capacity assessment surveys. These surveys are used to collect information on evaluation infrastructure, resources, production, and results and, for larger departments, to collect information on evaluation planning and resource requirements. The Secretariat has conducted the capacity assessment survey every year since the 2004–05 fiscal year.
1.72 The diagnostic studies, the Health of the Evaluation Function Report, and ongoing monitoring provided an analytical base for the first Government of Canada Evaluation Plan, 2005–06, which was developed in September 2005. Once again, this was the only one of its kind; no other such plan appeared in the period ending 31 March 2009.
1.73 The Secretariat carried out extensive monitoring. This monitoring, annual reporting on the evaluation function’s health (see paragraph 1.89), and the development of a government-wide plan were recognized as requirements in the 2009 Policy on Evaluation, as well as in earlier drafts of the policy. However, over the period of our audit, the Secretariat developed only one such report and plan. For the purposes of continuous improvement, it will be important for the Secretariat to pursue these oversight activities, with departments, on a regular and systematic basis.
Sustained support for effectiveness evaluation is lacking
1.74 The Treasury Board’s 2001 Policy on Evaluation advised that evaluations consider program relevance, success, and cost-effectiveness. We therefore expected the Treasury Board of Canada Secretariat to provide guidance to departments and to help them identify these elements and improve effectiveness evaluation.
1.75 One area the Secretariat has clearly identified is the need to improve the use of evaluation for expenditure review. In 2005, a discussion document from the President of the Treasury Board, Management in the Government of Canada: A Commitment to Continuous Improvement, stressed that evaluation must become more directly linked to decisions about resource allocation.
1.76 In our audit of the expenditure management system (EMS) (November 2006 Report of the Auditor General of Canada, Chapter 1—Expenditure Management System at the Government Centre), we recommended a systematic review of the relevance and value-for-money of ongoing programs. The Public Accounts Committee pursued this issue in 2008 by recommending that the Secretariat develop an action plan to hire and train the necessary evaluators and that it reinforce the importance of evaluation as a key requirement in the EMS.
1.77 The government accepted these recommendations and launched the new expenditure management system, which included strategic review. It viewed evaluation as the primary source of neutral and systematic information on the ongoing relevance and performance of policies and programs. In light of these developments, we looked for the Secretariat’s actions and initiatives in support of effectiveness evaluation.
1.78 We noted an initiative to develop a value-for-money tool for evaluation, launched as a pilot project in 2006. However, this initiative came to an abrupt halt in 2008 and, over the period of our audit, did not move beyond the pilot stage. The Secretariat found that the tool was being used to meet minimum data requirements, rather than the new policy’s objective of examining effectiveness issues. As we completed our audit, Secretariat officials informed us that development of the project is continuing and that the tool is being revised to take into account the requirements of the new policy on evaluation.
1.79 The Centre of Excellence for Evaluation has undertaken a number of initiatives related to capacity development, but during our audit we found that the results of these initiatives were unclear. We noted policy requirements and related initiatives for evaluators to support the measurement of program performance. However, we did not find that the Secretariat had made progress in developing tools to help departments address the long-standing problem of insufficient data for the evaluation of effectiveness.
1.80 While the Secretariat recognized the value of effectiveness evaluation, particularly for expenditure management, it did not issue, over the period of our audit, adequate guidance or tools to support effectiveness evaluation.
1.81 The renewal of the 2001 Policy on Evaluation took far longer than expected. We found documentation from 2006, 2007, and 2008 indicating the policy would be completed in each of these years. In fact, the new policy took effect only on 1 April 2009. However, at the time of our audit, despite the three years that were spent developing the policy, the Secretariat had not issued guidance to departments on its implementation.
1.82 Recommendation. In developing tools, guidance, and support for departments, the Treasury Board of Canada Secretariat should regularly identify gaps that it needs to act on, develop plans to address these gaps, and act on these plans.
The Treasury Board of Canada Secretariat’s response. Agreed. The Secretariat is presently developing guidance for departments and agencies to support the implementation of the new Treasury Board Policy on Evaluation (April 2009). Guidance in a number of key areas, including departmental evaluation planning and performance measurement strategies, is expected to be available to departments and agencies by 31 March 2010. Further guidance will be issued over the course of the 2010–11 fiscal year.
During the past two years, the Treasury Board of Canada Secretariat has issued guidance on the development of performance measurement frameworks required under the 2005 Policy on Management Resources and Results Structures, which was not covered by the current audit. In addition, through its annual Management Accountability Framework assessments, the Secretariat provides advice and support to departments on the quality and use of evaluation as well as on the quality of performance measurement frameworks associated with departmental Management, Resources and Results Structures.
The Secretariat continually consults with departments and agencies on their needs and monitors policy implementation to identify weaknesses within the government-wide evaluation function. Where the Secretariat determines a need to develop further tools, guidance, or other supports, it includes these activities in its business plans.
Oversight and support require experienced staff
1.83 We examined whether the Treasury Board of Canada Secretariat had the human and financial resources needed for government-wide oversight and support of the program evaluation function.
1.84 The Centre of Excellence for Evaluation’s major responsibilities include
renewing the evaluation policy;
acting as a policy centre, by providing advice and analysis to support the expenditure management system;
conducting Management Accountability Framework assessments of the evaluation function; and
providing support for the evaluation function across government.
1.85 In addition, the Centre monitors the evaluation policy. Many of these tasks are analytical and call for experienced personnel. The review of evaluations, program accountability frameworks, and submissions requires sufficient expertise to provide recommendations and guidance to program sector analysts as well as to the Treasury Board itself.
1.86 We compiled information on the Centre’s workload and compared it to the resources allocated over the audit period. The funding for oversight work is largely salary-based. For the period of our audit, the staff complement of the Centre varied in size, from only 8 in the 2005–06 and 2006–07 fiscal years to 12 in the 2008–09 fiscal year. Overall, the staff levels during the 2005–06 to 2008–09 period were lower than the 15 on staff in the 2004–05 fiscal year, even though the Centre workload pertaining to its oversight activities was increasing. For example, the number of Treasury Board submissions reviewed by the Centre almost doubled from the 2004–05 to 2008–09 fiscal years.
1.87 The Secretariat clearly requires experienced analysts with appropriate expertise in evaluation in order to meet workload demands. The limited number of staff allocated to these functions during the audited period may have contributed to the lack of sustained support for effectiveness evaluation.
1.88 Recommendation. The Treasury Board of Canada Secretariat should ensure that it allocates sufficient resources to tasks that require evaluation expertise.
The Treasury Board of Canada Secretariat’s response. Agreed. In renewing the Policy on Evaluation, the Government has strengthened its commitment to evaluating the value for money of federal programs and reaffirmed the Secretariat’s role of leading the evaluation function. In performing its functional leadership role established in the new policy, the Treasury Board of Canada Secretariat will ensure that the resources necessary for performing this role are considered and sufficient resources are allocated at the Secretariat to tasks that require evaluation expertise.
Care is needed in the implementation of the new coverage requirements
1.89 By 2006, the Secretariat had completed its diagnostic work and issued its Health of the Evaluation Function Report. When the legal requirement that all ongoing grant and contribution programs (a part of direct program spending) be evaluated was enacted in 2006, it created pressure for 100 percent coverage, in parallel with the later reform of the expenditure management system. The Secretariat knew that departments were hard pressed to meet this requirement and that it was especially challenging for audited departments with a high proportion of such programs, although additional central funding was provided in 2007 to support meeting it. In particular, as the Secretariat itself noted, requiring full coverage made it more difficult to target evaluation efforts on the basis of risk.
1.90 We noted that deputy heads of departments remain responsible for implementing the 2009 evaluation policy, including the expanded coverage requirements. It will be important for the Secretariat to work with departments to ensure that they are fully prepared to implement these coverage requirements, in order to meet the expectations set out in the new evaluation policy.
1.91 The implementation of the new coverage requirement faces serious challenges. Earlier requirements for full coverage were never met. Current legal requirements for effectiveness evaluation of all grants and contributions programs have been difficult to meet, and department officials told us that they have concerns about their capacity to respond to these requirements. Moreover, we found a shortage of experienced evaluators and extensive use of contractors over the period audited. For example, Environment Canada estimated that it would have to double the complement of its evaluation unit over the next four years, or sacrifice evaluation depth in order to achieve full coverage.
1.92 In our view, it will be important for the Secretariat and departments to carry out effectiveness evaluation of programs that are susceptible to significant change because of shifting priorities and circumstances. These are programs where evaluations of the relevance, impact, and achievement of objectives can be put to best use. During the transition to full coverage, these programs may present the biggest opportunities for effectiveness evaluation.
1.93 Recommendation. The Treasury Board of Canada Secretariat should help departments prepare to implement the new coverage requirements. During the transition period, the Secretariat should provide advice and guidance for effectiveness evaluation, focusing on programs where such evaluation can be put to best use.
The Treasury Board of Canada Secretariat’s response. Agreed. The Secretariat is planning to issue written guidance for making risk-based choices for evaluation coverage to support departments during the transition period. This guidance is expected by 31 March 2010.
Throughout the transition period, the Secretariat will also help departments prepare to implement the new coverage requirements that come into effect after 31 March 2013. The Secretariat will provide leadership in the development and sharing of effective evaluation practices across departments, as well as support capacity-building initiatives in the evaluation function government-wide.
Conclusion
1.94 The six departments included in this audit—Agriculture and Agri-Food Canada, Canadian Heritage, Citizenship and Immigration Canada, Environment Canada, Fisheries and Oceans Canada, and Human Resources and Skills Development Canada—followed systematic processes to plan their effectiveness evaluations. As well, most planned evaluations were completed. However, the evaluations conducted by these departments covered only a low proportion of overall departmental expenses, and most of the evaluations we examined were hampered by inadequate data. As a result, departments were not able to demonstrate that they were sufficiently meeting needs for effectiveness evaluation.
1.95 Based on a sample of effectiveness evaluations, we found that the audited departments often did not have the necessary performance information to evaluate whether programs are effective.
1.96 Moreover, with the exception of Environment Canada, which has processes in place to identify needed improvements, the audited departments did not demonstrate that they had regularly identified and addressed areas for improvement in effectiveness evaluation during the audit period. Such a cycle of continuous improvement would steadily add value to effectiveness evaluation.
1.97 The departments we examined expressed concerns about their capacity to implement evaluation of all direct program spending, as required under the 2009 Policy on Evaluation. Even before these expanded requirements, they found it challenging to hire enough experienced evaluators to fully meet needs for effectiveness evaluation, and they had not been able to regularly address areas for improvement. In our view, identifying programs where effectiveness information can be put to the best use will be a key part of implementing the coverage requirements of this policy.
1.98 Over the past five years, the Treasury Board of Canada Secretariat has introduced initiatives to address improvements in evaluation. However, support for effectiveness evaluation, which is an important area for the Secretariat, did not receive sustained attention. While the Secretariat did regularly identify areas for improvement, it did not provide adequate guidance.
1.99 Overall, we found that, in the six departments we audited, needs for effectiveness evaluation were not being adequately met. Improvements are required in departments, and in the oversight and support activities of the Treasury Board of Canada Secretariat, in order to remedy the situation.
1.100 These findings are similar to many reported by the Office in previous audits of program evaluation. Taken together, they raise basic questions about effectiveness evaluation in the federal government.
1.101 In our view, the federal evaluation function is at a crossroads. A vital public purpose is served when effectiveness evaluation informs the important decisions that Canadians are facing. Departments face greater expectations than ever before and are taking on added responsibilities. Much remains to be done to meet the challenge. Continuous improvement is the way forward.
About the Audit
All of the audit work in this chapter was conducted in accordance with the standards for assurance engagements set by The Canadian Institute of Chartered Accountants. While the Office adopts these standards as the minimum requirement for our audits, we also draw upon the standards and practices of other disciplines.
Objectives
The overall objective of this audit was to determine whether selected departments and the Treasury Board of Canada Secretariat are meeting the needs for effectiveness evaluation and are identifying and making improvements in effectiveness evaluation.
The audit objectives for the three lines of enquiry were as follows:
Determine whether selected departments can demonstrate that they are meeting needs for effectiveness evaluation and regularly identify and address areas for improvement.
Determine whether selected departments can demonstrate that they have the capacity to meet key needs for effectiveness evaluation and regularly identify and address areas for improvement.
Determine whether the Treasury Board of Canada Secretariat’s government-wide oversight of the program evaluation function has regularly identified and addressed areas for improvement that ensure that departments have the capacity to meet needs for effectiveness evaluation.
Scope and approach
Focus on effectiveness evaluation. Based on the Auditor General Act, section 7(2)(e), this audit examined program evaluation government-wide, in relation to the measurement of program effectiveness (meaning the assessment of the extent to which programs are relevant, produce impacts and outcomes, achieve their objectives, and are cost effective). Other types of evaluation examine program implementation and management.
Not a compliance audit. Because the 2001 Treasury Board evaluation policy was replaced in April 2009, the audit did not focus on compliance with policy.
Evaluation quality. We examined evaluation quality by determining whether departments had processes in place, including quality assurance, to ensure that their effectiveness evaluations are appropriate for their intended uses.
Strategic review and the Expenditure Management System. In view of the link between the Expenditure Management System and evaluation (that is, evaluation is seen as a key source of information for strategic review), the audit sought to determine whether evaluations met the need created by strategic review.
Selection of entities. The entities selected for examination in the audit were the Treasury Board of Canada Secretariat and the following six departments:
Agriculture and Agri-Food Canada,
Canadian Heritage,
Citizenship and Immigration Canada,
Environment Canada,
Fisheries and Oceans Canada, and
Human Resources and Skills Development Canada.
The selection of departments was based on factors such as materiality, range of program types, nature of the evaluation function, Management Accountability Framework (MAF) ratings, and whether a strategic review had been carried out. Our audit also included descriptive information provided by other large departments (those participating in the annual MAF process), which attested to the accuracy of the information they provided.
During our audit, we conducted interviews, reviewed files and documents, analyzed descriptive information provided and attested to by large departments, and met with focus groups who provided us with an informed stakeholder perspective.
Period covered by the audit
The audit covered the period between the 2004–05 and 2008–09 fiscal years. This period was chosen because enough time had elapsed since the introduction of the 2001 policy for its effects to take hold. The 2004–05 fiscal year was also when the Treasury Board of Canada Secretariat carried out an interim evaluation of the 2001 policy and engaged contractors to complete a series of diagnostic studies aimed at understanding the “state of play” of the function at that time. Focusing on the period between the 2005–06 and 2008–09 fiscal years allowed us to examine the impact of the Secretariat’s efforts to understand how well the 2001 policy was working, including how the Secretariat identified problems between the 2002–03 and 2004–05 fiscal years and addressed these problems between the 2004–05 and 2008–09 fiscal years.
Audit work for this chapter was substantially completed on 31 May 2009.
Criteria
Listed below are the criteria that were used to conduct this audit and their sources.
Criteria
Sources
We expected that departments could demonstrate that program evaluation plans take appropriate account of needs for effectiveness evaluation.
Auditor General Act, section 7(2)(e)
Financial Administration Act, section 42(1)
Treasury Board Evaluation Policy (2001)
Treasury Board of Canada Secretariat, Centre of Excellence for Evaluation, Evaluation Function in the Government of Canada (2004), Appendix 2, Evaluation Standards in the Government of Canada
Treasury Board of Canada Secretariat, Centre of Excellence for Evaluation, A Guide to Developing a Risk-Based Departmental Evaluation Plan (2005), section 3.2
Treasury Board Policy on Transfer Payments (2008)
We expected that departments could demonstrate that they have acted on program evaluation plans to meet key needs.
Treasury Board Evaluation Policy (2001)
Treasury Board of Canada Secretariat, Centre of Excellence for Evaluation, A Guide to Developing a Risk-Based Departmental Evaluation Plan (2005), page 14
We expected that departments could demonstrate that their effectiveness evaluations appropriately meet identified needs.
Treasury Board Evaluation Policy (2001)
Treasury Board of Canada Secretariat, Centre of Excellence for Evaluation, Evaluation Function in the Government of Canada (2004), Appendix 2, Evaluation Standards in the Government of Canada
We expected that departments could demonstrate that they regularly identify and act on required improvements in meeting needs for effectiveness evaluation.
Treasury Board Evaluation Policy (2001)
Treasury Board of Canada Secretariat, Centre of Excellence for Evaluation, Evaluation Function in the Government of Canada (2004), Appendix 2, Evaluation Standards in the Government of Canada
We expected that departments could demonstrate reasonable efforts to ensure sufficient qualified evaluation staff to meet key needs for effectiveness evaluation.
Treasury Board Evaluation Policy (2001)
Treasury Board of Canada Secretariat, Centre of Excellence for Evaluation, Evaluation Function in the Government of Canada (2004), Appendix 2, Evaluation Standards in the Government of Canada
Treasury Board of Canada Secretariat, People Component of the Management Accountability Framework (PCMAF) (2005), page 1
Public Service Commission of Canada, Staffing Management Accountability Framework (SMAF) (2005), page 4
Government Response to the Fourth Report of the Standing Committee on Public Accounts: The Expenditure Management System at the Government Centre and the Expenditure Management System in Departments (2008), page 8
We expected departments could demonstrate that the amount and the time frame of funding for effectiveness evaluation meet key needs.
Treasury Board of Canada Secretariat, Centre of Excellence for Evaluation, A Guide to Developing a Risk-Based Departmental Evaluation Plan (2005), pages 5 and 13
We expected that departments could demonstrate that evaluators have sufficient independence from program managers and that their objectivity is not hindered.
Treasury Board Evaluation Policy (2001)
Treasury Board of Canada Secretariat, Centre of Excellence for Evaluation, Evaluation Function in the Government of Canada (2004), Appendix 2, Evaluation Standards in the Government of Canada
We expected that departments could demonstrate that they regularly identify and act on required improvements to capacity to meet needs for effectiveness evaluation.
Treasury Board Evaluation Policy (2001)
Public Service Commission of Canada, Staffing Management Accountability Framework (SMAF), page 5
We expected that the Treasury Board of Canada Secretariat has the resources required for government-wide oversight of the program evaluation function.
Financial Administration Act, section 6(7)
We expected that the Treasury Board of Canada Secretariat could support the practice of government-wide evaluation by identifying needed improvements and determining and carrying out actions required of the Secretariat to help ensure that departments and agencies have the tools they need to achieve the desired results.
Treasury Board Evaluation Policy (2001)
The Standing Committee on Public Accounts, Report on the Expenditure Management System at the Government Centre and the Expenditure Management System in Departments (2008), page 16
Management reviewed and accepted the suitability of the criteria used in the audit.
Audit team
Assistant Auditor General: Neil Maxwell
Principal: Tom Wileman
Lead Director: Colin Meredith
Directors: Doreen Deveen, Leslie Levita
Irene Andayo, Helene Charest, Jeff Graham, Krista Hilge, Chandrawattie Samaroo
For information, please contact Communications at 613-995-3708 or 1-888-761-5953 (toll-free).
Appendix—List of recommendations
The following is a list of recommendations found in Chapter 1. The number in front of the recommendation indicates the paragraph where it appears in the chapter. The numbers in parentheses indicate the paragraphs where the topic is discussed.
Recommendation
Response
Meeting needs for effectiveness evaluation
1.37 Agriculture and Agri-Food Canada, Canadian Heritage, Citizenship and Immigration Canada, Environment Canada, Fisheries and Oceans Canada, and Human Resources and Skills Development Canada should develop and implement action plans to ensure that ongoing program performance information is collected to support effectiveness evaluation. (1.14–1.36)
Agriculture and Agri-Food Canada’s response. Agreed. The Department agrees that the systematic collection of program performance data by managers is necessary to report on program performance and to support effectiveness evaluations.
As required by the Treasury Board Policy on Transfer Payments, a performance measurement strategy for ongoing management of transfer payment programs, including performance measures and indicators and a data collection strategy, is developed for each new transfer payment program.
The Department’s evaluation function reviews program performance measurement strategies as they are developed to ensure that outcomes are defined, measurable, and attributable and that the strategies, if implemented, are sufficient to support future evaluation work.
Beginning in the 2009–10 fiscal year, the Department will conduct annual state of performance measurement reviews. The first such review will assess performance measurement practices at the Department and the adequacy of data collected for programs soon to be evaluated. An action plan to address its recommendations will be developed and its implementation monitored with a view to strengthening the Department’s practices in this area.
Canadian Heritage’s response. Agreed. The Department is executing its Action Plan for Implementation of the Management, Resources and Results Structures (MRRS) Policy to ensure that program staff are able to fulfill their responsibility for developing and maintaining performance measurement strategies. Action plan measures include
the provision of information sessions and workshops,
the establishment of indicators and corresponding targets,
the development of robust methodologies to demonstrate outcomes,
the establishment of a centre of expertise on performance measurement within the Department,
the design and implementation of adequate tools and guidelines,
the establishment of relevant information technology and systems, and
the regular analysis and reporting of collected data.
This action plan is expected to be completed by the end of the 2011–12 fiscal year.
The Department’s Office of the Chief Audit and Evaluation Executive is continually providing advice and support to Department managers in their efforts to implement this action plan. In line with the Transfer Payment Policy, this office is also providing timely advice and support on program design and performance measurement strategies through the review of official approval documents for the creation and renewal of new or existing programs. Finally, as required under the new Evaluation Policy, the office will be submitting, in the 2010–11 fiscal year, its first annual report to the departmental evaluation committee on the state of performance measurement in the Department.
Citizenship and Immigration Canada’s response. Agreed. The Department recognizes the value of ongoing performance information for evaluation and will continue to support related departmental activities. The Department will develop an action plan that includes the renewal of the comprehensive departmental performance measurement framework and program activity architecture, and furthers the integration of the Framework into the business planning process.
Environment Canada’s response. Agreed. The Department accepts this recommendation, which echoes the intent of the 2009 Evaluation Policy related to performance information. As such, actions are already under way within Environment Canada to implement this recommendation. These include ongoing monitoring of the implementation of management responses to previous evaluations that have identified concerns with performance information, and the development and implementation of a strategy to inform all department managers of the Evaluation Policy requirements pertaining to performance measurement.
In addition, the Department’s evaluation plan will be expanded to include a monitoring component to verify, within available resources, the status of performance data collected in the department and whether sufficient performance information will be available to support upcoming evaluations. This monitoring component will be included in the 2010–15 Evaluation Plan and will be updated annually thereafter.
Further, for the 2010–11 fiscal year, the Department’s performance measurement framework has been linked to the Department’s program activity architecture, in that performance measures have been identified for all programs.
Fisheries and Oceans Canada’s response. Agreed. The Department’s performance measurement framework links its core indicators to the departmental program activity architecture (PAA), thus identifying performance measures for all program activities and sub-activities. Each fiscal year, the Department conducts an analysis of the state of performance measurement in the Department and provides an annual report to the Departmental Evaluation Committee. In addition, the Department will develop and implement an action plan to ensure that ongoing program performance information is collected to support effectiveness evaluation by the end of August 2010.
Human Resources and Skills Development Canada’s response. Agreed. The Department accepts this recommendation, which echoes the intent of various Treasury Board of Canada Secretariat policies, including the 2009 Evaluation Policy. As such, the Department already gathers and monitors ongoing performance information to support effectiveness evaluation, including
monitoring implementation of management responses to previous evaluations that have identified data issues for current programs;
undertaking early evaluative work in advance of the formal initiation of effectiveness evaluations as part of the evaluation planning process, to review the state of performance data and logic models; and
monitoring the implementation of new programs (for example, Economic Action Plan initiatives) to ensure the necessary administrative and performance data are available to support future evaluation activities.
The Department is also undertaking a comprehensive review, refinement, and validation of its Performance Measurement Framework (PMF) to make it more robust and comprehensive and to support ongoing planning, monitoring, and managing for results; the Evaluation Directorate has been actively involved in this work. The progress made on the departmental PMF will support both performance monitoring and the evaluation of relevance and effectiveness.
1.43 Agriculture and Agri-Food Canada, Canadian Heritage, Citizenship and Immigration Canada, Environment Canada, Fisheries and Oceans Canada, and Human Resources and Skills Development Canada should consider the merits of including external experts on their departmental evaluation committees. The Treasury Board of Canada Secretariat should provide guidance to departments in this regard. (1.38–1.42)
Agriculture and Agri-Food Canada’s response. Agreed. The Department has recently introduced a number of practices to ensure production of strong evaluation reports. They include seeking input from external experts on evaluations in progress. The Department will also consider including external members on its Departmental Evaluation Committee, in the context of guidance provided by Treasury Board of Canada Secretariat.
Canadian Heritage’s response. Agreed. At the Department, oversight of the evaluation function is provided by the Strategic Policy, Planning and Evaluation Committee (SPPEC), chaired by the Deputy Head. This structure provides opportunities for enhanced integration among the Department’s policy, planning, and evaluation functions. As well, the Department already brings key evaluation reports to its departmental audit committee for review and discussion when necessary. The Department will collaborate with the Treasury Board of Canada Secretariat’s Centre of Excellence for Evaluation to assess the value of integrating advice from external evaluation experts into the work of departmental evaluation committees.
Citizenship and Immigration Canada’s response. Agreed. The Department will assess the benefits of including an external evaluation expert on the departmental evaluation committee.
Environment Canada’s response. Agreed. The Department accepts the recommendation and will await guidance from the Treasury Board of Canada Secretariat on the inclusion of external members on departmental evaluation committees.
Fisheries and Oceans Canada’s response. Agreed. The Department will consider the merits of including external members on its departmental evaluation committee, in the context of guidance provided by the Treasury Board of Canada Secretariat.
Human Resources and Skills Development Canada’s response. Agreed. The Department has, in the past, included members from outside the Department on its departmental evaluation committee. The Department will reconsider the formal inclusion of external experts on the current Departmental Evaluation Committee and will look to the Treasury Board of Canada Secretariat for guidance on this part of the Evaluation Policy.
The Treasury Board of Canada Secretariat’s response. Agreed. The Secretariat agrees that it should provide guidance to departments on the possible merits of including external experts on their departmental evaluation committees. These actions will be completed by 31 March 2010.
1.48 Agriculture and Agri-Food Canada, Canadian Heritage, Citizenship and Immigration Canada, Fisheries and Oceans Canada, and Human Resources and Skills Development Canada should implement systematic processes to determine whether their effectiveness evaluations are meeting government-wide requirements and internal corporate needs, and act on areas identified for improvement. The Treasury Board of Canada Secretariat should monitor and provide any additional support it considers necessary for the implementation of these processes. (1.44–1.47)
Agriculture and Agri-Food Canada’s response. Agreed. The Department accepts this recommendation and notes that in the past year, it has introduced a number of systematic processes, which together will ensure effectiveness evaluations address senior management information needs in a timely manner. They include annual consultations on evaluation priorities, requests for feedback on completed evaluations, and annual Head of Evaluation reports on the performance of the evaluation function.
Canadian Heritage’s response. Agreed. The Department recognizes the need to establish systematic processes to assess whether effectiveness evaluations are addressing needs. The Evaluation Services Directorate is already developing a performance measurement framework and management strategy to identify clear performance expectations and standards for the Department’s evaluation function. A systematic process to collect, analyze, and report on performance data and client satisfaction will be implemented in order to identify areas for improvement. Based on data collected during the first year of implementation (the 2010–11 fiscal year), periodic reporting to the Departmental Evaluation Committee should begin by the 2011–12 fiscal year. This performance information will complement data already reported in the context of the annual management accountability framework assessments conducted by the Treasury Board of Canada Secretariat.
Citizenship and Immigration Canada’s response. Agreed. The Department will assess the need for systematic processes in addition to the Management Accountability Framework assessment process, the oversight of the Departmental Evaluation Committee, requirements for annual reporting, and the Department’s evaluation process, which includes several steps of consultation and feedback from the Department’s branches.
Fisheries and Oceans Canada’s response. Agreed. The recommendation echoes the intent of the 2009 Policy on Evaluation. As such, actions are already under way within the Department to implement a systematic process to determine whether effectiveness evaluations are meeting internal corporate needs and government-wide needs (i.e., a strategic review) and to act on areas identified for improvement. Further work in this area will be completed in the context of guidance provided by the Treasury Board of Canada Secretariat.
Human Resources and Skills Development Canada’s response. Agreed. The Department currently employs a variety of systematic processes to ensure the quality and relevance of its evaluations for both internal and government-wide needs. External peer reviewers and evaluation advisory committees are a mandatory element of the evaluation work to ensure that evaluations are meeting information needs. Further work in systematically determining whether evaluations are meeting government and senior management needs will be developed, building upon the current annual Management Accountability Framework assessment process led by the Treasury Board of Canada Secretariat. Areas for improvement will be identified and reported to the Departmental Evaluation Committee.
The Treasury Board of Canada Secretariat’s response. Agreed. The Secretariat agrees that it should assist departments as necessary in implementing processes to determine whether evaluations are meeting government-wide needs, and should provide any additional support to departments that it considers necessary.
The new Treasury Board Policy on Evaluation, which came into effect on 1 April 2009, includes specific requirements to enhance evaluation coverage and examination of program effectiveness as well as performance information to support evaluations. The policy calls on the Secretary of the Treasury Board to provide functional leadership for evaluation across government, including monitoring and reporting annually to the Treasury Board on the health of the evaluation function.
The Secretariat currently carries out a large portion of this work through the annual Management Accountability Framework assessment process, which the Office of the Auditor General acknowledges was not covered by the current audit. The Secretariat communicates recommended areas for improvement in evaluation through the assessment reports that it sends to the deputy heads of departments and agencies, who are responsible for the evaluation function in their respective organizations.
Oversight and support
1.82 In developing tools, guidance, and support for departments, the Treasury Board of Canada Secretariat should regularly identify gaps that it needs to act on, develop plans to address these gaps, and act on these plans. (1.74–1.81)
The Treasury Board of Canada Secretariat’s response. Agreed. The Secretariat is presently developing guidance for departments and agencies to support the implementation of the new Treasury Board Policy on Evaluation (April 2009). Guidance in a number of key areas, including departmental evaluation planning and performance measurement strategies, is expected to be available to departments and agencies by 31 March 2010. Further guidance will be issued over the course of the 2010–11 fiscal year.
During the past two years, the Treasury Board of Canada Secretariat has issued guidance on the development of performance measurement frameworks required under the 2005 Policy on Management, Resources and Results Structures, which was not covered by the current audit. In addition, through its annual Management Accountability Framework assessments, the Secretariat provides advice and support to departments on the quality and use of evaluation as well as on the quality of performance measurement frameworks associated with departmental Management, Resources and Results Structures.
The Secretariat continually consults with departments and agencies on their needs and monitors policy implementation to identify weaknesses within the government-wide evaluation function. Where the Secretariat determines a need to develop further tools, guidance, or other supports, it includes these activities in its business plans.
1.88 The Treasury Board of Canada Secretariat should ensure that it allocates sufficient resources to tasks that require evaluation expertise. (1.83–1.87)
The Treasury Board of Canada Secretariat’s response. Agreed. In renewing the Policy on Evaluation, the Government has strengthened its commitment to evaluating the value for money of federal programs and reaffirmed the Secretariat’s role of leading the evaluation function. In performing the functional leadership role established in the new policy, the Treasury Board of Canada Secretariat will ensure that the resources necessary for performing this role are considered and that sufficient resources are allocated at the Secretariat to tasks requiring evaluation expertise.
1.93 The Treasury Board of Canada Secretariat should help departments prepare to implement the new coverage requirements. During the transition period, the Secretariat should provide advice and guidance for effectiveness evaluation, focusing on programs where such evaluation can be put to best use. (1.89–1.92)
The Treasury Board of Canada Secretariat’s response. Agreed. The Secretariat is planning to issue written guidance for making risk-based choices for evaluation coverage to support departments during the transition period. This guidance is expected by 31 March 2010.
Throughout the transition period, the Secretariat will also help departments prepare to implement the new coverage requirements that come into effect after 31 March 2013. The Secretariat will provide leadership in the development and sharing of effective evaluation practices across departments, as well as support capacity-building initiatives in the evaluation function government-wide.

Definitions:
Direct program spending—Includes operating and capital spending and grants and contributions, but does not include public debt charges and major transfers to persons or other levels of government.
Reference level—The amount of funding that the Treasury Board has approved for departments and agencies to carry out approved policies and programs for each year of the planning period.
Effectiveness evaluation—An assessment of the extent to which programs are relevant, produce impacts and outcomes, achieve their objectives, and are cost-effective. Other types of evaluation examine program implementation and management.
Non-statutory grant and contribution programs—Programs whose spending authority is provided in an appropriation act that is voted on in Parliament in the Main and Supplementary Estimates, as opposed to programs whose spending authority comes from other legislation.