Tuesday, November 3, 2009

2009 Fall Report of the Auditor General of Canada.

Chapter 1—Evaluating the Effectiveness of Programs
Main Points
Introduction
Focus of the audit
Observations and Recommendations
Meeting needs for effectiveness evaluation
The need for effectiveness evaluations continues to grow
Departments have systematic processes to plan effectiveness evaluation
Expenditure management needs were not adequately considered
Departments complete most planned evaluations
Evaluation coverage of programs is limited
Evaluations do not adequately assess effectiveness
Some quality assurance processes are in place
Departments are not systematically identifying priorities for improvement
Capacity for effectiveness evaluation
Funding increases have enhanced departments’ capacity for evaluation
Shortage of experienced evaluators continues
Evaluator competencies are not adequately defined
Other responsibilities of evaluation units put pressure on capacity
Departments use contractors extensively
Oversight and support
The Secretariat has identified a number of improvements in evaluation
The Secretariat carried out extensive monitoring of the evaluation function
Sustained support for effectiveness evaluation is lacking
Oversight and support require experienced staff
Care is needed in the implementation of the new coverage requirements
Conclusion
About the Audit
Appendix—List of recommendations
Exhibits:
1.1—The history of federal program evaluation reveals many audit observations that are critical of government initiatives
1.2—Government-wide requirements for effectiveness evaluation appear in both statutes and policies
1.3—Most of the planned evaluations were completed
1.4—A low proportion of total program expenses were evaluated between the 2004–05 and 2007–08 fiscal years
1.5—Some effectiveness evaluations have insufficient performance information
1.6—Evaluation unit funding increased in most departments
1.7—The estimated number of evaluation unit professional staff increased in each department between the 2004–05 and 2008–09 fiscal years
1.8—Evaluations conducted by contractors in whole or in part in the audited departments
Main Points
What we examined
Effectiveness evaluation is an established tool that uses systematic research methods drawn from many different disciplines to assess how well a program is achieving its objectives. When done well, it allows departments to develop evidence to determine how well their programs meet expectations, and whether they are cost-effective. Over the past 40 years, the federal government has made efforts to embed the evaluation of program effectiveness as an essential part of its support for program evaluation.
The 2006 Federal Accountability Act enacted into law a requirement that all grant and contribution programs be evaluated every five years. The new Policy on Evaluation that came into effect in 2009 extends the requirement for evaluation to cover all direct program spending over a five-year cycle.
We examined how evaluation units in six departments identify and respond to the various needs for effectiveness evaluations. We also looked at whether they have built the required capacity to respond to those needs. In addition, we looked at the oversight and support role of the Treasury Board of Canada Secretariat in monitoring and improving the evaluation function in the government, specifically with respect to effectiveness evaluations. The period covered by our audit was 2004 to 2009.
Why it’s important
Governments are under continual pressure to spend money on a range of programs designed to serve particular needs of society. While many factors affect the decisions that governments must ultimately make about programs, effectiveness evaluations can aid their decision making by providing objective and reliable information that helps identify programs that are working as intended; those that are no longer needed; and those that are not accomplishing the desired objectives and could be replaced by programs that will achieve the objectives more cost-effectively. In addition, effectiveness evaluation is expected to serve the information needs of parliamentarians.
One of the most important benefits of effectiveness evaluation is to help departments and agencies improve the extent to which their programs achieve their objectives. Departments need to demonstrate to Parliament and taxpayers that they are delivering results for Canadians with the money entrusted to them.
What we found
The six departments we examined followed systematic processes to plan their effectiveness evaluations and completed most of the evaluations they had planned. However, over the audited period, each department’s evaluations covered a relatively low proportion of its total program expenses—between five and thirteen percent annually across the six departments.
In effect, the rate of coverage was even lower because many of the effectiveness evaluations we reviewed did not adequately assess program effectiveness. Often departments have not gathered the performance information needed to evaluate whether programs are effective. Of the 23 evaluation reports we reviewed, 17 noted that the analysis was hampered by inadequate data, limiting the assessment of program effectiveness.
The departments we examined told us that it remains a challenge to find experienced evaluators, and they have made extensive use of contractors to meet requirements. Departments expressed concern about their capacity to start in 2013 to evaluate all direct program spending, as required by the 2009 Policy on Evaluation. To ensure full coverage (which includes grants and contributions), they will have to evaluate an average of 20 percent of their direct program spending each year of the five-year cycle.
The Treasury Board of Canada Secretariat introduced initiatives over the past few years to address the need for improvements in evaluation across the government. However, it did not provide sustained support for effectiveness evaluation. In particular, it made little progress on developing tools to assist departments with the long-standing problem of a lack of sufficient data for evaluating program effectiveness.
With the exception of Environment Canada, which has processes in place to identify needed improvements, the audited departments do not regularly identify and address weaknesses in effectiveness evaluation.
The Secretariat and the departments have responded. The Treasury Board of Canada Secretariat and the departments agree with our recommendations. Their detailed responses follow each recommendation throughout the chapter.
Introduction
1.1 The 2009 Treasury Board Policy on Evaluation defines evaluation as “the systematic collection and analysis of evidence on the outcomes of programs to make judgments about their relevance, performance and alternative ways to deliver programs or to achieve the same results.” The objective of this policy is to create a comprehensive and reliable base of evaluation evidence that is used to support policy and program improvement, expenditure management, Cabinet decision making, and public reporting.
1.2 Program evaluation is recognized as a key source of information on program effectiveness. This information is essential if senior officials are to base program and funding decisions on evidence of program effectiveness. Potential key users of evaluation findings include senior department officials, the managers of programs being evaluated, central agencies, and parliamentarians. The findings of program evaluation may also be of interest to program stakeholders and to the public.
1.3 Program evaluation has been practised in the federal government, in one form or another, for close to 40 years. The Office of the Auditor General examined program evaluation in 1978, 1983, 1986, 1993, 1996, and 2000. The history of federal program evaluation from 1970 to 2000 reveals repeated initiatives at the centre of government to establish and support the function, and often critical observations by the Office of the Auditor General on the success of these efforts (Exhibit 1.1).
Exhibit 1.1—The history of federal program evaluation reveals many audit observations that are critical of government initiatives
1970: The Treasury Board of Canada Secretariat established a Planning Branch to carry out interdepartmental studies of policy programs.
1977: The Treasury Board issued the first evaluation policy, which required departments to subject each program to effectiveness evaluation on a regular basis.
1978: The Auditor General’s Study of Procedures in Cost-Effectiveness reviewed 23 programs in 18 departments, and found few successful attempts to evaluate their effectiveness.
1981: The Office of the Comptroller General developed a policy framework to guide and structure departmental evaluation functions, and established a team of liaison officers to give guidance and advice to departments in its implementation.
1983: The Auditor General’s report found that although very real progress had been made in developing the program evaluation function since its 1978 study, few high-quality evaluations were being done.
1986: The Program Review carried out by the Nielsen Task Force found that the information received from program evaluations was “generally useless and inadequate.”
1986: The Auditor General’s follow-up to the 1983 audit concluded that there had been an improvement in the quality of methodology and reporting of program evaluation, but that problems regarding balance and full disclosure of limitations in methods still existed.
1993: The Auditor General found that “high expectations and great potential” had been only partly fulfilled: “The system’s results are often disappointing. Program evaluations frequently were not timely or relevant. Many large-expenditure programs had not been evaluated under the policy ...”
1996: The Auditor General’s follow-up to the 1993 audit found that little progress had been made to address effectiveness issues.
2000: The Auditor General found that the evaluation function had actually regressed, due in part to reductions in funding that undermined capacity.
1.4 The advent of results-based management. In 2000, the Treasury Board of Canada Secretariat issued Results for Canadians: A Management Framework for the Government of Canada. It outlined a results-based management approach, which states that
departments and agencies need to implement an information regime that measures, evaluates and reports on key aspects of programs and their performance in core areas; holds managers accountable for achieving results; and ensures unbiased analysis, showing both good and bad performance.
1.5 As we noted in our report from the same year (the December 2000 Report of the Auditor General of Canada, Chapter 20—Managing Departments for Results and Managing Horizontal Issues for Results), evaluation has an important role to play in managing for results. It can provide important information on program performance that is not gathered by ongoing monitoring systems, and can help managers understand why programs are working or not.
1.6 In 2001, the Treasury Board issued an evaluation policy, intended to reflect the results-based management philosophy expressed in Results for Canadians. The objective of this policy was to ensure that the government has information on the performance of its policies, programs, and initiatives that is timely, strategically focused, objective, and evidence-based. The Policy described evaluation as a management tool for the periodic assessment of a program’s effectiveness in achieving objectives, of its impact and relevance, and of alternative ways to achieve expected results (cost effectiveness).
1.7 In 2003, the Treasury Board of Canada Secretariat introduced the Management Accountability Framework to assess management performance in departments, including program evaluation. Subsequent guidance, issued in 2007 to support the 2005 Policy on Management, Resources and Results Structures (MRRS), encouraged department heads of evaluation to provide advice on the Performance Measurement Framework embedded in their organization’s MRRS. The Secretariat views the Framework and the policy requirements as an integrated approach to measuring management performance in departments.
1.8 Focus on expenditure management. More recent developments have placed renewed emphasis on the expenditure management role of program evaluation. As described in its 2007 and 2008 Budget documents, the federal government has introduced a new expenditure management system, a key pillar of which is the ongoing assessment of all direct program spending, known as strategic reviews. Strategic reviews are intended to ensure that programs are effective and efficient, meet the priorities of Canadians, and are aligned with core federal responsibilities. According to the Treasury Board of Canada Secretariat, program evaluation is a key source of information on program effectiveness in support of these reviews.
1.9 The 2009 Policy on Evaluation. The most recent development in federal program evaluation is a new policy that was approved in April 2009. In addition to supporting the renewal of the Expenditure Management System by improving the information base for strategic reviews of departmental spending, the new policy requires that evaluations cover all direct program spending over a five-year cycle. Each year, departments will have to evaluate an average of 20 percent of their direct program spending to ensure full evaluation coverage, which includes grants and contributions programs. Larger departments and agencies must implement this requirement from 1 April 2013. Following a four-year transition period, the first cycle covers the 2013–14 to 2017–18 fiscal years. Smaller organizations, with fewer than 500 full-time equivalents and a reference level of less than $300 million, are subject to the lesser requirement of ensuring coverage as appropriate. Accompanying the new policy is a directive on evaluation that defines the roles and responsibilities of department officials involved in evaluation, and standards that set minimum requirements for the quality, neutrality, and usefulness of evaluations.
Focus of the audit
1.10 This was a government-wide audit that examined program evaluation in relation to the measurement of program effectiveness—effectiveness evaluation. The overall objective of this audit was to determine whether selected departments and the Treasury Board of Canada Secretariat are meeting the needs for effectiveness evaluation and are identifying and making improvements in effectiveness evaluation.
1.11 The audit covered a five-year period, from the 2004–05 to 2008–09 fiscal years. Focusing on this period allowed us to examine the impact of Secretariat efforts, through a series of its studies, to understand how well the 2001 evaluation policy was working up to the 2004–05 fiscal year. We also examined selected departments’ efforts to address any problems identified over the same period. Because the 2001 evaluation policy was replaced in April 2009, the audit did not focus directly on compliance with either policy.
1.12 Detailed examination work was carried out in the Treasury Board of Canada Secretariat and in six departments:
Agriculture and Agri-Food Canada,
Canadian Heritage,
Citizenship and Immigration Canada,
Environment Canada,
Fisheries and Oceans Canada, and
Human Resources and Skills Development Canada.
1.13 More details on the audit objectives, scope, approach, and criteria are in About the Audit at the end of this chapter.
Observations and Recommendations
Meeting needs for effectiveness evaluation
1.14 Well-managed organizations operate according to a management cycle for continuous improvement consisting of planning, doing, checking, and improving. We looked for evidence that the audited departments planned their effectiveness evaluations to meet identified needs, executed these plans, checked to see that needs were met, and made improvements where required. We followed a similar approach in relation to the Treasury Board of Canada Secretariat’s oversight and support role (paragraphs 1.63–1.93).
1.15 We found that, in light of their limited evaluation coverage and reliance on insufficient performance information, departments were not able to demonstrate that they are fully meeting needs for effectiveness evaluation.
The need for effectiveness evaluations continues to grow
1.16 When departments plan their effectiveness evaluation work, they must first identify the needs for these evaluations. Heads of evaluation in the six departments told us that they based those needs on both government-wide requirements and internal corporate risk-based needs.
1.17 The government views evaluation as the primary source of neutral and systematic information on the ongoing relevance and performance of policies and programs. It also expects evaluation to show alternative ways of achieving expected results and program design improvements. Government-wide requirements for effectiveness evaluations are defined both in statute and in policy (Exhibit 1.2).
Exhibit 1.2—Government-wide requirements for effectiveness evaluation appear in both statutes and policies
The 2000 Policy on Transfer Payments (revised in 2008) required departments to review and report on the effectiveness of transfer payments when requesting renewal of program terms and conditions. The Financial Administration Act (amended in 2006) requires departments to conduct a review every five years of the relevance and effectiveness of each ongoing program of grants or contributions for which they are responsible.
The 2001 Treasury Board Evaluation Policy advised that evaluations should consider the relevance, success, and cost-effectiveness of programs.
Approved program funding may be subject to specific conditions in Treasury Board submissions and memoranda to Cabinet requiring the department to evaluate the program and report back to the Treasury Board or Cabinet before a set deadline.
In the 2007 Budget, the Government announced its new Expenditure Management System (EMS), including enhanced requirements for evaluation.
As of April 2007, evaluation of effectiveness is required for selected new federal regulations.
The 2009 Treasury Board Policy on Evaluation requires that all evaluations that are intended to count toward coverage requirements address value for money by including clear and valid conclusions about the relevance and performance of programs.
1.18 According to the Treasury Board of Canada Secretariat, a critical need for effectiveness evaluation arises from the strategic review process in the Expenditure Management System. Organizations are required to conduct strategic reviews of direct program spending every four years to ensure that the government is directing its resources to the highest priority requirements.
1.19 In addition to these government-wide requirements, heads of evaluation identified internal corporate risk-based needs for effectiveness evaluations that respond to departmental priorities and to the priorities of government. These priorities are typically defined through consultation with senior departmental management.
1.20 Between 2004 and 2009, the need for effectiveness evaluations grew. This was largely due to measures the government brought forward under the Federal Accountability Act in 2006. This Act introduced an amendment to the Financial Administration Act requiring that, every five years, all ongoing non-statutory grant and contribution programs be evaluated or reviewed for relevance and effectiveness.
Departments have systematic processes to plan effectiveness evaluation
1.21 In light of the many needs for effectiveness evaluation and their potential importance in informing decision making, it is critical that departments consider these needs when making program evaluation plans.
1.22 We expected program evaluation plans in the six departments to consider both government-wide requirements and corporate risk-based needs for effectiveness evaluation. We found that all departments had developed risk-based plans during the audited period (some started in the 2004–05 fiscal year and others started in 2005–06). Most of these plans are produced annually, and most of the recent ones cover a five-year period. We found evidence that, to identify corporate priorities, each department consulted its senior management and its various program areas when developing its evaluation plans. We also found that all departments—except Agriculture and Agri-Food Canada and Citizenship and Immigration Canada—had processes in place to identify and track requirements for grant and contribution renewals in Treasury Board submissions.
Expenditure management needs were not adequately considered
1.23 Officials in the six departments told us that the need for strategic review in the Expenditure Management System had not been a key consideration during previous years’ evaluation planning. However, they told us that this need is now being considered when preparing their evaluation plans. To date, of the six departments audited, only Canadian Heritage (in 2007) and Agriculture and Agri-Food Canada (in 2008) have completed a strategic review. Officials at Fisheries and Oceans Canada and at Canadian Heritage informed us that they plan to undertake evaluations in preparation for the next strategic review.
Departments complete most planned evaluations
1.24 To meet the needs for effectiveness evaluations, we expected the six departments to have conducted the evaluations that they had identified in their plans. We found that most of the evaluations planned by the six departments were carried out (Exhibit 1.3). Department officials told us that some evaluations were not completed for a variety of reasons, including program cancellations and redesign, limited data availability, and changes in evaluation capacity and in the needs of internal clients (for example, senior management and program managers). Officials at Human Resources and Skills Development Canada informed us that, because many of the evaluations not completed were multi-year evaluations with planned completion dates beyond our audit period, they projected a higher completion rate.
Exhibit 1.3—Most of the planned evaluations were completed
Department: number of planned evaluations*; number and percentage completed**
Canadian Heritage: 48; 42 (88%)
Fisheries and Oceans Canada: 30; 24 (80%)
Environment Canada: 29; 23 (79%)
Citizenship and Immigration Canada: 21; 15 (71%)
Agriculture and Agri-Food Canada: 25; 14 (56%)
Human Resources and Skills Development Canada: 64; 34 (53%)
* Based on departmental evaluation plans between the 2004–05 and 2007–08 fiscal years.
** Based on listings, provided by departments, of evaluations in their 2004–05 to 2007–08 plans that were completed by 30 April 2009.
Evaluation coverage of programs is limited
1.25 While the 2009 Policy on Evaluation requires departments to evaluate all direct program spending, there was no such requirement during the period of our audit. The Treasury Board of Canada Secretariat required departments to conduct evaluations based on risk. We noted that the Secretariat encouraged departments to embed program evaluations into program management and ensure adequate evaluation coverage of programs. The six departments followed systematic processes when they planned evaluations, including effectiveness evaluations. However, within the departments we found that a low proportion of total program expenses were evaluated.
1.26 We examined the data that the six departments provided to the Treasury Board of Canada Secretariat as part of the Annual Capacity Assessment Survey, including information on the annual expenses of evaluated programs. We compared these program expenses to total departmental program expenses for that year. We found wide variation across the six departments in their evaluation coverage of their program expenses (Exhibit 1.4).
Exhibit 1.4—A low proportion of total program expenses were evaluated between the 2004–05 and 2007–08 fiscal years
Department: estimated average annual percentage of program expenses evaluated
Canadian Heritage: 13%
Agriculture and Agri-Food Canada*: 11%
Environment Canada: 9%
Fisheries and Oceans Canada: 8%
Citizenship and Immigration Canada: 6%
Human Resources and Skills Development Canada*: 5%
* Agriculture and Agri-Food Canada and Human Resources and Skills Development Canada each carried out evaluation work that covered several programs, which increased their coverage.
Sources: Treasury Board of Canada Secretariat Annual Capacity Assessment Survey, descriptive information provided by departments, and Public Accounts of Canada
1.27 Our calculations showed that Canadian Heritage had the highest percentage of program expenses that were evaluated compared with the other audited departments. Notably, Canadian Heritage also has the highest percentage of expenses on grant and contribution programs, which amounted to 80 percent of its total expenses in the 2007–08 fiscal year. According to the requirement under the Financial Administration Act, departments have until the 2011–12 fiscal year to evaluate all of their ongoing, non-statutory grant and contribution programs that were in existence in December 2006.
1.28 As well, the requirements for renewals of grants and contributions occur in a concentrated period. For example, officials at Citizenship and Immigration Canada told us that it is challenging for them to meet their evaluation requirements because the majority of grant and contribution programs come up for renewal at the same time, creating a spike in the evaluation unit’s workload. However, the Department indicated that these requirements were met over the period of our audit. We noted that, in the 2009–10 fiscal year, its evaluation unit plans to evaluate about 90 percent of its grant and contribution expenses, leaving little capacity to respond to other needs.
Evaluations do not adequately assess effectiveness
1.29 The quality of evaluation methods is a long-standing concern in the federal government. In 2005, the Treasury Board of Canada Secretariat reviewed the quality of evaluation reports to determine whether they had improved. The review noted that, while evaluation reports had improved in quality since 2002, there remained a “pressing need for further improvement.”
1.30 Evaluation practitioners employ a range of methods. There are no universally accepted minimum standards for effectiveness evaluation. However, both the 2001 and 2009 Treasury Board policies on evaluation refer to the need to collect reliable data to support evaluation findings. The 2001 policy refers to objective data collection and analysis, while the 2009 Standard on Evaluation requires that evaluations be based on “multiple lines of evidence, including evidence produced from both quantitative and qualitative analysis.”
1.31 Both policies recognize that the level of methodological rigour should reflect the intended use of the findings. In addition, in programs with low materiality, less rigorous methods may be appropriate.
1.32 In 2005, the Treasury Board of Canada Secretariat found that evaluations tended to rely mainly on information from stakeholder interviews, file and document reviews, case studies, and surveys. The Secretariat also found that information from ongoing program performance measurement tended to be unavailable or insufficient to effectively support evaluations. According to the Secretariat, this was because few programs had reliable systems for collecting and reporting this information. The Secretariat’s review concluded that the lack of program performance data reduces the overall quality of evaluation reports.
1.33 When data is not available, evaluators may have to collect it themselves, or find alternative sources for the information, and their ability to apply the appropriate methods may be constrained. Using a variety of methods, both quantitative and qualitative, is important to ensure that evaluation methods generate enough reliable evidence. Unreliable data reduces confidence in the usefulness of evaluation measurements.
1.34 We reviewed a sample of 23 evaluations identified by the six departments as addressing effectiveness to determine the types of data collected to support evaluation findings. We found that, of the 23 evaluations, 17 explicitly stated that program performance information was lacking because data was unavailable or was not sufficiently reliable. As a result, in 9 of 17 cases, the evaluations indicated that they were limited in their assessment of program success and effectiveness. Furthermore, in 6 of the 17 cases, the assessment was primarily based on interviews with program staff and stakeholders (Exhibit 1.5).
Exhibit 1.5—Some effectiveness evaluations have insufficient performance information
Agriculture and Agri-Food Canada. The Department’s Prairie Grain Roads Program operated between April 2001 and March 2006 with an average annual budget of $35 million. The program was meant to assist Prairie provinces to address increased pressure on rural roads by improving roads, increasing truck tonnage capacity, and increasing safety for road users.
Completed in 2006, this $103,600 evaluation was to determine the program’s results and impacts, adequacy of design, continued relevance, and cost-effectiveness. Achieving these objectives was hindered by the program’s lack of performance measures to determine its effectiveness. It was not possible to know, for example, whether road safety had improved as a result of the program.
Claims of improved safety were primarily based on interviews and satisfaction ratings from successful program applicants. If performance measures had been implemented from the outset of the program, more complete conclusions on the effectiveness of this program could have been reached.
Citizenship and Immigration Canada. The Department’s Private Sponsorship of Refugees Program, which began in 1978, is intended to assist refugees to settle and build new lives in Canada through sponsorship by Canadian citizens. The annual budget of the program is approximately $5 million.
The most recent evaluation of this program was done in 2007 at a cost of $268,000. The evaluation was intended to examine the program’s continued relevance, success in achieving outcomes as identified in its results-based management accountability framework, and its cost-effectiveness.
This evaluation used data originating from a wide range of sources, including statistical information from both the department and Statistics Canada. As a result, the evaluation was able to more objectively document the extent to which refugees were finding employment, accommodation, and essential services.
Fisheries and Oceans Canada. In 2006, the Department completed an evaluation of its Program for Sustainable Aquaculture. The program was launched in 2000 and had an annual budget of $15 million in the 2008–09 fiscal year. The objective of this program is to foster growth of a sustainable and competitive aquaculture industry and to increase public confidence in aquaculture.
The stated purpose of this $98,600 evaluation was to examine the relevance, success, and cost-effectiveness of the program. However, the evaluators acknowledged that there was no ongoing monitoring system to track program results. For example, because there was limited information on the impact of aquaculture on human health, it was not possible to determine whether related program objectives were achieved.
Without this information, many of the evaluation’s conclusions were based on interviews with federal officials and industry representatives, as well as on document and file reviews. As a result, this evaluation did not adequately address the effectiveness of the program.
1.35 The heads of evaluation in all six departments confirmed that performance information they needed to evaluate whether programs are cost-effective and are achieving expected results was often insufficient. The development and implementation of ongoing performance measures is the responsibility of program managers, not departmental evaluation units.
1.36 Due to the weaknesses in performance information and the need to apply appropriate evaluation methods, the actual coverage of departmental programs by effectiveness evaluations is even more limited than shown in this audit (Exhibit 1.4). Based on our sample, three quarters of the evaluations were hampered in their assessment of program effectiveness because of inadequate data.
1.37 Recommendation. Agriculture and Agri-Food Canada, Canadian Heritage, Citizenship and Immigration Canada, Environment Canada, Fisheries and Oceans Canada, and Human Resources and Skills Development Canada should develop and implement action plans to ensure that ongoing program performance information is collected to support effectiveness evaluation.
Agriculture and Agri-Food Canada’s response. Agreed. The Department agrees that the systematic collection of program performance data by managers is necessary to report on program performance and to support effectiveness evaluations.
As required by the Treasury Board Policy on Transfer Payments, a performance measurement strategy for ongoing management of transfer payment programs, including performance measures and indicators and a data collection strategy, is developed for each new transfer payment program.
The Department’s evaluation function reviews program performance measurement strategies as they are developed to ensure that outcomes are defined, measurable, and attributable and that the strategies, if implemented, are sufficient to support future evaluation work.
Beginning in the 2009–10 fiscal year, the Department will conduct annual state of performance measurement reviews. The first such review will assess performance measurement practices at the Department and the adequacy of data collected for programs soon to be evaluated. An action plan to address its recommendations will be developed and its implementation monitored with a view to strengthening the Department’s practices in this area.
Canadian Heritage’s response. Agreed. The Department is executing its Action Plan for Implementation of the Management, Resources and Results Structures (MRRS) Policy to ensure that program staff are able to fulfill their responsibility for developing and maintaining performance measurement strategies. Action plan measures include
the provision of information sessions and workshops,
the establishment of indicators and corresponding targets,
the development of robust methodologies to demonstrate outcomes,
the establishment of a centre of expertise on performance measurement within the Department,
the design and implementation of adequate tools and guidelines,
the establishment of relevant information technology and systems, and
the regular analysis and reporting of collected data.
This action plan is expected to be completed by the end of the 2011–12 fiscal year.
The Department’s Office of the Chief Audit and Evaluation Executive is continually providing advice and support to Department managers in their efforts to implement this action plan. In line with the Transfer Payment Policy, this office is also providing timely advice and support on program design and performance measurement strategies through the review of official approval documents for the creation and renewal of new or existing programs. Finally, as required under the new Evaluation Policy, the office will be submitting, in the 2010–11 fiscal year, its first annual report to the Departmental Evaluation Committee on the state of performance measurement in the Department.
Citizenship and Immigration Canada’s response. Agreed. The Department recognizes the value of ongoing performance information for evaluation and will continue to support related departmental activities. The Department will develop an action plan that includes the renewal of the comprehensive departmental performance measurement framework and program activity architecture, and furthers the integration of the Framework into the business planning process.
Environment Canada’s response. Agreed. The Department accepts this recommendation, which echoes the intent of the 2009 Evaluation Policy related to performance information. As such, actions are already under way within Environment Canada to implement this recommendation. These include ongoing monitoring of the implementation of management responses to previous evaluations that have identified concerns with performance information, and the development and implementation of a strategy to inform all department managers of the Evaluation Policy requirements pertaining to performance measurement.
In addition, the Department’s evaluation plan will be expanded to include a monitoring component to verify, within available resources, the status of performance data collected in the department and whether sufficient performance information will be available to support upcoming evaluations. This monitoring component will be included in the 2010–15 Evaluation Plan and will be updated annually thereafter.
Further, for the 2010–11 fiscal year, the Department’s performance measurement framework has been linked to the Department’s program activity architecture, in that performance measures have been identified for all programs.
Fisheries and Oceans Canada’s response. Agreed. The Department’s performance measurement framework links its core indicators to the departmental program activity architecture (PAA), thus identifying performance measures for all program activities and sub-activities. Each fiscal year, the Department conducts an analysis of the state of performance measurement in the Department and provides an annual report to the Departmental Evaluation Committee. In addition, the Department will develop and implement an action plan to ensure that ongoing program performance information is collected to support effectiveness evaluation by the end of August 2010.
Human Resources and Skills Development Canada’s response. Agreed. The Department accepts this recommendation that echoes the intent of various Treasury Board of Canada Secretariat policies, including the 2009 Evaluation Policy. As such, the Department already gathers and monitors ongoing performance information to support effectiveness evaluation, including
monitoring implementation of management responses to previous evaluations that have identified data issues for current programs;
undertaking early evaluative work in advance of the formal initiation of effectiveness evaluations as part of the evaluation planning process, to review the state of performance data and logic models; and
monitoring the implementation of new programs (for example, Economic Action Plan initiatives) to ensure the necessary administrative and performance data are available to support future evaluation activities.
The Department is also undertaking a comprehensive review, refinement, and validation of its Performance Measurement Framework (PMF) to make it more robust and comprehensive and to support ongoing planning, monitoring, and managing for results; the Evaluation Directorate has been actively involved in this work. The progress made on the departmental PMF will support both performance monitoring and the evaluation of relevance and effectiveness.
Some quality assurance processes are in place
1.38 Quality assurance processes are designed to ensure that quality requirements are being met. In the case of effectiveness evaluation, quality assurance helps to ensure that reports meet defined standards and provide decision makers with reliable and useful evaluation findings.
1.39 We expected the six departments to demonstrate that they had developed systematic quality assurance processes for their effectiveness evaluations. We found that all six departments had quality assurance processes in place. While these processes were not identical in design, we identified a number of common elements: internal evaluation standards, internal review, and external expert review. All six departments did internal reviews and all but two (Agriculture and Agri-Food Canada and Fisheries and Oceans Canada) used external experts, either routinely or on a selective basis, to review for quality assurance. Despite these quality assurance processes, we found that three quarters of the evaluations in our sample failed to adequately address effectiveness.
1.40 Quality assurance processes may also support the independence of the evaluation unit from program management. Through its 2001 and 2009 evaluation policies, the Treasury Board acknowledged that having a departmental evaluation committee approve evaluation plans and reports supports independence, because these plans and reports are reviewed objectively by department officials who are not program managers.
1.41 We examined the processes that the six departments followed to review and approve evaluation plans and reports. In all six departments, both plans and reports are approved by senior evaluation committees.
1.42 We note that, unlike evaluation committees, the departments’ audit committees are required to have external members. In our view, this is a good practice, because the knowledge and perspectives of practitioners from outside government are being considered. This practice could also have merit for evaluation committees. Under the 2009 Policy on Evaluation, evaluation committees are required to review evaluation plans and reports and recommend their approval by the deputy head. Committees with external members could play a stronger role in the continuous improvement of effectiveness evaluation.
1.43 Recommendation. Agriculture and Agri-Food Canada, Canadian Heritage, Citizenship and Immigration Canada, Environment Canada, Fisheries and Oceans Canada, and Human Resources and Skills Development Canada should consider the merits of including external experts on their departmental evaluation committees. The Treasury Board of Canada Secretariat should provide guidance to departments in this regard.
Agriculture and Agri-Food Canada’s response. Agreed. The Department has recently introduced a number of practices to ensure production of strong evaluation reports. They include seeking input from external experts on evaluations in progress. The Department will also consider including external members on its Departmental Evaluation Committee, in the context of guidance provided by Treasury Board of Canada Secretariat.
Canadian Heritage’s response. Agreed. At the Department, oversight of the evaluation function is provided by the Strategic Policy, Planning and Evaluation Committee (SPPEC), chaired by the Deputy Head. This structure provides opportunities for enhanced integration between the policy, planning, and evaluation functions of the Department. As well, the Department already brings key evaluation reports when necessary to its department audit committee for review and discussion. The Department will collaborate with the Treasury Board of Canada Secretariat’s Centre of Excellence in Evaluation to assess the value added of integrating the advice of external evaluation experts to inform the work of departmental evaluation committees.
Citizenship and Immigration Canada’s response. Agreed. The Department will assess the benefits of including an external evaluation expert on the departmental evaluation committee.
Environment Canada’s response. Agreed. The Department accepts the recommendation and will await guidance from the Treasury Board of Canada Secretariat on the inclusion of external members on departmental evaluation committees.
Fisheries and Oceans Canada’s response. Agreed. The Department will consider the merit of including external members on its department evaluation committee, in the context of guidance provided by the Treasury Board of Canada Secretariat.
Human Resources and Skills Development Canada’s response. Agreed. The Department has, in the past, included members from outside the Department on its departmental evaluation committee. The Department will reconsider the formal inclusion of external experts on the current Departmental Evaluation Committee and will look to the Treasury Board of Canada Secretariat for guidance on this part of the Evaluation Policy.
The Treasury Board of Canada Secretariat’s response. Agreed. The Secretariat agrees that it should provide guidance to departments on the possible merits of including external experts on their departmental evaluation committees. These actions will be completed by 31 March 2010.
Departments are not systematically identifying priorities for improvement
1.44 The Treasury Board of Canada Secretariat carries out annual assessments of department management using indicators that measure each of the 10 elements of the Management Accountability Framework. Among these elements are assessments of evaluation coverage, quality, and use. According to the Secretariat, these assessments help deputy heads identify priorities for management improvement. In addition to these assessments, we expected the six departments to have their own internal processes for determining whether they are meeting needs for effectiveness evaluations.
1.45 As noted earlier (paragraph 1.22), we found that all departments have consultation processes aimed at ensuring that their evaluation plans reflect corporate priorities. In addition, we did see other improvements in some departments, although these were not systematic in nature. Only Environment Canada has a formal process in place to systematically identify aspects of its evaluation practice that require improvement. Environment Canada does a number of things to help ensure that its evaluation practice is oriented toward continuous improvement. For example, client feedback is solicited through post-evaluation surveys that provide ongoing feedback on the quality and value of evaluations.
1.46 In addition, when Environment Canada’s evaluations are completed, lessons-learned exercises are often developed and shared with managers of similar programs and initiatives to enhance the overall utility of evaluation findings. The evaluation unit also developed a self-assessment framework for quality assurance that is based on a self-assessment guide for internal audit. The unit adapted some elements in the guide to apply them to evaluation and it also considered existing standards for evaluation. The results of these activities are communicated to the evaluation committee.
1.47 With the exception of Environment Canada, the audited departments could not demonstrate that they have internal processes in place to systematically identify areas for improvement in effectiveness evaluation over the 2004–05 to 2008–09 period. Such a process enables departments to ensure that effectiveness evaluations are following the management cycle for continuous improvement and becoming more useful for making key decisions.
1.48 Recommendation. Agriculture and Agri-Food Canada, Canadian Heritage, Citizenship and Immigration Canada, Fisheries and Oceans Canada, and Human Resources and Skills Development Canada should implement systematic processes to determine whether their effectiveness evaluations are meeting government-wide requirements and internal corporate needs, and act on areas identified for improvement. The Treasury Board of Canada Secretariat should monitor and provide any additional support it considers necessary for the implementation of these processes.
Agriculture and Agri-Food Canada’s response. Agreed. The Department accepts this recommendation and notes that in the past year, it has introduced a number of systematic processes, which together will ensure effectiveness evaluations address senior management information needs in a timely manner. They include annual consultations on evaluation priorities, requests for feedback on completed evaluations, and annual Head of Evaluation reports on the performance of the evaluation function.
Canadian Heritage’s response. Agreed. The Department recognizes the need to establish systematic processes to assess whether effectiveness evaluations are addressing needs. The Evaluation Services Directorate is already developing a performance measurement framework and management strategy to identify clear performance expectations and standards for the Department’s evaluation function. A systematic process to collect, analyze, and report on performance data and client satisfaction will be implemented in order to identify areas of improvements. Based on data collected during the first year of implementation (the 2010–11 fiscal year), periodic reporting to the Departmental Evaluation Committee should begin by the 2011–12 fiscal year. This performance information will complement data already reported in the context of the annual management accountability framework assessments conducted by the Treasury Board of Canada Secretariat.
Citizenship and Immigration Canada’s response. Agreed. The Department will assess the need for systematic processes in addition to the Management Accountability Framework assessment process, the oversight of the Departmental Evaluation Committee, requirements for annual reporting, and the Department’s evaluation process, which includes several steps of consultation and feedback from the Department’s branches.
Fisheries and Oceans Canada’s response. Agreed. The recommendation echoes the intent of the 2009 Policy on Evaluation. As such, actions are already under way within the Department to implement a systematic process to determine whether effectiveness evaluations are meeting internal corporate needs and government-wide needs (i.e., a strategic review) and to act on areas identified for improvement. Further work in this area will be completed in the context of guidance provided by the Treasury Board of Canada Secretariat.
Human Resources and Skills Development Canada’s response. Agreed. The Department currently employs a variety of systematic processes to ensure the quality and relevance of its evaluations for both internal and government-wide needs. External peer reviewers and evaluation advisory committees are a mandatory element of the evaluation work to ensure that evaluations are meeting information needs. Further work in systematically determining whether evaluations are meeting government and senior management needs will be developed, building upon the current annual Management Accountability Framework assessment process led by the Treasury Board of Canada Secretariat. Areas for improvement will be identified and reported to the Departmental Evaluation Committee.
The Treasury Board of Canada Secretariat’s response. Agreed. The Secretariat agrees that it should assist departments as necessary in their implementation of processes to determine whether evaluations are meeting government-wide needs and should provide support to departments that it considers necessary.
The new Treasury Board Policy on Evaluation, which came into effect on 1 April 2009, includes specific requirements to enhance evaluation coverage and examination of program effectiveness as well as performance information to support evaluations. The policy calls on the Secretary of the Treasury Board to provide functional leadership for evaluation across government, including monitoring and reporting annually to the Treasury Board on the health of the evaluation function.
The Secretariat currently carries out a large portion of this work through the annual Management Accountability Framework assessment process, which the Office of the Auditor General acknowledges was not covered by the current audit. The Secretariat communicates recommended areas for improvement in evaluation through the assessment reports that it sends to the deputy heads of departments and agencies, who are responsible for the evaluation function in their respective organizations.
Capacity for effectiveness evaluation
1.49 We define capacity as sufficient qualified evaluation staff and funding to meet needs for effectiveness evaluation. We examined staffing and funding for program evaluation in the six departments over the five-year period audited to determine whether they were able to hire enough staff and to address areas for improvement.
1.50 We found that, despite having increased funding and staffing, the audited departments found it challenging to hire enough qualified, experienced evaluation staff to meet needs for effectiveness evaluation, and they had not been able to regularly address areas for improvement.
Funding increases have enhanced departments’ capacity for evaluation
1.51 Evaluation unit funding increased over the audit period in all departments audited except Fisheries and Oceans Canada (Exhibit 1.6). This includes funding from the Treasury Board that is intended to enable departments to implement the 2001 Evaluation Policy, and funding provided following the enactment of the Federal Accountability Act for the evaluation of all ongoing grant and contribution programs.
Exhibit 1.6—Evaluation unit funding increased in most departments
Department                                        Unit funding in 2004–05    Unit funding in 2008–09
Agriculture and Agri-Food Canada                  $1,029,000                 $1,894,000
Canadian Heritage                                 $2,499,000                 $3,123,000
Citizenship and Immigration Canada                $650,000                   $1,987,000
Environment Canada                                $732,000                   $1,383,000
Fisheries and Oceans Canada                       $1,248,000                 $1,162,000
Human Resources and Skills Development Canada     $10,750,000                $13,924,000

Source: Data collection forms completed by departments
Shortage of experienced evaluators continues
1.52 The shortage of experienced program evaluators in the federal government is a long-standing concern. It has been noted in past Office of the Auditor General audits and in diagnostic studies by the Treasury Board of Canada Secretariat, and it was the subject of recent discussions within the federal evaluation community. A 2005 report by the Secretariat Centre of Excellence for Evaluation stated that “[t]he scarcity of evaluation personnel is probably the number one issue facing Heads of Evaluation.”
1.53 We found that during the period covered by this audit, the number of professional staff working in the evaluation units in the six departments had increased substantially (Exhibit 1.7). This trend was also evident in our analysis of descriptive data provided by the remaining large federal departments and agencies, which we did not audit.
Exhibit 1.7—The estimated number of evaluation unit professional staff increased in each department between the 2004–05 and 2008–09 fiscal years
Department                                        Professional staff in 2004–05*    Professional staff in 2008–09*
Agriculture and Agri-Food Canada                  6.0                               11.0
Canadian Heritage                                 8.1                               13.0
Citizenship and Immigration Canada                3.3                               12.5
Environment Canada                                4.0                               10.0
Fisheries and Oceans Canada                       4.0                               7.0
Human Resources and Skills Development Canada     44.5                              54.0
Other large departments                           176.3                             296.7

* Full-time equivalents
Source: Data collection forms completed by departments
1.54 According to officials in the six departments, despite these increases in both funding and staff, it remains a challenge to find experienced evaluators, particularly at the senior levels. In their view, the shortage of experienced evaluators has affected their ability to hire the people they need. For example, in one collective staffing process, the pool of experienced evaluators was depleted before the demand was met. They also indicated that the shortage of experienced evaluators has led to evaluators being hired away by other federal evaluation units.
Evaluator competencies are not adequately defined
1.55 In diagnostic studies carried out by the Secretariat in 2005, deputy heads of departments identified the shortage of qualified evaluators as contributing to the inconsistent quality of evaluations. Competency profiles can help address this gap by identifying training and development needs and by informing staffing efforts. We examined whether the six departments had addressed the challenge of finding the right people to do the work by developing competency profiles for their evaluation unit staff.
1.56 Officials told us that they had begun to develop competency profiles. However, these efforts were discontinued while the Secretariat undertook related work that recently resulted in a draft competency profile. The development of these profiles has been hindered by a lack of agreement about the competencies evaluators require.
Other responsibilities of evaluation units put pressure on capacity
1.57 According to the Secretariat, evaluation units have assumed a number of responsibilities in addition to the traditional evaluation studies, including
preparing evaluation planning reports and assessments,
developing results-based management accountability frameworks, and
providing advice and training to program managers on evaluation and performance measurement.
1.58 We interviewed officials in the six audited departments to find out how much time is spent conducting effectiveness evaluations. We found that Environment Canada is the only department that conducts formal time recording of evaluation unit tasks. It was also the only department that was able to provide us with detailed data, from which we determined that about 40 percent of its time was spent on tasks other than evaluation.
1.59 The estimates provided by the other departments indicated that they spent about the same amount of time on such tasks. While we recognize the potential value and importance of these other tasks, they nevertheless have an impact on the capacity of evaluation units to meet identified needs for effectiveness evaluation.
Departments use contractors extensively
1.60 Another diagnostic study reported that a sample of deputy heads thought that evaluation units could make a more significant contribution and could strengthen staff capacity by conducting more evaluations in-house. According to this study, deputy heads were concerned that contracting out evaluation studies prevents evaluation units from becoming their department’s subject matter experts, since much of what is learned remains with the contractor and not with the unit.
1.61 We looked at the evaluations conducted by the six audited departments during the audited period, between the 2004–05 and 2008–09 fiscal years, to determine whether they were conducted by contractors, in-house employees, or both. While about 90 percent of the evaluations were wholly or partially conducted by contractors (Exhibit 1.8), this varied among the audited departments. In Fisheries and Oceans Canada, for example, the figure was 37 percent, while in other departments the figures were close to 100 percent. This pattern was also evident in our analysis of descriptive data provided by the remaining large federal departments and agencies, which we did not audit.
Exhibit 1.8—Evaluations conducted by contractors in whole or in part in the audited departments
*Contractors’ data includes evaluations they conducted either in whole or in part.
1.62 Although officials recognized the value of developing in-house capacity, they also informed us that they required contractors for specific technical or subject matter expertise that was not feasible to maintain in-house.
Oversight and support
1.63 The Treasury Board of Canada Secretariat has described oversight as one of its central agency roles. Oversight includes policy development, monitoring, and reporting on management and budgetary performance within government. The Secretariat is responsible for the oversight of management policy development and the financial oversight of expenditure management. The Secretariat’s other central agency roles are leadership in setting the agenda for management, and helping departments and agencies improve their performance.
1.64 Under the 2001 Evaluation Policy, the Secretariat was required to provide central direction for the evaluation function by
establishing a Centre of Excellence for Evaluation to provide leadership, guidance, and support to the practice of evaluation;
using evaluation results, where appropriate, in decision making at the Centre;
setting standards; and
monitoring evaluation capacity in the government.
The Secretariat has identified a number of improvements in evaluation
1.65 We expected the Secretariat to support government-wide evaluation practices by identifying needed improvements and by determining and carrying out the actions required of it, to help ensure that departments and agencies have the tools they need to achieve the desired results.
1.66 In 2004, the Secretariat did an interim evaluation of the 2001 Evaluation Policy and found gaps in budget and human resources. This evaluation called for the Centre of Excellence for Evaluation to play a leadership role in helping the evaluation community, by
advocating the importance of evaluation to senior managers for decision making;
continuing to help with system-wide capacity building;
continuing to develop training, tools, and guides to support policy implementation; and
identifying best practices for the evaluation community.
1.67 Between the 2004–05 and 2006–07 fiscal years, the Secretariat also carried out several diagnostic studies that included interviews with deputy heads, clients, and stakeholders, as well as inquiries into the professionalism and the overall role of evaluation in the federal government. These studies identified necessary improvements.
1.68 We noted a number of specific initiatives related to evaluation and diagnostic studies, including new guidance and help given to departments for recruiting and training evaluators.
The Secretariat carried out extensive monitoring of the evaluation function
1.69 By the 2004–05 fiscal year, the Secretariat had developed several monitoring tools to collect information about the evaluation function across government. Secretariat officials identified similar monitoring activities in 2009:
annual capacity assessment survey;
Evaluation Information Resource Component—a database;
periodic in-depth review of the quality of evaluation reports;
ongoing reviews of evaluations, results-based management accountability frameworks, and departmental evaluation plans;
department and agency visits and interaction;
feedback to individual entities;
communication of best practices in evaluation methodology and reporting, and in managing the evaluation function;
review of the evaluation content of departmental Estimates documents (Report on Plans and Priorities and the Departmental Performance Report); and
Management Accountability Framework assessments.
1.70 The Secretariat describes the Management Accountability Framework as one of several tools it uses to assess management performance in departments. Although we did not audit the Framework, we noted that the Secretariat has used it in several ways, including changing the Framework itself, for example, by refining the assessment ratings. Program sector analysts and analysts at the Centre of Excellence for Evaluation also cited the knowledge gained from the Framework’s assessments as potentially helpful, as it could be used to improve guidance and tools, for example, in the development of the Standard on Evaluation that was issued with the 2009 Policy on Evaluation. However, we also noted that, because this knowledge was not always documented, its impact was not always clear.
1.71 The Secretariat’s first (and, to date, only) published report on the evaluation function—The Health of the Evaluation Function in the Government of Canada Report for Fiscal Year 2004–05—was based on the first of the annual capacity assessment surveys. These surveys are used to collect information on evaluation infrastructure, resources, production, and results and, for larger departments, to collect information on evaluation planning and resource requirements. The Secretariat has conducted the capacity assessment survey every year since the 2004–05 fiscal year.
1.72 The diagnostic studies, the Health of the Evaluation Function Report, and ongoing monitoring provided an analytical base for the first Government of Canada Evaluation Plan, 2005–06, which was developed in September 2005. Once again, this was the only one of its kind; no other such plan appeared in the period ending 31 March 2009.
1.73 The Secretariat carried out extensive monitoring. The value of this monitoring, of annually reporting on the evaluation function’s health (see paragraph 1.89), and of developing a government-wide plan was recognized in the requirements of the 2009 Policy on Evaluation, as well as in earlier drafts of the policy. However, over the period of our audit, the Secretariat developed only one such report and plan. For the purposes of continuous improvement, it will be important for the Secretariat to pursue these oversight activities, with departments, on a regular and systematic basis.
Sustained support for effectiveness evaluation is lacking
1.74 The Treasury Board’s 2001 Policy on Evaluation advised that evaluations consider program relevance, success, and cost-effectiveness. We therefore expected the Treasury Board of Canada Secretariat to provide guidance to departments and to help them identify these elements and improve effectiveness evaluation.
1.75 One area the Secretariat has clearly identified is the need to improve the use of evaluation for expenditure review. In 2005, a discussion document from the President of the Treasury Board, Management in the Government of Canada: A Commitment to Continuous Improvement, stressed that evaluation must become more directly linked to decisions on resource allocation.
1.76 In our audit of the expenditure management system (EMS) (November 2006 Report of the Auditor General of Canada, Chapter 1—Expenditure Management System at the Government Centre), we recommended a systematic review of the relevance and value-for-money of ongoing programs. The Public Accounts Committee pursued this issue in 2008 by recommending that the Secretariat develop an action plan to hire and train the necessary evaluators and that it reinforce the importance of evaluation as a key requirement in the EMS.
1.77 The government accepted these recommendations and launched the new expenditure management system that included strategic review. It viewed evaluation as the primary source of neutral and systematic information on the ongoing relevance and performance of policies and programs. In light of these developments, we looked for the Secretariat’s actions and initiatives in support of effectiveness evaluation.
1.78 We noted an initiative to develop a value-for-money tool for evaluation, launched as a pilot project in 2006. However, this initiative came to an abrupt halt in 2008 and, over the period of our audit, did not move beyond the pilot stage. The Secretariat found that the tool was being used to meet minimum data requirements, rather than the new policy’s objective of examining effectiveness issues. As we completed our audit, Secretariat officials informed us that development of the project is continuing and that the tool is being revised to take into account the requirements of the new Policy on Evaluation.
1.79 The Centre of Excellence for Evaluation has undertaken a number of similar initiatives related to capacity development, but during our audit, we found that the results of these initiatives were unclear. We noted policy requirements and related initiatives for evaluators to support the measurement of program performance. However, we did not find that the Secretariat had made progress in developing tools to help departments address the long-standing problem of insufficient data for the evaluation of effectiveness.
1.80 While the Secretariat recognized the value of effectiveness evaluation, particularly for expenditure management, it did not issue, over the period of our audit, adequate guidance or tools to support effectiveness evaluation.
1.81 The renewal of the 2001 Policy on Evaluation took far longer than expected. We found documentation from 2006, 2007, and 2008 indicating the policy would be completed in each of these years. In fact, the new policy took effect only on 1 April 2009. However, at the time of our audit, despite the three years that were spent developing the policy, the Secretariat had not issued guidance to departments on its implementation.
1.82 Recommendation. In developing tools, guidance, and support for departments, the Treasury Board of Canada Secretariat should regularly identify gaps that it needs to act on, develop plans to address these gaps, and act on these plans.
The Treasury Board of Canada Secretariat’s response. Agreed. The Secretariat is presently developing guidance for departments and agencies to support the implementation of the new Treasury Board Policy on Evaluation (April 2009). Guidance in a number of key areas, including departmental evaluation planning and performance measurement strategies, is expected to be available to departments and agencies by 31 March 2010. Further guidance will be issued over the course of the 2010–11 fiscal year.
During the past two years, the Treasury Board of Canada Secretariat has issued guidance on the development of performance measurement frameworks required under the 2005 Policy on Management, Resources and Results Structures, which was not covered by the current audit. In addition, through its annual Management Accountability Framework assessments, the Secretariat provides advice and support to departments on the quality and use of evaluation as well as on the quality of performance measurement frameworks associated with departmental Management, Resources and Results Structures.
The Secretariat continually consults with departments and agencies on their needs and monitors policy implementation to identify weaknesses within the government-wide evaluation function. Where the Secretariat determines a need to develop further tools, guidance, or other supports, it includes these activities in its business plans.
Oversight and support require experienced staff
1.83 We examined whether the Treasury Board of Canada Secretariat had the human and financial resources needed for government-wide oversight and support of the program evaluation function.
1.84 The Centre of Excellence for Evaluation’s major responsibilities include
renewing the evaluation policy;
acting as a policy centre, by providing advice and analysis to support the expenditure management system;
conducting Management Accountability Framework assessments of the evaluation function; and
providing support for the evaluation function across government.
1.85 In addition, the Centre monitors the evaluation policy. Many of these tasks are analytical and call for experienced personnel. The review of evaluations, program accountability frameworks, and submissions requires sufficient expertise to provide recommendations and guidance to program sector analysts as well as to the Treasury Board itself.
1.86 We compiled information on the Centre’s workload and compared it to the resources allocated over the audit period. The funding for oversight work is largely salary-based. For the period of our audit, the staff complement of the Centre varied in size, from only 8 in the 2005–06 and 2006–07 fiscal years to 12 in the 2008–09 fiscal year. Overall, the staff levels during the 2005–06 to 2008–09 period were lower than the 15 on staff in the 2004–05 fiscal year, even though the Centre workload pertaining to its oversight activities was increasing. For example, the number of Treasury Board submissions reviewed by the Centre almost doubled from the 2004–05 to 2008–09 fiscal years.
1.87 The Secretariat clearly requires experienced analysts with appropriate expertise in evaluation in order to meet workload demands. The limited number of staff allocated to these functions during the audited period may have contributed to the lack of sustained support for effectiveness evaluation.
1.88 Recommendation. The Treasury Board of Canada Secretariat should ensure that it allocates sufficient resources to tasks that require evaluation expertise.
The Treasury Board of Canada Secretariat’s response. Agreed. In renewing the Policy on Evaluation, the Government has strengthened its commitment to evaluating the value for money of federal programs and reaffirmed the Secretariat’s role of leading the evaluation function. In performing its functional leadership role established in the new policy, the Treasury Board of Canada Secretariat will ensure that the resources necessary for performing this role are considered and sufficient resources are allocated at the Secretariat to tasks that require evaluation expertise.
Care is needed in the implementation of the new coverage requirements
1.89 By 2006, the Secretariat had completed its diagnostic work and issued its Health of the Evaluation Function Report. When the legal requirement that all ongoing grant and contribution programs (a part of direct program spending) be evaluated was enacted in 2006, it created pressure for 100 percent coverage of such spending, in parallel with the later reform of the expenditure management system. The Secretariat knew the challenges this requirement posed for the evaluation function and that departments were hard pressed to meet it. The requirement was especially challenging for the audited departments that have a high proportion of such programs, although additional central funding was provided in 2007 to support meeting this coverage requirement. In particular, as the Secretariat itself noted, requiring full coverage made it more difficult to target evaluation efforts on the basis of risk.
1.90 We noted that deputy heads of departments remain responsible for implementing the 2009 evaluation policy, including the expanded coverage requirements. It will be important for the Secretariat to work with departments to ensure that they are fully prepared to implement these coverage requirements, in order to meet the expectations set out in the new evaluation policy.
1.91 The implementation of the new coverage requirement faces serious challenges. Earlier requirements for full coverage were never met. Current legal requirements for effectiveness evaluation of all grants and contributions programs have been difficult to meet, and department officials told us that they have concerns about their capacity to respond to these requirements. Moreover, we found a shortage of experienced evaluators and extensive use of contractors over the period audited. For example, Environment Canada estimated that it would have to double the complement of its evaluation unit over the next four years, or sacrifice evaluation depth in order to achieve full coverage.
1.92 In our view, it will be important for the Secretariat and departments to carry out effectiveness evaluation of programs that are susceptible to significant change because of shifting priorities and circumstances. These are programs where evaluations of the relevance, impact, and achievement of objectives can be put to best use. During the transition to full coverage, these programs may present the biggest opportunities for effectiveness evaluation.
1.93 Recommendation. The Treasury Board of Canada Secretariat should help departments prepare to implement the new coverage requirements. During the transition period, the Secretariat should provide advice and guidance for effectiveness evaluation, focusing on programs where such evaluation can be put to best use.
The Treasury Board of Canada Secretariat’s response. Agreed. The Secretariat is planning to issue written guidance for making risk-based choices for evaluation coverage to support departments during the transition period. This guidance is expected by 31 March 2010.
Throughout the transition period, the Secretariat will also help departments prepare to implement the new coverage requirements that come into effect after 31 March 2013. The Secretariat will provide leadership in the development and sharing of effective evaluation practices across departments, as well as support capacity-building initiatives in the evaluation function government-wide.
Conclusion
1.94 The six departments included in this audit—Agriculture and Agri-Food Canada, Canadian Heritage, Citizenship and Immigration Canada, Environment Canada, Fisheries and Oceans Canada, and Human Resources and Skills Development Canada—followed systematic processes to plan their effectiveness evaluations. As well, most planned evaluations were completed. However, the evaluations conducted by these departments covered only a low proportion of overall departmental expenses, and most of the evaluations we examined were hampered by inadequate data. As a result, departments were not able to demonstrate that they were sufficiently meeting needs for effectiveness evaluation.
1.95 Based on a sample of effectiveness evaluations, we found that the audited departments often did not have the necessary performance information to evaluate whether programs are effective.
1.96 Moreover, with the exception of Environment Canada, which has processes in place to identify needed improvements, the audited departments did not demonstrate that they had regularly identified and addressed areas for improvement in effectiveness evaluation during the audit period. Such a cycle of continuous improvement would steadily add value to effectiveness evaluation.
1.97 The departments we examined expressed concerns about their capacity to implement evaluation of all direct program spending, as required under the 2009 Policy on Evaluation. Even before these expanded requirements, they found it challenging to hire enough experienced evaluators to fully meet needs for effectiveness evaluation, and they had not been able to regularly address areas for improvement. In our view, identifying programs where effectiveness information can be put to the best use will be a key part of implementing the coverage requirements of this policy.
1.98 Over the past five years, the Treasury Board of Canada Secretariat has introduced initiatives to address improvements in evaluation. However, support for effectiveness evaluation, which is an important area for the Secretariat, did not receive sustained attention. While the Secretariat did regularly identify areas for improvement, it did not provide adequate guidance.
1.99 Overall, we found that, in the six departments we audited, needs for effectiveness evaluation were not being adequately met. Improvements are required in departments, and in the oversight and support activities of the Treasury Board of Canada Secretariat, in order to remedy the situation.
1.100 These findings are similar to many reported by the Office in previous audits of program evaluation. Taken together, they raise basic questions about effectiveness evaluation in the federal government.
1.101 In our view, the federal evaluation function is at a crossroads. A vital public purpose is served when effectiveness evaluation informs the important decisions that Canadians are facing. Departments face greater expectations than ever before and are taking on added responsibilities. Much remains to be done to meet the challenge. Continuous improvement is the way forward.
About the Audit
All of the audit work in this chapter was conducted in accordance with the standards for assurance engagements set by The Canadian Institute of Chartered Accountants. While the Office adopts these standards as the minimum requirement for our audits, we also draw upon the standards and practices of other disciplines.
Objectives
The overall objective of this audit was to determine whether selected departments and the Treasury Board of Canada Secretariat are meeting the needs for effectiveness evaluation and are identifying and making improvements in effectiveness evaluation.
The audit objectives for the three lines of enquiry were as follows:
Determine whether selected departments can demonstrate that they are meeting needs for effectiveness evaluation and regularly identify and address areas for improvement.
Determine whether selected departments can demonstrate that they have the capacity to meet key needs for effectiveness evaluation and regularly identify and address areas for improvement.
Determine whether the Treasury Board of Canada Secretariat’s government-wide oversight of the program evaluation function has regularly identified and addressed areas for improvement, to help ensure that departments have the capacity to meet needs for effectiveness evaluation.
Scope and approach
Focus on effectiveness evaluation. Based on the Auditor General Act, section 7(2)(e), this audit examined program evaluation government-wide, in relation to the measurement of program effectiveness (meaning the assessment of the extent to which programs are relevant, produce impacts and outcomes, achieve their objectives, and are cost effective). Other types of evaluation examine program implementation and management.
Not a compliance audit. Because the 2001 Treasury Board evaluation policy was replaced in April 2009, the audit did not focus on compliance with policy.
Evaluation quality. We examined evaluation quality by determining whether departments had processes in place, including quality assurance, to ensure that their effectiveness evaluations are appropriate for their intended uses.
Strategic review and the Expenditure Management System. In view of the link between the Expenditure Management System and evaluation (that is, evaluation is seen as a key source of information for strategic review), the audit sought to determine whether evaluations met the need created by strategic review.
Selection of entities. The entities selected for examination in the audit were the Treasury Board of Canada Secretariat and the following six departments:
Agriculture and Agri-Food Canada,
Canadian Heritage,
Citizenship and Immigration Canada,
Environment Canada,
Fisheries and Oceans Canada, and
Human Resources and Skills Development Canada.
The selection of departments was based on factors such as materiality, range of program types, nature of the evaluation function, Management Accountability Framework (MAF) ratings, and whether a strategic review had been carried out. Our audit also included descriptive information provided by other large departments (those participating in the annual MAF process), which attested to the accuracy of the information they provided.
During our audit, we conducted interviews, reviewed files and documents, analyzed descriptive information provided and attested to by large departments, and met with focus groups who provided us with an informed stakeholder perspective.
Period covered by the audit
The audit period was between the 2004–05 and 2008–09 fiscal years. This period was chosen because enough time would have elapsed since the introduction of the 2001 policy to allow its effects to take hold. In addition, the 2004–05 fiscal year was when the Treasury Board of Canada Secretariat carried out an interim evaluation of the 2001 policy. It was also the year when the Secretariat engaged contractors to complete a series of diagnostic studies aimed at understanding the “state of play” of the function at that time. Focusing on the 2005–06 to 2008–09 fiscal years allowed us to examine the impact of the Secretariat’s efforts to understand how well the 2001 policy was working: how the Secretariat identified problems between the 2002–03 and 2004–05 fiscal years, and how it addressed those problems between the 2004–05 and 2008–09 fiscal years.
Audit work for this chapter was substantially completed on 31 May 2009.
Criteria
Listed below are the criteria that were used to conduct this audit and their sources.
Criteria
Sources
We expected that departments could demonstrate that program evaluation plans take appropriate account of needs for effectiveness evaluation.
Auditor General Act, section 7(2)(e)
Financial Administration Act, section 42(1)
Treasury Board Evaluation Policy (2001)
Treasury Board of Canada Secretariat, Centre of Excellence for Evaluation, Evaluation Function in the Government of Canada (2004), Appendix 2, Evaluation Standards in the Government of Canada
Treasury Board of Canada Secretariat, Centre of Excellence for Evaluation, A Guide to Developing a Risk-Based Departmental Evaluation Plan (2005), section 3.2
Treasury Board Policy on Transfer Payments (2008)
We expected that departments could demonstrate that they have acted on program evaluation plans to meet key needs.
Treasury Board Evaluation Policy (2001)
Treasury Board of Canada Secretariat, Centre of Excellence for Evaluation, A Guide to Developing a Risk-Based Departmental Evaluation Plan (2005), page 14
We expected that departments could demonstrate that their effectiveness evaluations appropriately meet identified needs.
Treasury Board Evaluation Policy (2001)
Treasury Board of Canada Secretariat, Centre of Excellence for Evaluation, Evaluation Function in the Government of Canada (2004), Appendix 2, Evaluation Standards in the Government of Canada
We expected that departments could demonstrate that they regularly identify and act on required improvements in meeting needs for effectiveness evaluation.
Treasury Board Evaluation Policy (2001)
Treasury Board of Canada Secretariat, Centre of Excellence for Evaluation, Evaluation Function in the Government of Canada (2004), Appendix 2, Evaluation Standards in the Government of Canada
We expected that departments could demonstrate reasonable efforts to ensure sufficient qualified evaluation staff to meet key needs for effectiveness evaluation.
Treasury Board Evaluation Policy (2001)
Treasury Board of Canada Secretariat, Centre of Excellence for Evaluation, Evaluation Function in the Government of Canada (2004), Appendix 2, Evaluation Standards in the Government of Canada
Treasury Board of Canada Secretariat, People Component of the Management Accountability Framework (PCMAF) (2005), page 1
Public Service Commission of Canada, Staffing Management Accountability Framework (SMAF) (2005), page 4
Government Response to the Fourth Report of the Standing Committee on Public Accounts: The Expenditure Management System at the Government Centre and the Expenditure Management System in Departments (2008), page 8
We expected that departments could demonstrate that the amount and the time frame of funding for effectiveness evaluation meet key needs.
Treasury Board of Canada Secretariat, Centre of Excellence for Evaluation, A Guide to Developing a Risk-Based Departmental Evaluation Plan (2005), pages 5 and 13
We expected that departments could demonstrate that evaluators have sufficient independence from program managers and that their objectivity is not hindered.
Treasury Board Evaluation Policy (2001)
Treasury Board of Canada Secretariat, Centre of Excellence for Evaluation, Evaluation Function in the Government of Canada (2004), Appendix 2, Evaluation Standards in the Government of Canada
We expected that departments could demonstrate that they regularly identify and act on required improvements to capacity to meet needs for effectiveness evaluation.
Treasury Board Evaluation Policy (2001)
Public Service Commission of Canada, Staffing Management Accountability Framework (SMAF), page 5
We expected that the Treasury Board of Canada Secretariat has the resources required for government-wide oversight of the program evaluation function.
Financial Administration Act, section 6(7)
We expected that the Treasury Board of Canada Secretariat could support the practice of government-wide evaluation by identifying needed improvements and determining and carrying out actions required of the Secretariat to help ensure that departments and agencies have the tools they need to achieve the desired results.
Treasury Board Evaluation Policy (2001)
The Standing Committee on Public Accounts, Report on the Expenditure Management System at the Government Centre and the Expenditure Management System in Departments (2008), page 16
Management reviewed and accepted the suitability of the criteria used in the audit.
Audit team
Assistant Auditor General: Neil Maxwell
Principal: Tom Wileman
Lead Director: Colin Meredith
Directors: Doreen Deveen, Leslie Levita
Irene Andayo, Helene Charest, Jeff Graham, Krista Hilge, Chandrawattie Samaroo
For information, please contact Communications at 613-995-3708 or 1-888-761-5953 (toll-free).
Appendix—List of recommendations
The following is a list of recommendations found in Chapter 1. The number in front of the recommendation indicates the paragraph where it appears in the chapter. The numbers in parentheses indicate the paragraphs where the topic is discussed.
Recommendation
Response
Meeting needs for effectiveness evaluation
1.37 Agriculture and Agri-Food Canada, Canadian Heritage, Citizenship and Immigration Canada, Environment Canada, Fisheries and Oceans Canada, and Human Resources and Skills Development Canada should develop and implement action plans to ensure that ongoing program performance information is collected to support effectiveness evaluation. (1.14–1.36)
Agriculture and Agri-Food Canada’s response. Agreed. The Department agrees that the systematic collection of program performance data by managers is necessary to report on program performance and to support effectiveness evaluations.
As required by the Treasury Board Policy on Transfer Payments, a performance measurement strategy for ongoing management of transfer payment programs, including performance measures and indicators and a data collection strategy, is developed for each new transfer payment program.
The Department’s evaluation function reviews program performance measurement strategies as they are developed to ensure that outcomes are defined, measurable, and attributable and that the strategies, if implemented, are sufficient to support future evaluation work.
Beginning in the 2009–10 fiscal year, the Department will conduct annual state of performance measurement reviews. The first such review will assess performance measurement practices at the Department and the adequacy of data collected for programs soon to be evaluated. An action plan to address its recommendations will be developed and its implementation monitored with a view to strengthening the Department’s practices in this area.
Canadian Heritage’s response. Agreed. The Department is executing its Action Plan for Implementation of the Management, Resources and Results Structures (MRRS) Policy to ensure that program staff are able to fulfill their responsibility for developing and maintaining performance measurement strategies. Action plan measures include
the provision of information sessions and workshops,
the establishment of indicators and corresponding targets,
the development of robust methodologies to demonstrate outcomes,
the establishment of a centre of expertise on performance measurement within the Department,
the design and implementation of adequate tools and guidelines,
the establishment of relevant information technology and systems, and
the regular analysis and reporting of collected data.
This action plan is expected to be completed by the end of the 2011–12 fiscal year.
The Department’s Office of the Chief Audit and Evaluation Executive is continually providing advice and support to Department managers in their efforts to implement this action plan. In line with the Transfer Payment Policy, this office is also providing timely advice and support on program design and performance measurement strategies through the review of official approval documents for the creation and renewal of new or existing programs. Finally, as required under the new Evaluation Policy, the office will be submitting, in the 2010–11 fiscal year, its first annual report to the departmental evaluation committee on the state of performance measurement in the Department.
Citizenship and Immigration Canada’s response. Agreed. The Department recognizes the value of ongoing performance information for evaluation and will continue to support related departmental activities. The Department will develop an action plan that includes the renewal of the comprehensive departmental performance measurement framework and program activity architecture, and furthers the integration of the Framework into the business planning process.
Environment Canada’s response. Agreed. The Department accepts this recommendation, which echoes the intent of the 2009 Evaluation Policy related to performance information. As such, actions are already under way within Environment Canada to implement this recommendation. These include ongoing monitoring of the implementation of management responses to previous evaluations that have identified concerns with performance information, and the development and implementation of a strategy to inform all department managers of the Evaluation Policy requirements pertaining to performance measurement.
In addition, the Department’s evaluation plan will be expanded to include a monitoring component to verify, within available resources, the status of performance data collected in the department and whether sufficient performance information will be available to support upcoming evaluations. This monitoring component will be included in the 2010–15 Evaluation Plan and will be updated annually thereafter.
Further, for the 2010–11 fiscal year, the Department’s performance measurement framework has been linked to the Department’s program activity architecture, in that performance measures have been identified for all programs.
Fisheries and Oceans Canada’s response. Agreed. The Department’s performance measurement framework links its core indicators to the departmental program activity architecture (PAA), thus identifying performance measures for all program activities and sub-activities. Each fiscal year, the Department conducts an analysis of the state of performance measurement in the Department and provides an annual report to the Departmental Evaluation Committee. In addition, the Department will develop and implement an action plan to ensure that ongoing program performance information is collected to support effectiveness evaluation by the end of August 2010.
Human Resources and Skills Development Canada’s response. Agreed. The Department accepts this recommendation that echoes the intent of various Treasury Board of Canada Secretariat policies, including the 2009 Evaluation Policy. As such, the Department already gathers and monitors ongoing performance information to support effectiveness evaluation, including
monitoring implementation of management responses to previous evaluations that have identified data issues for current programs;
undertaking early evaluative work in advance of the formal initiation of effectiveness evaluations as part of the evaluation planning process, to review the state of performance data and logic models; and
monitoring the implementation of new programs (for example, Economic Action Plan initiatives) to ensure the necessary administrative and performance data are available to support future evaluation activities.
The Department is also undertaking a comprehensive review, refinement, and validation of its Performance Measurement Framework (PMF) to make it more robust and comprehensive and to support ongoing planning, monitoring, and managing for results; the Evaluation Directorate has been actively involved in this work. The progress made on the departmental PMF will support both performance monitoring and the evaluation of relevance and effectiveness.
1.43 Agriculture and Agri-Food Canada, Canadian Heritage, Citizenship and Immigration Canada, Environment Canada, Fisheries and Oceans Canada, and Human Resources and Skills Development Canada should consider the merits of including external experts on their departmental evaluation committees. The Treasury Board of Canada Secretariat should provide guidance to departments in this regard. (1.38–1.42)
Agriculture and Agri-Food Canada’s response. Agreed. The Department has recently introduced a number of practices to ensure production of strong evaluation reports. They include seeking input from external experts on evaluations in progress. The Department will also consider including external members on its Departmental Evaluation Committee, in the context of guidance provided by Treasury Board of Canada Secretariat.
Canadian Heritage’s response. Agreed. At the Department, oversight of the evaluation function is provided by the Strategic Policy, Planning and Evaluation Committee (SPPEC), chaired by the Deputy Head. This structure provides opportunities for enhanced integration among the policy, planning, and evaluation functions of the Department. As well, the Department already brings key evaluation reports to its departmental audit committee for review and discussion when necessary. The Department will collaborate with the Treasury Board of Canada Secretariat’s Centre of Excellence in Evaluation to assess the value added of integrating the advice of external evaluation experts to inform the work of departmental evaluation committees.
Citizenship and Immigration Canada’s response. Agreed. The Department will assess the benefits of including an external evaluation expert on the departmental evaluation committee.
Environment Canada’s response. Agreed. The Department accepts the recommendation and will await guidance from the Treasury Board of Canada Secretariat on the inclusion of external members on departmental evaluation committees.
Fisheries and Oceans Canada’s response. Agreed. The Department will consider the merit of including external members on its departmental evaluation committee, in the context of guidance provided by the Treasury Board of Canada Secretariat.
Human Resources and Skills Development Canada’s response. Agreed. The Department has, in the past, included members from outside the Department on its departmental evaluation committee. The Department will reconsider the formal inclusion of external experts on the current Departmental Evaluation Committee and will look to the Treasury Board of Canada Secretariat for guidance on this part of the Evaluation Policy.
The Treasury Board of Canada Secretariat’s response. Agreed. The Secretariat agrees that it should provide guidance to departments on the possible merits of including external experts on their departmental evaluation committees. This guidance will be provided by 31 March 2010.
1.48 Agriculture and Agri-Food Canada, Canadian Heritage, Citizenship and Immigration Canada, Fisheries and Oceans Canada, and Human Resources and Skills Development Canada should implement systematic processes to determine whether their effectiveness evaluations are meeting government-wide requirements and internal corporate needs, and act on areas identified for improvement. The Treasury Board of Canada Secretariat should monitor and provide any additional support it considers necessary for the implementation of these processes. (1.44–1.47)
Agriculture and Agri-Food Canada’s response. Agreed. The Department accepts this recommendation and notes that in the past year, it has introduced a number of systematic processes, which together will ensure effectiveness evaluations address senior management information needs in a timely manner. They include annual consultations on evaluation priorities, requests for feedback on completed evaluations, and annual Head of Evaluation reports on the performance of the evaluation function.
Canadian Heritage’s response. Agreed. The Department recognizes the need to establish systematic processes to assess whether effectiveness evaluations are addressing needs. The Evaluation Services Directorate is already developing a performance measurement framework and management strategy to identify clear performance expectations and standards for the Department’s evaluation function. A systematic process to collect, analyze, and report on performance data and client satisfaction will be implemented in order to identify areas for improvement. Based on data collected during the first year of implementation (the 2010–11 fiscal year), periodic reporting to the Departmental Evaluation Committee should begin by the 2011–12 fiscal year. This performance information will complement data already reported in the context of the annual management accountability framework assessments conducted by the Treasury Board of Canada Secretariat.
Citizenship and Immigration Canada’s response. Agreed. The Department will assess the need for systematic processes in addition to the Management Accountability Framework assessment process, the oversight of the Departmental Evaluation Committee, requirements for annual reporting, and the Department’s evaluation process, which includes several steps of consultation and feedback from the Department’s branches.
Fisheries and Oceans Canada’s response. Agreed. The recommendation echoes the intent of the 2009 Policy on Evaluation. As such, actions are already under way within the Department to implement a systematic process to determine whether effectiveness evaluations are meeting internal corporate needs and government-wide needs (i.e., a strategic review) and to act on areas identified for improvement. Further work in this area will be completed in the context of guidance provided by the Treasury Board of Canada Secretariat.
Human Resources and Skills Development Canada’s response. Agreed. The Department currently employs a variety of systematic processes to ensure the quality and relevance of its evaluations for both internal and government-wide needs. External peer reviewers and evaluation advisory committees are a mandatory element of the evaluation work to ensure that evaluations are meeting information needs. Further work in systematically determining whether evaluations are meeting government and senior management needs will be developed, building upon the current annual Management Accountability Framework assessment process led by the Treasury Board of Canada Secretariat. Areas for improvement will be identified and reported to the Departmental Evaluation Committee.
The Treasury Board of Canada Secretariat’s response. Agreed. The Secretariat agrees that it should assist departments as necessary in their implementation of processes to determine whether evaluations are meeting government-wide needs and should provide support to departments that it considers necessary.
The new Treasury Board Policy on Evaluation, which came into effect on 1 April 2009, includes specific requirements to enhance evaluation coverage and examination of program effectiveness as well as performance information to support evaluations. The policy calls on the Secretary of the Treasury Board to provide functional leadership for evaluation across government, including monitoring and reporting annually to the Treasury Board on the health of the evaluation function.
The Secretariat currently carries out a large portion of this work through the annual Management Accountability Framework assessment process, which the Office of the Auditor General acknowledges was not covered by the current audit. The Secretariat communicates recommended areas for improvement in evaluation through the assessment reports that it sends to the deputy heads of departments and agencies, who are responsible for the evaluation function in their respective organizations.
Oversight and support
1.82 In developing tools, guidance, and support for departments, the Treasury Board of Canada Secretariat should regularly identify gaps that it needs to act on, develop plans to address these gaps, and act on these plans. (1.74–1.81)
The Treasury Board of Canada Secretariat’s response. Agreed. The Secretariat is presently developing guidance for departments and agencies to support the implementation of the new Treasury Board Policy on Evaluation (April 2009). Guidance in a number of key areas, including departmental evaluation planning and performance measurement strategies, is expected to be available to departments and agencies by 31 March 2010. Further guidance will be issued over the course of the 2010–11 fiscal year.
During the past two years, the Treasury Board of Canada Secretariat has issued guidance on the development of performance measurement frameworks required under the 2005 Policy on Management, Resources and Results Structures, which was not covered by the current audit. In addition, through its annual Management Accountability Framework assessments, the Secretariat provides advice and support to departments on the quality and use of evaluation as well as on the quality of performance measurement frameworks associated with departmental Management, Resources and Results Structures.
The Secretariat continually consults with departments and agencies on their needs and monitors policy implementation to identify weaknesses within the government-wide evaluation function. Where the Secretariat determines a need to develop further tools, guidance, or other supports, it includes these activities in its business plans.
1.88 The Treasury Board of Canada Secretariat should ensure that it allocates sufficient resources to tasks that require evaluation expertise. (1.83–1.87)
The Treasury Board of Canada Secretariat’s response. Agreed. In renewing the Policy on Evaluation, the Government has strengthened its commitment to evaluating the value for money of federal programs and reaffirmed the Secretariat’s role of leading the evaluation function. In performing its functional leadership role established in the new policy, the Treasury Board of Canada Secretariat will ensure that the resources necessary for performing this role are considered and sufficient resources are allocated at the Secretariat to tasks that require evaluation expertise.
1.93 The Treasury Board of Canada Secretariat should help departments prepare to implement the new coverage requirements. During the transition period, the Secretariat should provide advice and guidance for effectiveness evaluation, focusing on programs where such evaluation can be put to best use. (1.89–1.92)
The Treasury Board of Canada Secretariat’s response. Agreed. The Secretariat is planning to issue written guidance for making risk-based choices for evaluation coverage to support departments during the transition period. This guidance is expected by 31 March 2010.
Throughout the transition period, the Secretariat will also help departments prepare to implement the new coverage requirements that come into effect after 31 March 2013. The Secretariat will provide leadership in the development and sharing of effective evaluation practices across departments, as well as support capacity-building initiatives in the evaluation function government-wide.

Definitions:
Direct program spending—Includes operating and capital spending and grants and contributions, but does not include public debt charges and major transfers to persons or other levels of government.
Reference level—The amount of funding that the Treasury Board has approved for departments and agencies to carry out approved policies and programs for each year of the planning period.
Effectiveness evaluation—An assessment of the extent to which programs are relevant, produce impacts and outcomes, achieve their objectives, and are cost-effective. Other types of evaluation examine program implementation and management.
Non-statutory grant and contribution programs—Programs whose spending authority is provided in an appropriation act that is voted on in Parliament in the Main and Supplementary Estimates, as opposed to programs whose spending authority comes from other legislation.

Public Safety Canada at work: not good!

Chapter 7—Emergency Management—Public Safety Canada
Main Points
Introduction
Focus of the audit
Observations and Recommendations
Establishing policies and programs
Establishing and exercising federal leadership has been a challenge
A consistent risk management approach is lacking
Coordinating federal emergency management
There has been progress in developing a government operations centre
Lessons learned have not been used to improve emergency response
Coordination is unclear for responses to chemical, biological, radiological, nuclear, or explosives emergencies
Promoting a common approach for response
Standards to promote interoperability are still under development
Protecting critical infrastructure
A strategy for protecting critical infrastructure has been slow to develop
Canada’s critical infrastructure remains undetermined
The energy and utilities sector is making progress on protecting critical infrastructure
Cyber security has recently received more attention, but significant challenges remain
Conclusion
About the Audit
Appendix—List of recommendations
Exhibits:
7.1—The Emergency Management Act dictates specific responsibilities for ministers
7.2—Information coordination and decision making for emergency response set out in the Federal Emergency Response Plan
7.3—The 10 critical infrastructure sectors show varied progress toward their emergency management being operational
Main Points
What we examined
Emergency management refers to a wide range of measures to protect communities and the environment from risks and to recover from emergency events stemming from either natural or human-induced causes. While some emergencies in Canada can be handled locally by municipalities or provinces, the federal government will assist when requested, when the emergency transcends jurisdictional boundaries, or when its assistance is in the national interest. As emergency events today can escalate quickly, this federal capability has become increasingly necessary.
Through legislation and government policy, Public Safety Canada, which was created in December 2003, is responsible for leading the management of emergencies by coordinating federal departments and agencies. This includes establishing policies and programs for preparing, testing, exercising, and implementing emergency management plans; it also includes monitoring and coordinating, along with the provinces, a common federal approach to emergency response—an “all-hazards” approach incorporating prevention and mitigation, preparedness, response, and recovery. The Department’s responsibility for emergency management includes coordinating the protection of critical infrastructure—from planning for emergencies to recovering from them. Critical infrastructure includes physical and information technology facilities, networks, services, and assets essential to the health and safety or economic well-being of Canadians.
We examined how Public Safety Canada carries out these responsibilities. In addition, we looked at its efforts to enhance emergency response and recovery in coordination with six other departments that have specific roles in emergency management. Our audit included assessing the government’s progress on some of the commitments it made to Parliament. Our audit covers the performance of federal departments and agencies and events taking place between our last audit, reported in April 2005, and 15 June 2009.
We did not examine the performance of emergency management efforts by provinces, territories, or local communities.
Why it’s important
The H1N1 pandemic, the 2003 eastern seaboard power blackout, Severe Acute Respiratory Syndrome (SARS), massive flooding, and terrorist conspiracies and attacks have demonstrated that global trade, international travel, and cyberspace have increased the speed at which emergencies escalate in scope and severity. Today, many emergencies can be difficult to contain by a single government department or jurisdiction. A federal response is needed for emergencies that are beyond the capacities of other players—emergencies that may have a low probability of occurrence but a high potential impact.
Public Safety Canada is faced with the challenging task of providing the coordination necessary for an overall federal approach to emergency management, in an environment where departments have operated as needed and through their ministers to provide federal assistance on a case-by-case basis.
What we found
Public Safety Canada has not exercised the leadership necessary to coordinate emergency management activities, including critical infrastructure protection in Canada. For example, it has yet to develop the policies and programs that would help clarify its leadership and coordination role for an “all-hazards” approach to the emergency management activities of departments. Public Safety Canada has taken the first step by developing the interim Federal Emergency Response Plan, which it considers to be final although it has not been formally approved by the government. Nor does the Plan include updated or completed definitions of the roles, responsibilities, and capabilities needed for an integrated, coordinated approach to emergency response.
Public Safety Canada has made considerable progress in improving federal emergency coordination through its Government Operations Centre. The Centre keeps other departments informed of the status of events on a real-time basis and also produces regular situation awareness reports on issues such as the H1N1 virus, which allow decisions to be based on a common set of facts.
Public Safety Canada has developed a strategy to protect Canada’s critical infrastructure, but this strategy is still in draft form. At the time of our audit, the critical infrastructure that needs to be protected had not yet been determined. Public Safety Canada has moved forward in promoting a consistent approach to protection efforts across government. For example, it has categorized critical infrastructure into 10 sectors, each headed by a federal department. However, it has not provided those departments with guidance for determining what assets or facilities are critical and require protection.
Until 2009, progress was slow on Public Safety Canada’s 2004 commitment to develop a cyber security strategy, even though threats to computer-based critical infrastructure, including federal information systems, have been growing and evolving. To date, the Department has identified the key elements of a cyber strategy and, along with other federal government departments, initiated action on a list of current cyber security initiatives. However, at the time of our audit, no date had been set for obtaining formal approval of the strategy.
Although the 2004 National Security Policy called for first responders’ equipment and communications to be interoperable, key gaps remain for voice communications. This limits the ability of fire, police, and ambulance services to work together and with other jurisdictions in an emergency. The Department has directed little or no funding toward standardizing equipment.
The Department and the Privy Council Office have responded. The Department and the Privy Council Office agree with all of the recommendations that are addressed to them. Their detailed responses follow the recommendations throughout the chapter.
Introduction
7.1 Emergencies today can have a broader impact than those of the past. Examples of emergencies that have recently affected Canadians include the outbreaks of H1N1 and avian influenza, severe acute respiratory syndrome (SARS), listeriosis, and mad cow disease; the 1998 ice storm in Eastern Canada; and the 2003 power blackout across the eastern seaboard. Urban density, international travel, and global trade have increased the speed at which emergencies can escalate and spread. Today, many emergencies can be difficult to contain, and the impact is likely to be greater. A federal response is needed for those emergencies that are beyond the capacity of municipalities or individual provinces or territories—emergencies that may have a low probability of occurrence but can have a high potential impact. To be able to respond effectively to large-scale emergencies and reduce the potential loss of life and property damage, there needs to be extensive planning and coordination.
7.2 Under Canada’s Constitution Act, 1867, provinces and territories have primary responsibility for emergency management within their boundaries. Emergencies such as fires and floods may remain local in nature and, if so, may be effectively managed within the local resources of the municipality and province or territory. If an incident escalates, so do the response activities of various levels of government. At the request of a province or territory or where the type of emergency falls within federal jurisdiction or occurs on federal lands, the federal government provides help to manage and coordinate the response to an emergency. The Emergency Management Act (2007) established that the Department of Public Safety and Emergency Preparedness (Public Safety Canada) is responsible for responding to requests for assistance made by provinces and territories and for coordinating the assistance provided by other federal departments and agencies to the provinces and territories.
7.3 Following the events of September 11, 2001, the Canadian government changed its approach to emergency preparedness and response. At that time, there was a highly decentralized division of responsibilities among federal departments, provinces, and territories. In December 2003, the government created the Department of Public Safety and Emergency Preparedness, bringing together emergency preparedness, national security, and policing responsibilities within one federal department. This restructuring was intended to better integrate public safety efforts and link various federal programs more closely.
7.4 Building the capability to manage a coordinated federal response to an emergency of national significance is a huge undertaking and cannot be achieved overnight. In the past, federal departments had organized their emergency response actions as situations arose. However, given the changing nature of national emergencies, this is no longer sufficient. Recognizing this, the federal government issued the National Security Policy in April 2004, which called for the federal government to be prepared to play an enhanced role in modern emergency management and to improve collaboration among governments and other entities. The 2004 policy outlined a number of initiatives to enhance the safety and security of Canadians. It identified the need for an “all-hazards” approach, meaning that whether the cause of an emergency is malicious, accidental, or natural, the federal government would be prepared to respond. To facilitate this, the policy called for an updated emergency response system in which federal entities would work together in a coordinated manner. As well, it identified the need for federal departments and agencies to be more strongly linked with emergency operations at the provincial, territorial, and local levels.
7.5 The 2004 National Security Policy, our 2005 audit of national security and emergency preparedness, the House of Commons Standing Committee on Public Accounts, and the Senate Standing Committee on National Security and Defence all called for updated federal legislation to clearly define and ensure adequate emergency management powers and responsibilities for the Minister of Public Safety.
7.6 In 2005, the Department of Public Safety and Emergency Preparedness Act was passed. It stipulates that the Minister of Public Safety is to exercise “leadership at the national level relating to public safety and emergency preparedness.” When she appeared before the Standing Committee on Justice, Human Rights, Public Safety and Emergency Preparedness, the Minister explained that she would be responsible for coordinating the federal response to emergencies, while respecting the Prime Minister’s prerogative in matters relating to national security and to the statutory authorities of other ministers.
7.7 In January 2007, federal, provincial, and territorial ministers agreed that emergency management would adopt a comprehensive all-hazards approach. This approach would incorporate the four functions of emergency management: prevention and mitigation, preparedness, response, and recovery.
7.8 In August 2007, the Emergency Management Act came into force. It assigns to the Minister of Public Safety the responsibility to “exercise leadership relating to emergency management by coordinating federal emergency management activities” (Exhibit 7.1).
Exhibit 7.1—The Emergency Management Act dictates specific responsibilities for ministers
The Emergency Management Act requires the Minister of Public Safety to exercise leadership for emergency management by coordinating emergency management activities among federal departments and agencies, and in cooperation with the provinces and territories.
The Minister’s responsibilities include
establishing policies and programs, and providing advice to other departments for the preparation of their emergency management plans;
analyzing and evaluating emergency management plans prepared by federal entities;
monitoring potential and actual emergencies and coordinating the federal response to an emergency;
coordinating federal emergency management activities with those of the provinces, and through the provinces, those of local authorities;
coordinating the provision of assistance to a province;
promoting a common approach to emergency management, including the adoption of standards and best practices; and
conducting exercises and providing emergency management education and training.
As well, other federal ministers are to identify the risks that are within their area of responsibility, including those related to critical infrastructure, and to prepare, maintain, test, implement, and exercise emergency management plans in respect of those risks in compliance with the policies, programs, and other measures established by the Minister of Public Safety.
Source: Adapted from the Emergency Management Act
7.9 Public Safety Canada is the coordinating agency for federal departments, which have various roles to play in an emergency. Public Safety Canada is to ensure that the federal government is ready to respond to any future emergencies through the development of policies, standards, and plans that define roles and responsibilities. The aim is to eliminate the potential for confusion when responding to a crisis and to provide a federal focal point for coordination.
7.10 If a department or agency has a clear mandate to respond to an emergency and is responsible to act, it is the subject matter expert. However, if emergencies escalate and spread, other federal departments may be required to play a role to manage the impact within their area of expertise. For example, for an incident involving a terrorist or criminal act, the Royal Canadian Mounted Police (RCMP) would be the primary federal response agency in its law enforcement role. For a natural disaster involving an earthquake or a power outage, Natural Resources Canada would be the primary subject matter expert. The Public Health Agency of Canada would be the subject matter expert for public health, including infectious diseases, in concert with Health Canada. Other departments or agencies, such as the Canada Border Services Agency, would play a supporting role. In each of these examples, Public Safety Canada plays a coordinating role in helping to receive information and communicate the current situation to other departments and agencies, and to senior officials in the federal government and other jurisdictions.
7.11 For emergency management, Public Safety Canada had a budget of $58.5 million and 400 employees for the 2008–09 fiscal year. Many of Public Safety Canada’s emergency management programs are delivered through 11 regional offices. It also manages the federal Government Operations Centre that monitors emerging threats and provides round-the-clock coordination and support to government entities in the event of a national emergency. As well, it oversees the conduct of exercises on emergency management at the national level and an inter-jurisdictional training program for local frontline emergency workers at its Canadian Emergency Management College.
7.12 Public Safety Canada’s role as the lead department for coordinating federal emergency management includes critical infrastructure protection. Critical infrastructure consists of physical and information technology facilities, networks, services, and assets essential to the health and safety or economic well-being of Canadians, and the effective functioning of government. Examples of critical infrastructure include food, water, and energy supplies; health services; financial systems; and communication networks, including the Internet. Events such as the 1998 ice storm in Eastern Canada and the 2003 power blackout across the eastern seaboard highlight the impact of the failure of the electrical grid. The vast majority of Canada’s critical infrastructure is owned by the private sector or managed through another level of government. This creates a challenge for the federal government to establish its role with owners and operators and thereby ensure the protection and resiliency of the nation’s critical infrastructure.
Focus of the audit
7.13 In this audit, we focused on four main responsibilities of Public Safety Canada:
To establish policies and programs for emergency management plans and operations, provide advice to departments, and evaluate their plans.
To coordinate the emergency management activities among federal government institutions along with those of the provinces and territories.
To promote a common approach to emergency management, including the adoption of standards and best practices.
To coordinate the protection of Canada’s critical infrastructure.
7.14 Specifically, we examined Public Safety Canada’s responsibility to lead by coordinating the efforts of other federal entities and by coordinating federal efforts with those of the provinces and territories. We focused mainly on the Department’s preparedness efforts, including its coordination of the provision of critical infrastructure protection. As well, we examined progress by Public Safety Canada in enhancing emergency response and recovery in coordination with government departments and agencies.
7.15 We did not examine the performance of provinces, territories, or local communities in their delivery of emergency management services or activities, nor did we examine provincial and territorial or private sector critical infrastructure protection efforts. We also did not examine the security activities carried out in preparation for the 2010 Olympic and Paralympic Games, as responsibility for these activities was assigned to the Office of the Coordinator for 2010 Olympics and G8 Security, which reports to the National Security Advisor.
7.16 More details on the audit objectives, scope, approach, and criteria are in About the Audit at the end of this chapter.
Observations and Recommendations
Establishing policies and programs
Establishing and exercising federal leadership has been a challenge
7.17 Public Safety Canada is responsible under legislation to exercise leadership through planning, establishing policies and programs for emergency preparedness, cooperating with provinces and territories, and promoting a common approach to emergency management. It is responsible for coordinating the emergency management activities of various federal departments and agencies and fostering a cooperative approach to responding to emergencies.
7.18 Because the subject matter expertise and experience for dealing with emergencies resides in several different departments, Public Safety Canada has an important role to ensure that all potential hazards are addressed, that plans exist and have been shared and tested, and that, during a crisis, the kind of response needed is quickly established without confusion. However, Public Safety Canada does not assume control over other departments or tell them how to do their jobs. Each department remains responsible to its own minister and for acting as required under its own legislation. Public Safety Canada, under the Emergency Management Act, is responsible for establishing policies and programs that other ministers must follow in carrying out their emergency management responsibilities and determining how they will be coordinated. Given different mandates and accountabilities, it is important that Public Safety Canada know who it should communicate with and ensure that the various departments know how coordination will proceed and what the expected operating procedures will be. Nevertheless, each department determines whether it will assist during an emergency, what its role will be, and how it will operate with other federal, provincial, or territorial partners.
7.19 We found that while Public Safety Canada played a coordination role in some emergencies, including participating in the development of response plans for avian and pandemic influenza, it has yet to establish the policies and programs that would help define its leadership and coordination role for emergency management in an all-hazards environment. Defining a leadership role when each department responds to its own ministerial direction, and coordinating that direction with other departments can be a challenge. Nevertheless, Public Safety Canada was established to address these concerns and determine how to coordinate and harmonize the activities of the different departments needed to deal with today’s complicated and broad-reaching situations.
7.20 In order to move forward in its mandate to exercise leadership, Public Safety Canada needs to have experienced and knowledgeable staff in place. Another challenge we noted was that the Department has had difficulty attracting and retaining senior managers to provide the direction needed for its emergency management activities. This area of Public Safety Canada had an employee vacancy rate of 39 percent in the 2008–09 fiscal year and a vacancy rate of 50 percent the previous year. In April 2009, only 56 percent of senior managers had been in their jobs for more than 18 months. Turnover and change of staff have been particularly problematic; in the 2008–09 fiscal year, the rate of employee movement in emergency management (including appointments, promotions, deployments, acting assignments, and departures) was 71 percent.
7.21 In 2006, Public Safety Canada was allocated approximately $115 million over five years to enhance its core capacity for emergency management; in 2008, it was allocated a further $28 million over five years. In the 2008–09 fiscal year, Public Safety Canada had an annual budget of $58.5 million for emergency management. However, in each of the past two years, it left one third of its emergency management budget unspent. In this context, it is evident that Public Safety Canada has been unable to develop its capacity for emergency management.
7.22 In the face of these challenges, Public Safety Canada has taken the first steps toward establishing its leadership role by developing the interim Federal Emergency Response Plan, a framework for coordinating emergency response activities across government. Work has been under way on developing this plan, in various forms, since 2004. In June 2005, the House of Commons Standing Committee on Public Accounts recommended that Public Safety Canada obtain formal support for its plan from other departments. At the time of our audit, the Plan was still an outline of the requirements of an emergency response plan. The Plan has been presented to an interdepartmental committee of assistant deputy ministers. Although it has not been formally approved by Public Safety Canada or endorsed by other departments, officials told us that it is, nevertheless, being considered final.
7.23 While the framework may be considered complete, the roles and responsibilities and the capabilities (contained in its annexes) needed for an integrated, coordinated approach to emergencies have not been updated or completed. Department officials told us that details on how the federal plan supports provincial and territorial plans and capabilities are being drafted. At the time of our audit, Public Safety Canada expected to share the draft document with provincial and territorial representatives in September 2009. While we recognize that the Federal Emergency Response Plan will always need to be updated to reflect changes in policies and practices, it is a significant policy document that, with formal government approval, would provide proper authority and clear support to Public Safety Canada.
7.24 The Federal Emergency Response Plan outlines a decision-making process to help coordinate a federal response to emergencies. Since 2006, an interdepartmental assistant deputy ministers’ committee for emergency management (now, ADM-EMC) has met regularly to discuss emergency management priorities and to make decisions to guide federal government actions during emergencies. Depending on the severity of a situation, this committee may make decisions, or may refer the issue to the Federal Coordinating Officer (usually the Deputy Minister of Public Safety), who may refer the issue to a committee of deputy ministers. Similarly, the issue may be referred to Cabinet or, ultimately, to the Prime Minister. The federal emergency response structure is summarized in Exhibit 7.2. The ADM-EMC is co-chaired by Public Safety Canada and the Privy Council Office to facilitate the sharing of information should decisions need to be taken to a higher level. The ADM-EMC has served as the coordinating body for events such as the 2007 floods in British Columbia and the H1N1 virus pandemic in 2009.
Exhibit 7.2—Information coordination and decision making for emergency response set out in the Federal Emergency Response Plan
Source: Adapted from the Federal Emergency Response Plan (April 2009)
7.25 As part of our audit, we reviewed federal responses to six emergencies that occurred between August 2006 and May 2009, where multiple federal departments were involved and for which after-action reports were available. We tried to determine whether the Federal Emergency Response Plan was used as the framework for a coordinated response. In each of these cases, the Government Operations Centre was used to varying degrees to share information and analysis among entities. However, the ADM-EMC, the body responsible for coordinating the federal response to an emergency, did not meet to discuss possible responses during three of these six emergencies. According to after-action reports prepared by participating departments for these emergencies, there were problems in coordinating the federal response among departments and agencies in all cases. Roles and responsibilities needed to achieve a coordinated approach were not well understood and some established practices were not followed. At the time of our audit, the ADM-EMC intended to clarify roles and responsibilities in its decision-making process.
7.26 Recommendation. The Privy Council Office and Public Safety Canada should ensure that all components of the Federal Emergency Response Plan are completed and should obtain government approval for the plan.
The Privy Council Office and the Department’s response. Agreed. The Privy Council Office and Public Safety Canada will seek approval for the completed Federal Emergency Response Plan (FERP) at the earliest possible date and the supporting Emergency Support Functions (ESFs) prior to the end of the 2009–10 fiscal year. Public Safety Canada will seek approval of the National Emergency Response System (NERS), an annex to the FERP, which articulates how the FERP supports provincial and territorial emergency response plans, by the end of August 2010. Public Safety Canada will organize information sessions with departmental executive committees to brief departments on the FERP and their associated roles and responsibilities. The FERP and its components will be maintained as an evergreen document.
A consistent risk management approach is lacking
7.27 In order to be ready to respond, emergency management plans need to address the most important risks. In 2007, the Deputy Ministers’ Committee directed Public Safety Canada to assess the federal government’s state of readiness for a national emergency. Through this review process, a number of capability gaps were identified; however, Public Safety Canada did not have a framework upon which to prioritize or rank the severity of the gaps and, as a result, has not moved forward with an action plan to address these gaps. As well, the review found that Public Safety Canada lacked an all-hazards risk assessment that identified potential hazards to public safety or security—whether malicious, natural, or accidental. It also lacked a framework to determine required capabilities to respond to these risks.
7.28 The 2004 National Security Policy and the 2007 Emergency Management Act recognized that the federal government needed to better understand Canada’s vulnerability to emerging risks and use this information to develop comprehensive emergency plans and programs. Under the Emergency Management Act, federal departments are to identify risks that are within their area of responsibility, and prepare emergency plans in respect of those risks according to the policies established by Public Safety Canada. Under its leadership role for emergency management activities, Public Safety Canada is to coordinate risk assessments in collaboration with other federal departments and to ensure that they have proper emergency management plans and preparedness measures in place.
7.29 We found that Public Safety Canada has made limited progress in developing the guidance that departments need to achieve a consistent approach when identifying their risks and their emergency management plans and programs. A comprehensive risk and vulnerability assessment to guide the development of plans and response capabilities under an all-hazards approach has not been conducted in Canada. A Public Safety Canada study conducted in 2008 of 36 federal departments found wide variation in the risk assessment processes used by departments to guide the development of plans and capabilities. Some departments had no process in place. In the six federal departments we examined, we found that none had received any guidance from Public Safety Canada on conducting risk assessments for emergency planning, yet all of these departments were working to update their plans. Public Safety Canada initiated a project in April 2009 to streamline and validate these risk assessment processes for emergency planning and capabilities development. This project is in the preliminary planning stage.
7.30 The Emergency Management Act stipulates that Public Safety Canada is responsible for reviewing departmental emergency management plans, which includes departmental business continuity plans. These plans are needed so that federal organizations can continue operating during an emergency. Under the Emergency Management Act, Public Safety Canada is responsible for ensuring that business continuity plans are complementary and meet the overall needs of the federal government. It had provided a self-assessment tool for departments to review their own business continuity plans. However, at the time of our audit, Public Safety Canada had not formally analyzed or evaluated departmental business continuity plans, nor did it have plans to do so. It had not determined whether there were gaps between departments.
7.31 Recommendation. As stipulated in the Emergency Management Act, Public Safety Canada should establish policies and programs and provide advice for departments to follow when identifying risks and developing their emergency management plans.
The Department’s response. Agreed. In keeping with the all-hazards approach to emergency management, Public Safety Canada is leading the development of an Emergency Management Planning Framework that will provide departments and agencies with guidance, tools, and best practices for developing emergency management plans. It is also working with federal departments to develop an all-hazards risk assessment framework. Under the Emergency Management Act, it is the responsibility of each minister accountable to Parliament for a government institution to identify the risks that are within or related to his or her area of responsibility.
7.32 Recommendation. As stipulated in the Emergency Management Act, Public Safety Canada should analyze and evaluate the emergency management plans prepared by departments to ensure that they are prepared according to the policies, programs, and advice provided, and it should identify potential gaps or risks to a coordinated emergency management response.
The Department’s response. Agreed. Public Safety Canada is developing the Emergency Management Planning Framework, which will include performance measurements that will allow Public Safety Canada to analyze and evaluate emergency management plans produced by departments and agencies. The Framework will also include self-assessment tools for departments and agencies. Public Safety Canada is currently developing an approach to implement this initiative.
Coordinating federal emergency management
There has been progress in developing a government operations centre
7.33 In 2004, Public Safety Canada established the Government Operations Centre as the core of its federal coordination efforts for events of national significance. The role of the Government Operations Centre is not to act as a decision-making body in an emergency response, but to assemble and communicate information to decision makers. It is connected with the operations centres of 20 federal departments and agencies, as well as with those of the provinces and territories, and other countries, including the United States.
7.34 The Government Operations Centre has coordinated information and analysis among federal departments and provinces for numerous events since its inception. The scope of the emergency determines the scale and extent of its functions. However, it has not clearly defined when or why its level of activation changes in response to the severity of events and what this means for participating departments. A government-wide exercise, conducted in February 2009 by Public Safety Canada, found that information analysis and sharing at the operations centre was poor. Furthermore, officials at Public Safety Canada told us that the Government Operations Centre did not have the physical facilities to support the number of staff needed to keep the operations centre fully functional for a major emergency lasting an extended period of time. Public Safety Canada was in the process of determining what corrective actions were needed as we completed our audit work.
7.35 Public Safety Canada has made considerable progress in federal emergency coordination through its Government Operations Centre, as the centre operates on a continual basis and can track many potential or evolving events. It keeps other departments informed of the status of events on a real-time basis and alerts them if the events escalate into a more serious situation. The centre produced regular situation awareness reports for such issues as the H1N1 virus pandemic and Manitoba’s spring flooding in 2009, which allowed decisions to be based on a common set of facts. We noted that the Government Operations Centre reviewed how well it performed after events, but this was a verbal process. Results from these reviews are not normally tracked or monitored to ensure that corrective action is implemented.
Lessons learned have not been used to improve emergency response
7.36 In order for response plans to be reliable during an emergency, they must be regularly exercised, especially the plans for coordination between departments and agencies and between different levels of government. The National Security Policy and the Emergency Management Act call for regular exercises to assess the adequacy of emergency response plans in various scenarios. In 2004, the National Exercise Division was established within Public Safety Canada, with resources dedicated to staging regular national exercises at the federal, provincial, and municipal levels and consolidating lessons learned to improve future performance.
7.37 Over the past three years, Public Safety Canada budgeted a total of $17.1 million to plan and conduct exercises related to emergency management across the federal government and with the provinces and municipalities, as well as to share lessons learned and best practices with exercise participants. However, in each of the last three fiscal years, over half of the budget allocated to national exercises went unspent. Public Safety Canada maintains a calendar that lists exercises planned among federal departments and has developed a framework for federal departments and agencies to coordinate their national exercise efforts. Since April 2005, Public Safety Canada has coordinated five federal exercises, shared in the coordination of eight multi-jurisdictional exercises, and participated in an additional two exercises. However, we found that exercises were designed to meet the training objectives of individual departments, rather than to test the government’s overall coordination or readiness for a national emergency against identified risks. Public Safety Canada recognizes the need to increase the number of federal and multi-jurisdictional exercises.
7.38 In response to our April 2005 audit, Public Safety Canada committed to consolidating, on an ongoing basis, the results of lessons learned; however, at the time of our audit, it had not done so. The Department provided us with after-action reports for 14 of the exercises it coordinated or participated in since April 2005, but observations and recommendations from these reports were not systematically collected and used to improve emergency plans and operations.
Coordination is unclear for responses to chemical, biological, radiological, nuclear, or explosives emergencies
7.39 Following the events of September 11, 2001, Canada focused its attention on the significant threats posed by terrorist attacks and on the need to enhance readiness against emergencies caused by people, whether deliberate or accidental. In Budget 2001, the federal government allocated $513 million over six years to federal departments and agencies to improve their ability to respond to chemical, biological, radiological, or nuclear (CBRN) events, as these types of emergencies are beyond the response capacity of provinces, territories, and municipalities. Public Safety Canada receives $2.7 million annually for CBRN training. The initiative is currently being expanded to include the possibility of a threat due to explosives (CBRNE).
7.40 We examined the status of efforts made to improve CBRNE response capability, where a coordinated and integrated approach among federal departments, as well as provincial and local jurisdictions, is essential to success. To enhance the capacity of local emergency workers to respond to a CBRNE event, Public Safety Canada leads a training program for first responders from municipal, provincial, and territorial governments, with a combined annual federal budget of $12 million. From April 2003 to April 2009, it had trained 1,854 local first responders to assist during an event, and a further 10,400 had received awareness training. While Public Safety Canada has administered participant questionnaires and consulted experts and other government departments, it has not conducted a formal needs analysis for its first responder training.
7.41 In 2002, a federal team was established to prepare for and respond to potential CBRNE events, combining the efforts of the RCMP and National Defence; at the time, Health Canada; and, since 2004, the Public Health Agency of Canada. We expected that Public Safety Canada would lead the efforts of these departments, and we looked for evidence of joint planning and execution to develop the capabilities needed for a coordinated response and recovery.
7.42 Public Safety Canada is responsible for setting the overall federal policy on CBRNE issues. In 2005, it issued a federal strategy, identifying the roles and responsibilities of federal departments and agencies for an effective response to these types of emergencies. However, it did not address how federal departments and agencies would coordinate their resources with those of the provinces, territories, and municipalities to assist them in a national emergency, nor has it expanded the strategy to include explosives. At the time of our audit, Public Safety Canada was consulting with the provinces and territories to develop a national CBRNE strategy that included their responsibilities.
7.43 While the current strategy states that the government is to take all possible measures to pre-empt, prevent, mitigate, and respond effectively to a potential CBRNE incident, it has not identified the desired capability, mandate, roles, or priorities for crisis or consequence management for the responsible federal organizations. The role of the federal CBRNE team is to manage the crisis phase of an emergency; however, the team does not have the resources to manage the after effects of a CBRNE incident, including assisting in mass casualty evacuation, medical aid, or decontamination. In August 2008, the three departments involved in the federal CBRNE response team informed Public Safety Canada of their concerns with the team’s mandate, capacity, training, and the compatibility of communications equipment. While the responsibilities of each team member were clear, there were no defined operational protocols or agreements on how the team would work together in a coordinated manner. Team members felt that it was the responsibility of Public Safety Canada to define protocols and formalize agreements among members. At the time of our audit, these issues had not been resolved. Public Safety Canada officials told us that the role it could play in this type of emergency is unclear, as the three departments on the federal CBRNE team have the expertise, resources, and responsibility, while Public Safety Canada has none of these.
7.44 Recommendation. As stipulated in the Emergency Management Act, Public Safety Canada should ensure that its coordination role for the federal response to an emergency is well-defined and that the operational policies and plans that departments will follow are updated and consistent.
The Department’s response. Agreed. Public Safety Canada will maintain the Federal Emergency Response Plan and its components as an evergreen document. This includes ensuring the development of policies and event-specific plans that outline operational protocols and departmental roles and responsibilities, and reviewing these plans to ensure a coordinated approach as necessary.
Promoting a common approach for response
Standards to promote interoperability are still under development
7.45 The 2004 National Security Policy called for equipment and communications to be interoperable or compatible so that first responders could work together better. In response to our 2005 audit chapter, Public Safety Canada agreed to collaborate with a research group to develop standards for equipment for use in chemical, biological, radiological, or nuclear emergencies. The equipment is used in a variety of emergency response situations, and includes fire and heavy urban search and rescue vehicles, personal protective suits and gear worn by first responders to protect against hazardous materials, and communications systems.
7.46 First responders have identified voice communications as the main constraint to their interoperability. Capability gaps remain in communications interoperability that limit the ability of fire, police, and ambulance services to talk to one another and to communicate across jurisdictions during an emergency. Public Safety Canada officials told us that its role is not to establish standards but to assist first responder groups that purchase and use the equipment to develop their own standards. Public Safety Canada completed a draft document on a national approach for communications interoperability but has yet to present the draft to provincial officials for approval. For other types of equipment, Public Safety Canada is currently assisting groups to establish standards for personal protective equipment.
7.47 As noted in our 2005 audit, while the federal government could use directed funding to promote standardized equipment, officials told us that it has not done more due to a lack of resources. About $5 million in federal funding is available through an existing cost-shared program. Under this program, choices of equipment purchases are left to the provinces.
Protecting critical infrastructure
A strategy for protecting critical infrastructure has been slow to develop
7.48 Public Safety Canada is the lead federal department for coordinating the protection of Canada’s critical infrastructure. Critical infrastructure consists of those physical and information technology facilities, networks, services, and assets that, if disrupted or destroyed, would have a serious impact on the health, safety, and security or economic well-being of Canadians or the effective functioning of governments in Canada. The Emergency Management Act stipulates that the Minister of Public Safety is to provide advice and to analyze and evaluate federal departmental emergency management plans, which include critical infrastructure plans.
7.49 We examined whether Public Safety Canada was exercising a leadership role in developing and implementing a national strategy for critical infrastructure protection. Specifically, we examined its initiatives to provide advice and promote standards to other federal and provincial or territorial authorities.
7.50 In February 2001, the federal government identified the need to provide national leadership to protect Canada’s critical infrastructure from the risks of failure or disruption. Following the terrorist attacks in the United States on September 11, 2001, the federal government allocated $190 million over five years to improve critical infrastructure protection and emergency management capacity across the federal government. In 2004, the National Security Policy directed the federal departments to work with provinces, territories, and the private sector on initiatives to improve national capabilities to protect critical infrastructure. In its 2004–05 Report on Plans and Priorities, Public Safety Canada committed to the development and release of a National Strategy for Critical Infrastructure by spring 2005.
7.51 Although it did not meet its 2005 target date, Public Safety Canada began working with provinces, territories, and the private sector to develop a plan to implement a proposed National Strategy for Critical Infrastructure. The strategy identified 10 key sectors involved in critical infrastructure and designated a federal department to head each sector. We found that Public Safety Canada has consulted with representatives of government and private sector organizations in order to draft the National Strategy for Critical Infrastructure. It expects that implementation will take three years once the strategy is formally approved. Department officials told us that they are continuing to work on implementation while they await formal approval. From its monitoring, Public Safety Canada found that progress toward completion of the 37 milestones necessary for fully operational emergency management was more advanced in some sectors than in others (Exhibit 7.3).
Exhibit 7.3—The 10 critical infrastructure sectors show varied progress toward their emergency management being operational
Source: Based on data provided by Public Safety Canada
7.52 At the time of our audit, Public Safety Canada had started to develop guidance to promote a consistent approach to critical infrastructure risk assessments and protection efforts. However, this guidance had not been finalized or distributed to departments designated to head the sectors. We found that, in the absence of guidance from Public Safety Canada, departments have been developing their own approaches, without the assurance that they will result in plans that are coordinated and consistent across government.
Canada’s critical infrastructure remains undetermined
7.53 With the proposed critical infrastructure strategy, Public Safety Canada has taken the first step toward getting a complete picture of the infrastructure considered important at the federal, provincial, and municipal levels. However, to get this picture requires input from many different partners in government and the private sector. At the time of our audit, the critical infrastructure that needs to be protected had not yet been determined. Public Safety Canada had begun to map the infrastructure of 14 major Canadian cities. However, none of this information had been validated for its significance at the federal level. While certain assets may be deemed critical to an industry, municipality, province, or territory, those assets may not be critical at the federal or national level. This information is key for industry and all levels of government to allocate resources and develop their own protection plans.
7.54 There have been challenges to progress, specifically
the determination of which critical infrastructure needs protection;
the determination of what resources are available to protect critical infrastructure at the federal, provincial, and territorial levels and of where weaknesses or gaps exist; and
hesitation on the part of private sector owners and operators of some infrastructure to share information that would identify potential vulnerabilities that could provide competitors with an advantage.
7.55 While Public Safety Canada can move forward to develop policies and programs without resolving these issues, the unresolved issues will remain an impediment to achieving full success if they are not addressed. The proposed national strategy includes information sharing and protection as a key strategic objective, and the Access to Information Act has been amended to protect critical infrastructure information supplied by third parties. However, department officials at Public Safety Canada told us that, while they can provide advice and coordination to departments, it is the responsibility of operational departments to identify Canada’s critical infrastructure and determine how it should be protected before a national, coordinated approach can be implemented.
7.56 Public Safety Canada has provided no guidance to departments to ensure that they determine what critical infrastructure needs to be protected. Furthermore, there is little guidance to departments responsible for sectors to determine what assets or facilities are critical to the federal government. This information is essential for a coordinated approach to critical infrastructure protection.
The energy and utilities sector is making progress on protecting critical infrastructure
7.57 We examined the energy and utilities sector in more detail as it was seen to have made considerable progress in efforts to identify and protect critical infrastructure. Led by Natural Resources Canada, the sector is organized and has regular meetings and classified briefings to industry and government officials.
7.58 Natural Resources Canada is in the process of adding infrastructure information to maps, including not only pipelines and transmission lines, but also railways, telecommunications, and strategic buildings and structures. While these efforts are expected to complement the proposed national strategy, we note that this project was developed separately from Public Safety Canada, which has since initiated its own separate mapping of critical infrastructure.
Cyber security has recently received more attention, but significant challenges remain
7.59 Threats to computer-based critical infrastructure, including federal information systems, are evolving and growing. In April 2009, the Minister of Public Safety stated that there have been repeated attacks against this country’s computer systems. These cyber attacks may be initiated by individuals or groups, and may range from unintentional or amateur incidents to foreign state-sponsored espionage and information warfare; they present a constantly evolving threat. Cyber attacks could have very damaging consequences. For example, computer and communications networks are used to control such things as our electrical grid, which has varying vulnerabilities. Recently, the United States and the United Kingdom have significantly increased their efforts to fight cyber threats.
7.60 Public Safety Canada is in the process of developing a cyber security strategy—a commitment first made in the 2004 National Security Policy. While it has been working on a draft strategy, at the time of our audit, it had no date scheduled for its formal approval. Although the commitment was made in 2004, progress has been slow until this past year. Public Safety Canada has identified the key elements of a cyber strategy and has initiated action on a list of current cyber security initiatives along with other federal government departments.
7.61 Recommendation. Based on the responsibilities outlined in the Emergency Management Act, Public Safety Canada should provide policies and guidance for departmental sector heads to determine their infrastructure and assess its criticality, based on risk and its significance to the safety and security of Canadians; it should establish policies and programs to prepare plans to protect the infrastructure.
The Department’s response. Agreed. Based on the responsibilities outlined in the Emergency Management Act, Public Safety Canada will provide tools and guidance for sectors to determine their processes, systems, facilities, technologies, networks, assets, and services. Public Safety Canada will also provide tools and guidance for departmental sector heads to assess the infrastructure’s criticality based on risks and its significance to the safety and security of Canadians, and will establish policies and programs to prepare plans for their protection.
Conclusion
7.62 We found that Public Safety Canada has not exercised the leadership necessary to coordinate emergency management activities, including protection of critical infrastructure in Canada. While it has a challenging role, Public Safety Canada still needs to develop the policies and programs that would help clarify its leadership and coordination role for the emergency management activities of operational departments. Public Safety Canada has taken the first step by developing the interim Federal Emergency Response Plan. In our opinion, to make further progress, the plan would benefit from formal government approval and a better definition of roles and responsibilities of all players, as well as the capabilities needed for an integrated, coordinated approach to emergency response.
7.63 Public Safety Canada has drafted a strategy to protect Canada’s critical infrastructure, but it has not been formally approved. The Department has categorized critical infrastructure into 10 sectors, each headed by a federal department; however, at the time of our audit, the critical infrastructure that needs to be protected had not yet been determined.
7.64 We found that Public Safety Canada had made slow progress until this past year on its 2004 commitment to develop a cyber security strategy, although threats to computer-based critical infrastructure, including federal information systems, are evolving and growing. While it has been working on a draft strategy, at the time of our audit, it had no date scheduled for its formal approval. Public Safety Canada has identified the key elements of a cyber security strategy and has initiated action on a list of current cyber security initiatives along with other federal departments and agencies.
7.65 Over the period of our audit, Public Safety Canada, along with other federal departments and agencies, had made limited progress in enhancing the response to and recovery from emergencies in a coordinated manner. However, their rate of progress has improved, especially in the past year. Public Safety Canada has established a Government Operations Centre, which is connected to other federal departments and agencies. The centre has enabled Public Safety Canada to make considerable progress in coordinating response activities in times of crisis, as it keeps other departments informed of the status of events on a real-time basis. It also produces regular situation awareness reports for such issues as the H1N1 virus, which allow decisions to be based on a common set of facts. However, improvements can be made in identifying and implementing lessons learned from real emergencies and exercises. In its responsibility as the lead federal department for emergency management policies and plans, including those for chemical, biological, radiological, nuclear, and explosives incidents, Public Safety Canada has not clarified the decision-making processes and operational protocols for emergency response activities.
7.66 Public Safety Canada is making progress in promoting standards for personal protective equipment used in responding to emergencies. However, key interoperability gaps remain for voice communications, limiting the ability of various fire, police, and ambulance services to work together in an emergency. The Department has directed little or no funding toward standardizing equipment.
About the Audit
All of the audit work in this chapter was conducted in accordance with the standards for assurance engagements set by The Canadian Institute of Chartered Accountants. While the Office adopts these standards as the minimum requirement for our audits, we also draw upon the standards and practices of other disciplines.
Objectives
The objectives of this audit were to
determine whether Public Safety Canada can demonstrate that it has exercised leadership by coordinating emergency management activities, including critical infrastructure protection in Canada; and
determine whether Public Safety Canada, along with federal departments and agencies, can demonstrate progress in enhancing the response to and recovery from emergencies in a coordinated manner.
Scope and approach
This audit examined federal efforts to improve the nation’s readiness and resiliency to respond to incidents or attacks, through improved coordination of emergency management activities at the federal level, and through work with provinces and territories to achieve unified and integrated response and recovery operations. While the focus of the audit was Public Safety Canada, audit work was also conducted at the Privy Council Office, National Defence, the Royal Canadian Mounted Police, the Public Health Agency of Canada, Health Canada, Natural Resources Canada, the Canadian Food Inspection Agency, and the Canada Border Services Agency.
We followed up on selected recommendations made in our April 2005 chapter, National Security in Canada, regarding emergency preparedness, including response capabilities for a chemical, biological, radiological, or nuclear event. Public Safety Canada has the lead responsibility for addressing the majority of these recommendations. We also followed up on selected recommendations from the June 2005 House of Commons Standing Committee on Public Accounts report that supported our audit chapter with several recommendations to federal departments.
The audit did not examine the emergency management activities of the provinces and territories; rather, it focused on Public Safety Canada’s coordination of emergency management among federal departments and with the provinces and territories. The audit did not examine the security activities carried out in preparation for the 2010 Olympic and Paralympic Games, as responsibility for these activities was assigned to the Office of the Coordinator for 2010 Olympics and G8 Security, reporting to the National Security Advisor.
Criteria
Listed below are the criteria that were used to conduct this audit and their sources.
We expected that Public Safety Canada would exercise leadership by coordinating federal emergency management activities, as described in legislation and policies.
Sources: Department of Public Safety and Emergency Preparedness Act; Emergency Management Act, sections 3 and 4; National Security Policy (2004), page 22
We expected that Public Safety Canada would coordinate federal emergency management activities with those of the provinces and territories to provide timely and coordinated support to communities in an emergency.
Sources: Emergency Management Act, section 4.1(f); National Security Policy (2004), page 25
We expected that Public Safety Canada would regularly test and exercise federal emergency management plans.
Sources: Emergency Management Act, section 4.1(a); National Security Policy (2004), page 27; National Security in Canada: Report of the Standing Committee on Public Accounts, June 2005, page 10
We expected that Public Safety Canada would have a risk-based plan to lead and coordinate critical infrastructure protection efforts, and to reduce vulnerability to cyber attacks and accidents, by
adopting an all-hazards approach
agreeing upon roles and responsibilities for the federal government and others
determining what critical infrastructure should be protected
assessing the threats and risks to these assets
prioritizing risks and resources to protect critical infrastructure
implementing protective programs
developing measures to monitor and assess effectiveness
Sources: National Security Policy (2004), page 26; Public Safety Canada, Securing an Open Society: One Year Later (2005), page 23; Emergency Management Act, sections 3 and 4
We expected that Public Safety Canada and selected federal entities would use a risk-based approach to identify the resources needed and to coordinate the response to and recovery from emergencies.
Sources: Emergency Management Act, sections 4 and 6; Treasury Board of Canada Secretariat, Management Accountability Framework, Round V—Risk Management, sections 9.1 to 9.4; Treasury Board of Canada Secretariat, Integrated Risk Management Framework
We expected that Public Safety Canada would promote a common approach to emergency management, including the adoption of standards and best practices.
Sources: Emergency Management Act, section 4; National Security Policy (2004), page 26
We expected that Public Safety Canada, together with its federal partners, would provide emergency management training, based on a needs assessment and risk-based plan.
Sources: Emergency Management Act, section 4(1)(n); Government of Canada, The Budget Plan 2001, page 100
Management reviewed and accepted the suitability of the criteria used in the audit.
Period covered by the audit
This audit covers the performance of federal departments and agencies and events taking place since our last audit of this subject reported in April 2005.
Audit work for this chapter was substantially completed on 15 June 2009.
Audit team
Assistant Auditors General: Wendy Loschiuk and Hugh McRoberts
Principal: Gordon Stock
Director: Carol McCalla
Jenna Lindley
Sean MacLennan
Steven Mariani
John-Patrick Moore
Nicolette O’Connor
Bridget O’Grady
Stacey Wowchuk
For information, please contact Communications at 613-995-3708 or 1-888-761-5953 (toll-free).
Appendix—List of recommendations
The following is a list of recommendations found in Chapter 7. The number in front of the recommendation indicates the paragraph where it appears in the chapter. The numbers in parentheses indicate the paragraphs where the topic is discussed.
Recommendation
Response
Establishing policies and programs
7.26 The Privy Council Office and Public Safety Canada should ensure that all components of the Federal Emergency Response Plan are completed and should obtain government approval for the plan. (7.17–7.25)
The Privy Council Office and the Department’s response. Agreed. The Privy Council Office and Public Safety Canada will seek approval for the completed Federal Emergency Response Plan (FERP) at the earliest possible date and the supporting Emergency Support Functions (ESFs) prior to the end of the 2009–10 fiscal year. Public Safety Canada will seek approval of the National Emergency Response System (NERS), an annex to the FERP, which articulates how the FERP supports provincial and territorial emergency response plans, by the end of August 2010. Public Safety Canada will organize information sessions with departmental executive committees to brief departments on the FERP and their associated roles and responsibilities. The FERP and its components will be maintained as an evergreen document.
7.31 As stipulated in the Emergency Management Act, Public Safety Canada should establish policies and programs and provide advice for departments to follow when identifying risks and developing their emergency management plans. (7.27–7.30)
The Department’s response. Agreed. In keeping with the all-hazards approach to emergency management, Public Safety Canada is leading the development of an Emergency Management Planning Framework that will provide departments and agencies with guidance, tools, and best practices for developing emergency management plans. It is also working with federal departments to develop an all-hazards risk assessment framework. Under the Emergency Management Act, it is the responsibility of each minister accountable to Parliament for a government institution to identify the risks that are within or related to his or her area of responsibility.
7.32 As stipulated in the Emergency Management Act, Public Safety Canada should analyze and evaluate the emergency management plans prepared by departments to ensure that they are prepared according to the policies, programs, and advice provided, and it should identify potential gaps or risks to a coordinated emergency management response. (7.27–7.30)
The Department’s response. Agreed. Public Safety Canada is developing the Emergency Management Planning Framework, which will include performance measurements that will allow Public Safety Canada to analyze and evaluate emergency management plans produced by departments and agencies. The Framework will also include self-assessment tools for departments and agencies. Public Safety Canada is currently developing an approach to implement this initiative.
Coordinating federal emergency management
7.44 As stipulated in the Emergency Management Act, Public Safety Canada should ensure that its coordination role for the federal response to an emergency is well-defined and that the operational policies and plans that departments will follow are updated and consistent. (7.33–7.43)
The Department’s response. Agreed. Public Safety Canada will maintain the Federal Emergency Response Plan and its components as an evergreen document. This includes ensuring the development of policies and event-specific plans that outline operational protocols and departmental roles and responsibilities, and reviewing these plans to ensure a coordinated approach as necessary.
Protecting critical infrastructure
7.61 Based on the responsibilities outlined in the Emergency Management Act, Public Safety Canada should provide policies and guidance for departmental sector heads to determine their infrastructure and assess its criticality, based on risk and its significance to the safety and security of Canadians; it should establish policies and programs to prepare plans to protect the infrastructure. (7.48–7.60)
The Department’s response. Agreed. Based on the responsibilities outlined in the Emergency Management Act, Public Safety Canada will provide tools and guidance for sectors to determine their processes, systems, facilities, technologies, networks, assets, and services. Public Safety Canada will also provide tools and guidance for departmental sector heads to assess the infrastructure’s criticality based on risks and its significance to the safety and security of Canadians, and will establish policies and programs to prepare plans for their protection.

Definitions:
Capability gap—The gap between available resources and the desired result, which in this case is a timely and effective response to an emergency.
First responders—The police officers, firefighters, and emergency medical service workers who are the first to respond to an emergency.

Monday, November 2, 2009

The number of refugees gaining asylum in Canada has dropped dramatically under the Conservatives as new figures reveal the impact.

The number of refugees gaining asylum in Canada has dropped dramatically under the Conservatives as new figures reveal the impact of the government's efforts to transform this country's immigration system.
New statistics released by the government show the number of successful claims by refugees living in Canada fell to less than half of what it was when the Conservatives came to office.
The final immigration numbers for 2008 – as well as future projections – come as Citizenship and Immigration Minister Jason Kenney is promising to refocus Canada's refugee system on what the government calls “real victims” rather than migrants seeking to abuse the process.
During the summer, the government imposed visa restrictions on Czechs and Mexicans as part of a broader attempt to block bogus refugee claims filed from within Canada. A spokesman for Mr. Kenney noted Monday that Mexico was the top source of asylum claims in 2008, yet the Immigration and Refugee Board rejected those claims at a rate of 90 per cent.
Spokesman Alykhan Velshi said the department expects that in 2010, Canada will resettle 3,900 refugees from Iraq, 2,900 Karen refugees from Burma and 2,500 Bhutanese refugees.
Critics say the numbers show a lack of compassion and a potential disregard of the government's obligations under the Charter of Rights and Freedoms to provide people with a fair hearing before deciding whether to deport them. They say lower targets will mean thousands of refugee claimants living in Canada will face further delays in hearings, and more will be deported to a very uncertain future.
“If we deport the wrong person because we denied their claims, some of them suffer torture, beatings, occasional death … drastic consequences,” NDP MP Olivia Chow said. She pointed to a report last month that a 24-year-old woman who was murdered in Mexico had made two failed refugee claims in Canada.
Citizenship and Immigration's annual report, released on Friday afternoon, revealed that the number of refugees approved after applying in Canada dropped by 56 per cent from 2005 to 2008.
The report also shows the projected number of refugees who will be accepted from within Canada – known as “inland protected persons” – will remain near the lower 2008 levels both this year and next.
The lower numbers reflect the fallout of a refugee determination system that slowed to a crawl in the first two years after the Conservative defeat of the Paul Martin Liberal government in 2006.
The Conservatives vowed to overhaul the Immigration and Refugee Board – the panel that rules on refugee claims – saying it had become a haven for Liberal political appointees.
But in the move to a new system, many board positions were vacant for months, swelling the case backlog and limiting the number of hearings. There is now an 18-month delay between a refugee claim in Canada and an IRB hearing.
According to the minister's spokesman, the time it took to change the IRB appointments process is the main reason for the drop in the 2008 numbers. But now that almost all of the board positions have been filled, he said, the numbers will climb again in the short term.
However, Mr. Velshi said the report's projections do not take into account the minister's plans for a new system that will weed out “bogus” claims made in Canada more quickly while still respecting the Charter.
“Clearly our system is being abused,” Mr. Velshi said. “[The minister] plans to reform our asylum system to give a faster decision to real asylum claimants.”
Janet Dench, the executive director of the Montreal-based Canadian Council for Refugees, said the report's numbers show a clear change in Canada's approach to refugees.
“Canada is becoming dramatically less welcoming toward refugees,” said Ms. Dench, who takes issue with the government's assertions that it is showing an openness to refugee applications from abroad. “It's a very bleak, bleak picture for refugees and for Canadians that care about refugees.”