

NOTES FROM THE FIELD

Cancer Care Ontario and integrated cancer programs
Portrait of a performance management system and lessons learned

Siu Mee Cheng
Cancer Care Ontario, Toronto, Canada, and
Leslee J. Thompson
Medtronic of Canada Ltd, Mississauga, Canada

Abstract
Purpose – A performance management system has been implemented by Cancer Care Ontario (CCO).
This system allows for the monitoring and management of 11 integrated cancer programs (ICPs)
across the Province of Ontario. The system comprises four elements: reporting frequency, reporting requirements, review meetings, and accountability and continuous improvement activities. CCO and
the ICPs have recently completed quarterly performance review exercises for the last two quarters of
the fiscal year 2004-2005. The purpose of this paper is to address some of the key lessons learned.
Design/methodology/approach – The paper provides an outline of the CCO performance
management system.
Findings – These lessons included: data must be valid and reliable; performance management
requires commitments from both parties in the performance review exercises; streamlining
performance reporting is beneficial; technology infrastructure which allows for cohesive management
of data is vital for a sustainable performance management system; performance indicators need to
stand up to scrutiny by both parties; and providing comparative data across the province is valuable.
Critical success factors which would help to ensure a successful performance management system
include: corporate engagement from various parts of an organization in the review exercises; desire to
focus on performance improvement and avoidance of blaming; and strong data management systems.
Practical implications – The performance management system is a practical and sustainable
system that allows for performance improvement of cancer care services. It can be a vital tool to
enhance accountability within the health care system.
Originality/value – The paper demonstrates that the performance management system supports
accountability in the cancer care system for Ontario, and reflects the principles of the provincial government's commitment to continuous improvement of healthcare.
Keywords Performance management, Performance management systems, Cancer,
Continuous improvement, Performance monitoring, Canada
Paper type Case study

Journal of Health, Organization and Management, Vol. 20 No. 4, 2006, pp. 335-343. © Emerald Group Publishing Limited, 1477-7266. DOI 10.1108/14777260610680131

Introduction
Following the Cancer System Integration Committee Report, Cancer Care Ontario (CCO), a provincial agency, assumed a new role and identity in the cancer care health system. It no longer provided direct patient care, but instead assumed a leadership and strategic advisory role to the Ministry of Health and Long-Term Care on cancer care issues. Additionally, CCO divested its direct patient care services to 11 host hospitals, and engaged in contractual relationships with these hospitals (Durham, Grand River, Hamilton, Kingston, London, Northeastern Ontario, Ottawa, Peel, Toronto-Sunnybrook, Thunder Bay, and Windsor) to provide certain patient care services, radiation therapy services and systemic therapy services, through the creation of an integrated cancer program (ICP) at each of the hospitals. These ICPs are the cancer centres operated by CCO prior to the divestment, which are now housed within the host hospitals. A critical feature of the agreement is the protection of funds directed to the ICPs. Funds flowed for the operation of the ICPs may not be used by the host hospital, nor can they be absorbed into the hospital's global budget (Sullivan et al., 2003, 2004; Thompson and Martin, 2004). As a result, accountability for ICP success is implicit.
Although at first glance it may appear that CCO is a flow-through medium for funds coming from the Ministry of Health and Long-Term Care, in fact CCO plays a more critical role. As a leader in the cancer care health system in Ontario, it is
committed to ensuring accountability to its many stakeholders, quality improvement
and advancing patient cancer care. This commitment has translated itself to the
development and adoption of a comprehensive performance management system. This
system has enabled CCO and the ICPs to better manage services provided through the
existing service agreements, and has created opportunities to tie government funding
for health services to service delivery and quality.
CCO's performance management system comprises four elements, which together create a system that focuses on performance improvement and avoids blame-laying:
(1) reporting frequency;
(2) reporting requirements;
(3) (joint) performance review meetings; and
(4) continuous performance improvement.

These four elements allow the performance management system to serve a purpose
beyond monitoring performance measure results. The elements create a framework
where relevant information is fed back to the appropriate stakeholders to facilitate
decision and control processes, creating a “closed-loop deployment” (Pun and White,
2005).

Elements of the performance management system


Reporting frequency
CCO has created a performance management reporting system that requires each
hospital’s ICP to report on performance results against various performance domains
and performance indicators in every quarter of the fiscal year. Quarterly reporting is tied to the need to manage performance risk across the system, based on timely and accurate data. Each quarter's reporting emphasis is different, although every quarter reports consistently on some key performance domains.
Quarter 1.
. Concerns about performance are not as critical at this early stage in the fiscal year. The intent of this quarter is to identify any initial strategic challenges that may impact performance results for the remainder of the fiscal year.
. ICPs are required to report primarily on variances against key indicators, and improvement plans to correct performance in the remainder of the fiscal year.

Quarter 2 – interim.
. Emphasis on performance is greater in this quarter. There is interest in identifying year-to-date performance results and year-end forecasts that may provide early indications that negotiated targets may not be met by year-end. Where there is such an indication, in-year adjustments are made to individual ICP targets. Funding is reallocated elsewhere in the system, to ensure that provincial service targets will be met by year-end.
. In addition to a focus on the key indicators, this quarter requires that each Regional Vice President (RVP) report on Regional Cancer Program (RCP) development status. Under the agreements between CCO and the hospitals, RVPs are designated to the ICPs, and have dual accountability to both the hospital CEOs and to CCO. In addition to managing the ICPs, the RVPs are also held accountable for building regional cancer care capacity. RVPs need to ensure that the region to which they are designated is able to ensure access to a full spectrum of cancer care services, via development of a mature RCP. The intent of the RCP is to ensure the accessibility of a full spectrum of cancer services within a defined geographic region (i.e. the Local Health Integration Networks recently defined by the Government of Ontario), and not necessarily the actual direct delivery of those services.
. In addition to reporting on progress against each region's RCP, RVPs must report on progress against the Cancer 2020 Plan targets for their region. This plan outlines cancer prevention and early detection targets that the province is to achieve by 2020.
. Preliminary business planning information is also obtained to aid in planning for appropriate service target levels for the next fiscal year.

Quarter 3.
(1) This quarter's foci are:
. performance against the key indicators, and making any adjustments to service targets in order to manage system performance; and
. ensuring that all on-going and annual requirements as outlined in the agreements between CCO and the ICPs are being met by both parties.
(2) Further refinements to early projections on possible performance targets for the next fiscal year are also made in this quarter.

Quarter 4 – year-end.
. The emphasis in this final quarter is similar to that seen in Quarter 2.
. This quarter also looks at confirming final volume targets for the next fiscal year.

Reporting requirements
The following further details the performance reporting domains that the ICPs are required to report on, depending on the fiscal year quarter. Standard reporting templates have been created which allow for consistent performance reporting by the ICPs for each quarter. These templates are pre-populated with data that are provided by the ICPs on a monthly basis to CCO through various IT reporting systems. Reports are pre-populated with cumulative quarter-ending data for key performance measures. These measures reflect two categories of measures: performance measures, which are tied to contractual agreements. These measures consist of a mix of input, output and efficiency measures: service volumes, operating expenses, and wait times for cancer surgery, and radiation and systemic therapy services for four key specialty areas.
ICPs are expected to provide an analysis of variances from negotiated performance indicator targets, and to provide improvement plans to address variances where required, with accompanying implementation timelines.
These pre-populated reports require reporting in the following performance domains:
. Radiation therapy services: service volumes and quality issues.
. Systemic therapy services: service volumes and quality issues.
. Cancer surgery services: service volumes and quality issues.
. Ontario breast screening program: service volumes.
. Financial performance.
. Data reporting performance.
. RCP development: status against required milestones to implement a mature program, and status against initiatives that advance performance in various areas (radiation therapy, systemic therapy, cancer surgery, Ontario breast screening, and end-of-life care services).

Joint performance review meetings
Following submission of completed reports by the ICPs, which outline variance analyses and improvement plans to address performance variances, representatives from each ICP meet with CCO to discuss in greater depth the issues raised by the performance reports. These joint review sessions are formally structured, with a designated Performance Review Panel assembled by CCO, comprising provincial representatives with specialization in each of the performance domains monitored within the performance management system. Consequently, the panel is intended to provide both a multi-disciplinary and an interdisciplinary approach to performance management. Panel members are provided with all reports prior to the joint review sessions with the ICPs to allow for pre-analysis and discussion of key performance issues, outliers (excellent and poor performers), and common performance management themes that may require support from a systems level.
During each joint review session, ICPs are given an opportunity to present their performance results for each quarter, and to express their concerns. In turn, the panel will request greater detail on reported data and improvement plans from the ICPs. These sessions also act as a forum for ICPs to request specific or general support from CCO towards addressing performance problems in order to improve future results. This is critical, particularly where the source for addressing a performance problem may be beyond the sphere of influence of an ICP. The panel, in turn, may request ICPs to commit to various improvement actions or strategies in order to address areas of concern.
Continuous performance improvement
The first three elements of the performance management system primarily focus on performance reporting. The continuous improvement element involves on-going follow-up of improvement plans provided by ICPs in their reports, and of performance commitments arising from the quarterly review sessions. Each quarter, ICPs are requested to provide a status update on the improvement plans committed to in the previous quarter.
On-going management of performance is tied to a four-step escalation process, as outlined in the cancer services agreements between CCO and the ICPs. The escalation process begins with ongoing monitoring of problems, and culminates, when no improvement to a problem is observed over a given time period, in the reallocation of funds to other regions of the province.
This last component of CCO's performance management system ensures that the system is not only focused on performance reporting, but also on actual performance improvement, which is an explicit and integral part of the system's intent. This is pivotal in ensuring that the system remains sustainable and relevant to the cancer system. As articulated by Brown et al. (2005) and KPMG's Assurance and Advisory Services (2001), performance reporting and measurement need to be accompanied by actions to improve performance and achieve desired results. Although the continuous performance improvement element of the system may require additional effort and resources from CCO, it allows CCO and the cancer system to be proactive, rather than reactive, in managing the cancer system.

Lessons learned
CCO and the ICPs have completed five quarterly review exercises under the umbrella
of the performance management system. The system has been successful as evidenced
by the evolving quality of discussions that have occurred in the joint review sessions
and the quality of information reported back from ICPs to CCO. Initially, the quarterly
review exercises tended to focus on the structures and components of the performance
management system (i.e. quality of templates, quality of data, quality of performance
indicators); however, this has evolved. The focus of both parties, ICPs and CCO, has shifted primarily towards real performance improvement efforts (i.e. systems supports to help reduce wait times). Along this evolution, CCO has learned invaluable
lessons that could have relevance and application to other organizations similarly
committed to, and engaged in, organizational and systems performance. This
performance management system is of particular relevance to those organizations
engaged in similar contractual relationships and systems responsibilities and
accountabilities.
The following is a list of some critical (early) lessons learned:
. Ensure that data are valid and reliable. When data quality is suspect, much of the focus is spent on data quality, rather than on performance improvement. However, this does not negate the need for reporting, even if poor quality data are all that is available. CCO has observed in its own experience that reporting, and holding those reporting the data accountable for data quality, can over time improve data quality. Nevertheless, caution is required when forming judgments on performance based on poor quality data.
. Performance management is a two-way street for CCO. While CCO holds individual ICPs and RVPs accountable for various performance issues, it is also accountable for ensuring that there are adequate and appropriate systems supports available to the ICPs and RVPs in order to achieve performance goals.
. Streamlining all performance reporting has created immeasurable benefits for both CCO and the ICPs. The combined data and information provide a comprehensive picture of each ICP's performance and have enabled better planning to improve performance, as well as revealing gaps in performance for further investigation. From the CCO perspective, streamlined performance reporting has enabled CCO to address performance from a systems perspective and to assess performance issues more strategically, both by geography and by service (i.e. the impact of multi-disciplinary approaches to care on radiation and systemic therapy service demands). Moreover, the deliberate effort of streamlining all performance reporting into one system has aided in optimising efficiency and reducing duplication of effort within CCO and the ICPs.
. A performance reporting system based on many data sources, with varying data quality checks and standards across those sources, can impede a successful management system. Infrastructure that provides cohesive and streamlined data management is essential to the sustainability of a comprehensive performance management system. Continuous performance improvement is heavily reliant on data that reflect performance at various time periods, in order to determine performance trends. Consequently, data sets may become very large and unwieldy unless the IT infrastructure allows for easy management, analysis and use.
. Ensure that there is agreement on the performance indicators that will be monitored and reported for performance management purposes. These indicators should reflect the SMART principles (specific, measurable, achievable, relevant, time-bound), with a particular emphasis on validity. Where indicators are perceived as unaligned with relevant systems goals and objectives, performance review exercises can be uninspiring and lacking in real improvements in problem areas. Kennerley and Neely (2003) have indicated that measures should reflect current realities, and where they do not, should be discarded.
. Providing comparative data which outline the performance of individual organizations against their peers can be beneficial and valuable. Although the literature reveals ongoing debate over the benefits of public performance reporting, in CCO's performance management experience the provision of comparative data offers individual organizations an opportunity to assess their performance against their peers and accepted systems benchmarks. This has acted in part as an incentive for relative performance improvements (Brown et al., 2005; Sullivan et al., 2005; Werner and Asch, 2005; Galvin and McGlynn, 2003). Additionally, the comparative information provides a means to identify performers who can participate in knowledge exchange exercises (i.e. assembling poor and best performers to share best practices and lessons learned).
Critical success factors
The lessons learned have led to the realization that there are some critical success factors that must be an integral component of any performance management system:
. Quality of data is critical in ensuring a successful performance management system. Where the quality of data is called into question, the integrity of the system is also questioned, and the effectiveness of the system in improving organizational and system performance diminishes as support and enthusiasm wane.
. Timely reporting and data management, based on a supportive information technology system that allows for cohesion of data sources, data manipulation, data reporting and easy access, is critical to a performance management system.
. Organizational engagement is a critical factor. Where the performance management system addresses various performance domains, corporate representatives for those domains must be fully engaged and participating in the performance management exercises (i.e. the CFO for financial performance, the CIO for data reporting performance, the VP of clinical services for clinical care performance, etc.).
. Participants need to engage in the review exercises with a performance improvement perspective. This is key to ensuring that the system is built on a foundation of performance improvement, and not blame-laying. The CCO system's early successes have been due to a real desire to improve future performance, which aids in reducing antagonism and animosity amongst the relevant players. This reduces the likelihood that the system will stagnate and become irrelevant (Neely, 2003).

Limitations of the CCO performance management system
The current CCO performance management system, while comprehensive and encompassing in many ways, does possess some limitations:
. The system does not manage a set of measures which adequately reflects overall systems performance. Current performance measures do not provide a satisfactory picture of ICP performance in its entirety, particularly in the context of systems and contractual goals and objectives. In particular, the measures monitored under the system are process-oriented, primarily output measures and wait time measures. As the performance management system continues to evolve, the adoption of outcome-oriented measures will be integral. Additionally, a more balanced suite of performance measures is lacking in the current CCO system. Although there is breadth in the performance domains currently being managed under the system, it may not be adequately comprehensive. The current performance domains are not entirely reflective of those discussed in the wealth of literature around balanced performance measurement (Kaplan and Norton, 1992; KPMG's Assurance and Advisory Services, 2001; Dalton, 2002; Brown et al., 2005). Addressing this limitation will require much effort and commitment from both CCO and the ICPs, due to the implications of agreeing to manage against a balanced set of performance measures, and may change various aspects of the relationship between the two parties.
. Compounding the issue of appropriate and relevant performance measures is the lack of accepted targets for some of the performance measures currently under the performance management system. CCO has yet to tie quantitative targets to clinical care quality-related performance measures. As a result, demonstrating incremental improvements in the system on an annual basis is difficult.
. Data issues, as discussed above, are still on-going. The balance between timely and accurate data is an on-going struggle.
. The CCO performance management system has yet to mature into a system that rewards better than expected performance results; its current focus is managing performance risks. Those ICPs that have performed better than their negotiated targets are not materially rewarded. CCO does reward them indirectly, by acknowledging the excellent performers as gold standards within the system and by learning from their critical success factors during knowledge exchange exercises.

Future focus for improvements
CCO has identified areas for immediate improvement:
. The development of a cohesive and mature IT infrastructure that will support more sophisticated and complex data management for performance management purposes.
. Data quality improvement will be an on-going focus in the system, for both CCO and the ICPs. A data quality framework will be developed and implemented which encourages data quality at all critical data management points.
. CCO and the ICPs will develop a suite of relevant measures, and tie targets to these measures.

Conclusion
CCO, along with its partners, will continue to seek to improve and further strengthen system performance in cancer care, both in service delivery and in quality of services delivered, as a means to improving cancer health outcomes among patients. One performance improvement strategy is the performance
management system, which has enabled CCO to assume a more proactive role in
health care system management beyond that of a flow-through funding agency of the
government. The system allows for a “no-surprise” approach in managing system
performance, and enables CCO to assume a stewardship role in the area of cancer
service delivery and quality. This feature further emphasizes the uniqueness of the
cancer care system in Ontario. In an era where accountability of public services is at
the forefront of the health care system landscape, the need to demonstrate and ensure
accountability is essential. The performance management system supports
accountability in the cancer care system for Ontario, and reflects the principles of
the provincial government’s commitment to continuous improvement of healthcare
(Ministry of Finance, 2005).
References
Brown, A.D., Bhimani, H. and MacLeod, H. (2005), "Making performance reports work", Healthcare Papers, Vol. 6 No. 2, pp. 8-22.
Dalton, J. (2002), "Strategic score-keeping", Association Management, Vol. 54 No. 6, pp. 53-7.
Galvin, R.S. and McGlynn, E.A. (2003), "Using performance measurement to drive improvement: a roadmap for change", Medical Care, Vol. 41 No. 1, pp. I-48-I-60.
Kaplan, R.S. and Norton, D.P. (1992), "The balanced scorecard – measures that drive performance", Harvard Business Review, Vol. 70 No. 1, pp. 71-9.
Kennerley, M. and Neely, A. (2003), "Measuring performance in a changing business environment", International Journal of Operations & Production Management, Vol. 23 No. 2, pp. 213-29.
KPMG's Assurance and Advisory Services Center (2001), "Achieving measurable performance improvement in a changing world: the search for new insights", KPMG LLP, available at: www.kpmg.com
Ministry of Finance (2005), "2005 Ontario budget: investing in people, strengthening our economy, budget speech", prepared by G. Sorbara, Queen's Printer for Ontario, Toronto, available at: www.ontariobudget.fin.gov.on.ca
Neely, A. (2003), "Performance measurement", New Straits Times, 30 August, p. 1.
Pun, K.F. and White, A.S. (2005), "A performance measurement paradigm for integrating strategy formulation: a review of systems and frameworks", International Journal of Management Reviews, Vol. 7 No. 1, pp. 49-71.
Sullivan, T.S., Dobrow, M., Thompson, L.J. and Hudson, A. (2004), "Reconstructing cancer services in Ontario", Healthcare Papers, Vol. 5 No. 1, pp. 69-80.
Sullivan, T.S., Evans, W., Angus, H. and Hudson, A. (Eds) (2003), Strengthening the Quality of Cancer Services in Ontario, Canadian Healthcare Association Press, Ottawa.
Sullivan, T.S., Greenburg, A., Sawka, C. and Hudson, A. (2005), "A just measure of patience: managing access to cancer services after Chaoulli", in Flood, C.M., Roach, K. and Sossin, L. (Eds), The Legal Debate Over Private Health Insurance in Canada, University of Toronto Press, Toronto, available at: www.cancercare.on.ca/docuemnts/AjustMeasureofPatience.pdf
Thompson, L.J. and Martin, M.T. (2004), "Integration of cancer services in Ontario: the story of getting it done", Healthcare Quarterly, Vol. 7 No. 3, pp. 42-8.
Werner, R. and Asch, D.A. (2005), "The unintended consequences of publicly reporting quality information", JAMA, Vol. 293 No. 10, pp. 1239-44.

Corresponding author
Siu Mee Cheng can be contacted at: [email protected]
