
Practice-based collection of quality indicator data for a comprehensive quality assessment programme in Canadian family practices
  1. David Price,
  2. Michelle Howard,
  3. Lisa Dolovich,
  4. Stephanie Laryea,
  5. Linda Hilts,
  6. Angela Barbara
  1. Department of Family Medicine, McMaster University, Hamilton, Canada
  1. Correspondence to Ms Michelle Howard, Department of Family Medicine, McMaster University, 175 Longwood Rd. S., Ste 201A, Hamilton, ON, Canada L8P 0A1; mhoward@mcmaster.ca

Abstract

Introduction Quality improvement in primary care can be facilitated by the ability to measure indicators in practice. This paper reports on the process and impacts of data collection on indicators of a quality assessment tool in seven interprofessional group family practices in Ontario, Canada.

Methods The programme addressed indicators and collected data across multiple domains of practice including clinical quality, physical factors, and patient and staff perceptions. A system audit of the practice, a patient survey, a staff satisfaction survey and chart audits (on hypothyroidism and hyperlipidaemia) were designed to measure selected indicators across the domains. Practices were trained and collected their own data. Practices provided feedback on the process and impacts during a postprogramme workshop and on a survey 1 year later.

Results Four hundred chart audits were completed for each of hyperlipidaemia and hypothyroidism, 319 patient satisfaction surveys were administered in four practices, and the staff satisfaction survey was completed by 77 staff in six practices. Most practices demonstrated indicators of privacy, access and safety. There was more variability in indicators relating to staff professional development and team involvement in meetings. Patient satisfaction with providers was rated highly, whereas some aspects of practice access were rated lower. Practices approached the challenge of participation by engaging multidisciplinary team members and dividing tasks. Most practices reported continued participation in various quality improvement initiatives 1 year later.

Conclusions With a set of indicators, structured processes and training, family practices can find the process of gathering and reviewing their own data useful for quality improvement.

  • Quality indicators
  • primary healthcare
  • patient satisfaction
  • chronic diseases
  • continuous quality improvement
  • family medicine


Introduction

Family practice and primary care over the next 20 years promise to be much more complex than in the preceding 50 years. For example, family medicine is increasingly being delivered by teams of healthcare providers, with family physicians an integral part of those teams, and information technology will assist us in caring for our patients.

It is well known that there is room for improvement in primary care for issues such as preventive screening and chronic disease management, and all levels of healthcare are re-evaluating how they provide care.1 Quality improvement and assessment reports in the literature are much more common in the hospital sector than in primary care.2 Formal quality programmes have been developed in primary care in several countries, including Australia,3,4 New Zealand,5 the UK6–8 and Europe,9 regions that have been leaders in establishing indicators and programmes to accelerate primary care quality improvement. While there has been major investment in reforms to primary care in Canada in the past decade,10,11 Canada currently has no such programmes. Governments and society now expect accountability, effectiveness and sustainability from the healthcare system. In Canada, new models have moved away from solo-physician practice to support the addition of nurses and other health professionals, and preventive care, care of patients with chronic disease and mental illness, and greater access are now expected and specifically remunerated.12

Since 2003, we have developed the ‘Quality in Family Practice’ (Quality) tool and programme, comprising a comprehensive set of quality indicators based on the tools developed and validated in the UK,6–8 Australia,4,13 New Zealand14 and Europe15 and modified for the Canadian practice setting. We designed and pilot-tested a programme for family practices to implement continuous quality improvement using the tool,16 and created and implemented practice-based data-collection instruments and methods for indicator measurement. The objective of this study was to explore the ability of family practice teams in Canada (Family Health Teams: http://www.health.gov.on.ca/transformation/fht/fht_mn.html) to collect data on various aspects of their practice for the measurement of quality indicators and to examine the perceived impact of the programme.

Methods

Programme

The programme is described in detail elsewhere (http://www.qualityinfamilypractice.com). Briefly, representatives of practices participated in a start-up workshop to introduce and orient them to the programme. Practices were linked to an external advisor for support, and the advisor conducted a preassessment visit and completed a report. Advisors were trained primary healthcare professionals with leadership experience. Practices undertook their own self-assessment at the end of the programme, and a team of external assessors reviewed the self-assessment and visited the practice to give feedback. The assessment team comprised trained interdisciplinary healthcare professionals and a lay member.

Assessment tool

The tool covers five key areas: (A) factors affecting patients; (B) physical factors affecting the practice; (C) clinical practice systems; (D) practice and patient information management; and (E) continuous quality improvement (CQI), continuous professional development (CPD) and quality of work life. The entire tool was available for this field test, but we limited the indicators for measurement due to the time frame of this project. We used a structured process to choose a subset of indicators from all areas of the tool for measurement. Two authors (MH, LD) created a scheme to categorise every criterion in the tool and independently rated criteria in terms of expected area of impact in the practice, likelihood of change in a short time, type of data collection required and difficulty of implementation, while including indicators from all five sections of the tool. The two raters resolved discrepancies by consensus. The tool was thus narrowed to 32 mandatory indicators for measurement by selecting criteria that were measurable and feasible to implement, and would be expected to have an impact in a short time frame (table 1).

Table 1

Indicators (non-clinical) addressed by the practices, and the data-collection method used
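As an illustration of the shortlisting step described above, the sketch below shows one way two raters' criterion-level judgements could be combined, flagging disagreements for consensus discussion and keeping only criteria rated measurable, feasible and likely to show short-term impact. The field names and the simple yes/no rating scale are hypothetical; the project's actual categorisation scheme was richer than this.

```python
# Hypothetical sketch of the two-rater indicator shortlisting described above.
# Field names and the boolean rating scale are illustrative assumptions, not
# the programme's actual scheme.
from dataclasses import dataclass

@dataclass
class CriterionRating:
    criterion_id: str
    section: str             # tool section 'A' to 'E'
    measurable: bool
    feasible: bool
    short_term_impact: bool

def shortlist(rater1, rater2):
    """Return (selected criterion ids, ids needing consensus discussion)."""
    by_id = {r.criterion_id: r for r in rater2}
    selected, discuss = [], []
    for r1 in rater1:
        r2 = by_id[r1.criterion_id]
        same = (r1.measurable, r1.feasible, r1.short_term_impact) == \
               (r2.measurable, r2.feasible, r2.short_term_impact)
        if not same:
            discuss.append(r1.criterion_id)       # resolve by consensus
        elif r1.measurable and r1.feasible and r1.short_term_impact:
            selected.append(r1.criterion_id)      # keep for measurement
    return selected, discuss
```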

For several indicators in Sections A, B, D and E, a practice system audit form was created. The form was accompanied by a guide with web links to relevant policies and procedures. The system audit was pilot-tested by staff in an academic group family practice affiliated with McMaster University, and the practice manager carefully reviewed the audit tool and suggested changes to improve clarity and completeness. Some indicators were also evaluated by a patient satisfaction survey designed by the project team and distributed by practices to patients in the waiting room, with a target of 100 patients. The patient survey was adapted from published instruments; most questions were taken from the Physician Achievement Review programme of Alberta, Canada (http://www.par-program.org/), the Improving Practice Questionnaire, and the General Practice Assessment Questionnaire from the UK Client Focused Evaluation Program, the Building Healthier Organisations Program (http://www.bhogroup.com/tools.asp) and the Practice Enhancement Program of Saskatchewan, Canada (http://www.pepsask.ca/). To evaluate the quality of the work environment, a Quality of Work Life Survey17 was administered to staff. Anonymous staff and patient surveys were collected at the practices and returned to us by post.

Two indicators were chosen from Section C: management of hypothyroidism and of hyperlipidaemia (within the coronary heart disease management criteria). These were chosen as common conditions with known targets for management. Clinical chart audits were conducted by the practices. Chart audit forms and an instructional guide and training package were developed by the project team. Practices were asked to audit the charts of 50 patients with hypothyroidism and 50 with hyperlipidaemia, and were taught to use a random number table for chart selection. Each practice identified one or two individuals, usually a nurse or nurse practitioner, and the practice manager to conduct the chart audits. The chart audit for hypothyroidism captured age, gender, dates and readings of thyroid-stimulating hormone (TSH) in the past year and medications used. The audit of hyperlipidaemia patients included age, gender, dates and readings of lipid panels in the past year, blood pressure readings in the past year, and documentation of coronary artery disease, cerebrovascular disease, peripheral arterial disease, diabetes, chronic kidney disease, smoking status and the mention of diet and exercise advice in the past year.
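As a minimal illustration of the chart-selection step, the sketch below draws a random sample of 50 charts from a condition registry, the programmatic equivalent of the random number table the practices were taught to use. The registry contents, identifier format and seed are illustrative assumptions rather than the project's materials.

```python
# Minimal sketch of random chart selection for the audit; the practices used a
# random number table, and this shows the equivalent step done in code.
# The registry contents and chart ID format are made up for illustration.
import random

def select_charts(eligible_patient_ids, n=50, seed=None):
    """Randomly sample up to n charts (without replacement) for audit."""
    rng = random.Random(seed)
    if len(eligible_patient_ids) <= n:
        return list(eligible_patient_ids)   # small registries: audit every chart
    return rng.sample(eligible_patient_ids, n)

# Illustrative use with a made-up hypothyroidism registry of 300 chart numbers
hypothyroid_registry = ["HT-%04d" % i for i in range(1, 301)]
audit_sample = select_charts(hypothyroid_registry, n=50, seed=1)
print(len(audit_sample))   # 50
```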

Participants

Seven interprofessional group family practices participated from June 2007 to March 2008. Practices were recruited from a workshop in March 2007 given by the project team, as well as through targeted recruitment of practices in specific geographic areas. The number of physicians in the group practices ranged from 3 to 16 (mean=8). Five practices had electronic medical records, and two had paper charts. Six operated out of one location, and one had numerous sites, of which four participated. The number of non-physician health professionals ranged from 3 to 19. Practice sizes ranged from 5000 to 26 000 patients.

Procedures

Data collection took place from September 2007 to February 2008. Completed data-collection forms and surveys were forwarded to the project team office for analysis.

Feedback from the practices was obtained from a wrap-up workshop held at the end of the programme in which all practices were represented by key individuals who implemented the programme. Debriefing by practices was recorded and transcribed. A survey was also sent to the manager and lead physician at each practice 1 year after the project, asking about continuing quality initiatives since the programme, and the perception of value of the programme (not at all, somewhat, extremely valuable).

The project was approved by the Hamilton Health Sciences/McMaster University Faculty of Health Sciences Research Ethics Board.

Results

There were 10 system audits completed because one group practice had four participating offices, requiring separate audits. Across all practices, 400 chart audits were completed for each of hyperlipidaemia and hypothyroidism patients. Patient satisfaction surveys were administered to a convenience sample of 319 patients in four practices (not all practices administered this survey). The Quality of Work Life Survey was completed by 77 staff in six practices. One practice chose to distribute it to nursing staff only. Since the project team did not administer this survey or have information on the sampling frame or response rate to assist with interpretation, these results have not been presented here.

Table 2 shows a sample of results of the system audit of practices. Most practices demonstrated indicators of privacy, access and safety. There was more variability in indicators relating to staff professional development and team involvement in meetings. For example, all practices reported that patients were informed about how to access care after-hours, and all reported password protection on computers that access health records. Half or fewer of the practices reported regular meetings of the entire team or having a record of CPD.

Table 2

Selected results of the system audits of the seven practices

Figures 1 and 2 show selected results of the chart audits. The proportion of hypothyroidism patients across the seven practices who were taking thyroid replacement medication and whose most recent TSH reading was >5.0 mU/l ranged from 8% to 26%. The proportion of patients with hyperlipidaemia whose most recent LDL-C and total cholesterol:HDL-C ratio in the past year were within range for their Framingham risk level ranged from 20% to 79%.

Figure 1

Proportion of hypothyroidism patients across the seven practices who were taking thyroid replacement medication and had a thyroid-stimulating hormone (TSH) reading of >5.0 mU/l.

Figure 2

Proportion of patients with hyperlipidaemia whose most recent LDL-C and total cholesterol:HDL-C ratio in the past year were within range for their Framingham risk level.
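For readers who want to reproduce this kind of indicator from their own audit data, the sketch below computes the hypothyroidism proportion shown in figure 1. The 5.0 mU/l threshold comes from the text; the record fields are hypothetical and this is not the project's actual analysis code.

```python
# Sketch of the per-practice indicator behind figure 1: among audited patients
# on thyroid replacement, the share whose most recent TSH exceeds 5.0 mU/l.
# Record field names are illustrative assumptions.

def tsh_above_target_proportion(audit_records):
    on_treatment = [r for r in audit_records if r["on_thyroid_replacement"]]
    if not on_treatment:
        return None                      # no eligible patients in the sample
    above = sum(1 for r in on_treatment if r["latest_tsh_mu_per_l"] > 5.0)
    return above / len(on_treatment)

# Illustrative records for one practice
records = [
    {"on_thyroid_replacement": True,  "latest_tsh_mu_per_l": 2.1},
    {"on_thyroid_replacement": True,  "latest_tsh_mu_per_l": 6.4},
    {"on_thyroid_replacement": False, "latest_tsh_mu_per_l": 3.0},
]
print(tsh_above_target_proportion(records))   # 0.5
```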

Table 3 shows selected results of the patient satisfaction survey across the four practices that administered the survey. The mean age of respondents ranged from 44.2 to 52.6 years, with 63.4% to 82.9% being female. Fewer than 60% of patients agreed with the statement about adequate parking. Access to the building and timely appointments were also issues that showed variability. Satisfaction with providers was generally consistently high.

Table 3

Proportion of patients reporting ‘agree’ or ‘strongly agree’ with statements in the patient satisfaction survey questions related to indicators addressed in the field test

The main challenges identified in the wrap-up workshop were that allocating time to the programme was difficult, the full tool was large and daunting, it was challenging to decide where to start, and practices were initially concerned about their ability to do chart audits. The main strategy for dealing with these challenges was the formation of a subteam within the practice, ideally with representation from all aspects of the practice (ie, administrative staff, physicians, non-physician healthcare providers), and allocation of parts of the tool to the most appropriate person. The successes noted were that, in many practices, team meetings involving staff from all roles began to happen for the first time; staff who had previously not perceived opportunities to contribute to practice decisions were able to take ownership of areas and have their voices heard; and areas for improvement in the electronic medical record became apparent. The usefulness of the data-collection activities was described in relation to confirming ‘hunches’ about clinical indicators and forcing policies and procedures to be written down for consistency. One practice noted that the patient satisfaction data provided quantitative confirmation of some of the issues that the group was aware of anecdotally, and provided support for moving forward with changes. All of the practices reported 1 year later that they continued to engage in quality improvement in the practice. Two practices rated the experience somewhat valuable, and five rated it extremely valuable.

Discussion

This programme demonstrated that volunteering family practices in Canada can mobilise to engage in a quality-improvement initiative, and can collect systematic data on indicators to measure their performance. Within 6 months, seven large family practices conducted chart audits on two clinical topics and surveyed their policies and procedures for many indicators, and most practices administered over 50 patient surveys in the waiting room.

Between practices, there was considerable variation in the control of hypothyroidism and hyperlipidaemia. There was less variation between practices on the system audit that examined policies and procedures. The system audit results suggested that practices, not surprisingly, address the most crucial aspects of the environment and care, such as after-hours availability and safety of medications and vaccines. This is similar to the results of the European Practice Assessment instrument in 273 practices, where procedures and practices for infection control were present in over 85% of practices, and continuous professional development and team meetings were reported by approximately half of practices.15

The multiple domains of the system audit and patient survey provide the opportunity for the practice to reflect on additional areas that go beyond clinical care issues. For example, there was high satisfaction on most questions of the patient survey, but the comparison across practices revealed very low satisfaction with availability of parking in one practice, and another practice had considerably lower ratings on patients' perceptions of being able to obtain an appointment in a reasonable time.

The programme was not designed to be a normative assessment against specified standards. The systematic collection of data on indicators assists the practice in understanding where there are areas for improvement and areas where the practice is already strong. For example, the data that arose from patient surveys provided concrete rather than anecdotal evidence of the need to make changes in the practice, and this type of evidence can be used as a catalyst for change.

The practices in this study were volunteers and likely higher performers than average, and thus we may not have obtained a range of the true values of indicators in the population. It is possible that some indicators that are not sensitive to differences between practices would not be revealed in this small group of practices. Further refinement of the tool for relevance and sensitivity has been under way in a separate project since 2008.

There were several limitations in this study that may have affected internal and external validity. Although most practices had an electronic medical record, audits were conducted on paper using forms provided, and results were analysed by the project team. This was done to standardise the data across different electronic and paper-based records systems in the practices in order to provide reasonable estimates of the indicators. This method may not be sustainable for busy practices in the long run. In addition, we did not verify the accuracy of the data collected by the practices.

The programme at this stage has no built-in mechanism for automatic data collection and analysis; however, other programmes in the US and Australia provide examples of national practice-based data collection on paper.18,19 Ideally, infrastructure would be created in primary healthcare systems to enable capture of data on clinical care processes and outcomes without overburdening practices. One example of organised quality improvement in Canada is a recent initiative in the province of Ontario through the Quality Improvement and Innovation Partnership funded by the provincial Ministry of Health, in which all Family Health Teams (approximately 150) are participating in learning collaboratives to institute quality improvement in diabetes care, colorectal cancer screening, and office efficiency and access. This model has provided infrastructure that may assist with ongoing quality assessment (eg, a practice facilitator, data-collection protocols, training in PDSA cycles).

We cannot generalise these findings to practices that do not volunteer for such initiatives and may be ‘lower performing’. However, the programme is intended to be a viable option for hesitant or lower-performing practices, since it is entirely self-directed but supported by an experienced health professional advisor.

Conclusions

This programme has demonstrated that with structured processes and training provided, family practices can gather their own data for quality measurement and improvement. The process was initially daunting but was felt to lead to rewards of improved practice reflection and teamwork.

References

Footnotes

  • Funding Ontario Ministry of Health and Long Term Care, Toronto, Canada.

  • Competing interests None.

  • Ethics approval Ethics approval was provided by the Hamilton Health Sciences/Faculty of Health Sciences Research Ethics Board, Hamilton Canada.

  • Provenance and peer review Not commissioned; externally peer reviewed.