Objective
To assess the change in level of diabetes quality management in primary care groups and outpatient clinics after feedback and tailored support.
Research Design and Methods
This before-and-after study with a 1-year follow-up surveyed quality managers on six domains of quality management. Questionnaires measured organization of care, multidisciplinary teamwork, patient centeredness, performance results, quality improvement policy, and management strategies (score range 0–100%). Based on the scores, responders received feedback and a benchmark and were granted access to a toolbox of quality improvement instruments. If requested, additional support in improving quality management was available, consisting of an elucidating phone call or a visit from an experienced consultant. After 1 year, the level of quality management was measured again.
Results
Of the 60 initially participating care groups, 51 completed the study. The total quality management score improved from 59.8% (95% CI 57.0–62.6%) to 65.1% (62.8–67.5%; P < 0.0001). The same applied to all six domains. The feedback and benchmark improved the total quality management score (P = 0.001). Of the 44 participating outpatient clinics, 28 completed the study. Their total score changed from 65.7% (CI 60.3–71.1%) to 67.3% (CI 62.9–71.7%; P = 0.30). Only the results in the domain multidisciplinary teamwork improved (P = 0.001).
Conclusions
Measuring quality management and providing feedback and a benchmark improves the level of quality management in care groups but not in outpatient clinics. The questionnaires might also be a useful asset for other diabetes care groups, such as Accountable Care Organizations.
Introduction
To improve diabetes care, it might be important to focus on quality management (QM), especially because more health-care providers are involved and organizations are becoming more complex (1). QM comprises procedures to monitor, assess, and enhance the quality of care (2). Most studies on quality improvement strategies focus on performance indicators such as the number of patients having their HbA1c measured (3–5). A study in eight European countries showed high levels on process indicators and lower levels on intermediate outcome indicators (6). Care providers are increasingly obliged to reveal their results on performance indicators and use these results to improve their quality of care. However, it might be important to focus on QM on the organizational level as well because this could facilitate better quality of care.
Care for patients with type 2 diabetes has changed from acute reactive services to regular integrated management in the primary care setting (7). In the Netherlands, where type 2 diabetes prevalence is 5% (8), ∼85% of people with type 2 diabetes are treated by primary care family physicians in practices close to their homes (9). A few years ago, family physicians started working under the umbrella of care groups to improve the coordination of diabetes care in a well-defined region (10,11). These care groups involve 3–250 family physicians (12) and other contracted primary care providers such as dietitians, podiatrists, physiotherapists, and optometrists. They treat between 400 and 22,550 patients with type 2 diabetes. The concept of care groups is comparable to Accountable Care Organizations in the U.S. (13,14). As the main contractor of a bundled payment contract, care groups are fully responsible for the organizational arrangements for contracted diabetes care and its subsequent quality. The QM policy within care groups varies in the way these groups support self-management or provide refresher courses for associated health-care professionals. The growing role of care groups in improving performance indicators increases these groups’ need for good QM as well. Their QM on top of the QM in individual family practices is expected to be associated with better outcomes.
Patients who need more complex care are referred to endocrinologists practicing in diabetes outpatient clinics by their family physician, who acts as a gatekeeper in the Dutch health-care system. In these hospital-based outpatient clinics, endocrinologists hold the final responsibility for a diabetes team, which involves a diabetes nurse and dietitian. Specialists such as ophthalmologists, cardiologists, nephrologists, and a diabetic foot team can be consulted as well. Diabetes outpatient clinics treat between 250 and 4,500 patients with type 2 diabetes; one hospital is affiliated with one or two outpatient clinics (15). Both during and after office hours, patients with acute diabetes-related problems (mostly regarding insulin treatment) can immediately consult a family physician. Patients treated in outpatient clinics can also call a diabetes helpline organized by the hospital.
QM systems have been established within hospitals over the past decade (16). QM on an organizational level is likely to enhance the delivery of optimal diabetes care. For that reason, a National Diabetes Action Program has been developed in the Netherlands (17), and within the scope of this plan, we developed a stepwise, tailored intervention to be used by all organizations voluntarily and for free to improve QM. The intervention consisted of feedback and a benchmark on QM and access to a toolbox of quality improvement tools (step 1 of the intervention). If requested, additional support in improving QM was available, which consisted of an elucidating phone call (step 2a) or a visit from an experienced consultant (step 2b).
We report the change in QM of type 2 diabetes care in both care groups and outpatient clinics after the intervention. Moreover, we assessed the relationship between the steps of the intervention and the change in QM.
Research Design and Methods
Study Design
This study with a 1-year follow-up compared the levels of QM before and after an intervention to improve QM within Dutch care groups and outpatient clinics. No ethical approval was needed (18).
Study Population and Recruitment
In January 2012, the managers responsible for diabetes care in all care groups (n = 97) and outpatient clinics (n = 104) across the Netherlands were invited by a personal e-mail and two reminders to fill out an online questionnaire. At baseline, 60 care groups (response rate 61.9%) and 44 managers responding on behalf of 52 outpatient clinics (response rate 50.0%) completed the questionnaire. All responders were invited in a similar way to fill out the same questionnaire again in May 2013.
Operationalization of QM
We developed online assessments for measuring QM: one for care groups and one for outpatient clinics (Supplementary Data). In these questionnaires, we balanced the perspectives of patients, care providers, insurers, and policymakers. The questionnaires contained six domains: 1) organization of care, 2) multidisciplinary teamwork, 3) patient centeredness, 4) performance management, 5) quality improvement policy, and 6) management strategies (score range 0–100%). Details on the development of the questionnaires have been described elsewhere (19,20). The mean score of these six domains is the total level of QM. Within the domains, 28 subdomains (score range 0–100%) were addressed. The questionnaires for care groups and outpatient clinics contained 59 and 57 questions, respectively. Each subdomain included one to six questions. Questions carried equal weight within their subdomain, and each subdomain contributed a fixed percentage to its domain score, set to the mean weight assigned by the corresponding expert panel (18).
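The scoring logic described above can be summarized in a short sketch; the structure follows the description in this section, but the weights and question scores in the example are hypothetical (the actual weights were set by the expert panels).

```python
# Minimal sketch of the questionnaire scoring described above.
# Weights and question scores are hypothetical examples.

def subdomain_score(question_scores):
    """Questions carry equal weight within a subdomain (scores 0-100%)."""
    return sum(question_scores) / len(question_scores)

def domain_score(weighted_subdomains):
    """weighted_subdomains: list of (weight, question_scores) pairs.
    Weights are the mean weights assigned by the expert panel and sum to 1."""
    return sum(w * subdomain_score(qs) for w, qs in weighted_subdomains)

def total_qm_score(domain_scores):
    """The total level of QM is the unweighted mean of the six domain scores."""
    return sum(domain_scores) / len(domain_scores)

# Hypothetical domain with two subdomains weighted 0.6 and 0.4
example_domain = [(0.6, [80, 60, 100]), (0.4, [50, 50])]
print(domain_score(example_domain))  # 0.6 * 80 + 0.4 * 50 = 68.0
```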
Intervention
The intervention consisted of two steps: 1) giving organizations feedback and a benchmark on their baseline level of diabetes QM combined with providing them access to a toolbox for improving QM and 2) offering the possibility of tailored support for improving specific aspects of the organization’s QM. Every responder received the first step of the intervention; the second step (see step 2: tailored support on request) of the intervention was optional.
Step 1: Feedback, Benchmark, and Access to Online Toolbox
Within 1 month after responding to the questionnaire, all responders were given feedback. Their results were presented in a radar diagram comprising the six QM domains and in a table elucidating the scores of the domains and subdomains (Supplementary Data). For each care group and outpatient clinic, these results were compared with the mean results of all responding care groups and outpatient clinics (benchmark) (19).
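As an illustration of this feedback format, the sketch below draws a radar diagram over the six QM domains with matplotlib; the organization and benchmark scores shown are invented placeholders, not values taken from the study feedback.

```python
# Illustrative radar diagram of the six QM domains (placeholder scores).
import numpy as np
import matplotlib.pyplot as plt

domains = ["Organization of care", "Multidisciplinary teamwork", "Patient centeredness",
           "Performance management", "Quality improvement policy", "Management strategies"]
organization = [71, 68, 47, 63, 54, 57]  # hypothetical scores of one organization (%)
benchmark = [75, 70, 50, 65, 55, 60]     # hypothetical mean of all responders (%)

angles = np.linspace(0, 2 * np.pi, len(domains), endpoint=False).tolist()
angles += angles[:1]  # close the polygon

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
for label, scores in (("This organization", organization), ("Benchmark", benchmark)):
    values = scores + scores[:1]
    ax.plot(angles, values, label=label)
    ax.fill(angles, values, alpha=0.1)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(domains, fontsize=8)
ax.set_ylim(0, 100)
ax.legend(loc="upper right", bbox_to_anchor=(1.35, 1.1))
plt.show()
```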
The feedback and benchmark were accompanied by an explanatory letter, and the responders received access to the online toolbox. The toolbox contained quality improvement instruments and practical examples of good practice in diabetes care for each quality domain. Participating organizations were also given the opportunity to share their own tools with other organizations. During the 1-year follow-up period, the website hosting the toolbox was monitored with Google Analytics. Reminders about updates to the toolbox were sent twice. Responders had access to the toolbox from 1 April 2012 until the start of the second questionnaire in May 2013.
Step 2: Tailored Support on Request
Three months after the feedback, all responders were asked by telephone whether they had studied the feedback and benchmark, looked into the toolbox, or discussed their results within their organization. They were also asked whether they needed further support in improving their QM; if so, tailored support was offered in two ways. First, the baseline results could be explained further by telephone, together with advice on how to start improving QM (step 2a); second, the organization could be visited by an experienced consultant (step 2b). Before this visit, the consultant analyzed which gaps in QM could be tackled first. During the visit, the consultant made suggestions, offered supportive tools, or showed the responder how to initiate a quality improvement strategy. A maximum of 10 dedicated hours per organization was available for this purpose. The kind of support given and the time spent were registered by the study coordinator and the consultant.
Outcome Measures and Measurement
After 1 year, all responders to the baseline questionnaire were invited to fill out the same questionnaire again. Change in QM was defined as the difference between the overall mean QM score at the start of the study and after 1 year. The type of and time spent on the intervention were monitored per organization.
At the end of the second questionnaire, responders were asked whether the intervention had inspired them to adjust their QM policy and, if so, which domain or subdomains they had targeted. They were also asked whether they had used the toolbox and, if so, for which domains.
Statistical Analysis
The overall score and the scores in the 6 domains and 28 subdomains for both care groups and outpatient clinics were calculated using the mean weights given by the corresponding expert panels. As stated previously, the overall QM score of an organization is the mean score of the six domains (19). The 1-year changes in the overall scores and in the scores of the separate QM domains were calculated by subtracting the 2012 score from the 2013 score. Dependent t tests (or, if normality did not hold, Wilcoxon matched-pairs signed-rank tests) were used to test whether the overall score and the domain scores changed after 1 year. Additionally, to check whether the worst-performing organizations improved more, we performed stratified analyses for organizations performing below and above the median baseline QM score. To check for selection bias in the response to the second questionnaire, responders and nonresponders to the second questionnaire were compared on their baseline scores by Student t test (or, if normality did not hold, Mann-Whitney U test). Linear regression analysis was used to analyze the association between the steps of the intervention and the 1-year change in overall score in both types of organizations, with providing only feedback through e-mail as the reference category.
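A minimal sketch of these analyses, using hypothetical data and SciPy rather than the SPSS software used in the study:

```python
# Illustrative sketch of the paired and stratified analyses (hypothetical data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
baseline = rng.normal(60, 9, size=51)            # hypothetical 2012 overall QM scores (%)
followup = baseline + rng.normal(5, 7, size=51)  # hypothetical 2013 scores, same organizations
change = followup - baseline

# Dependent (paired) t test on the 1-year change
t_stat, p_paired = stats.ttest_rel(followup, baseline)

# Non-parametric alternative if the changes are not normally distributed
w_stat, p_wilcoxon = stats.wilcoxon(followup, baseline)

# Stratified analysis: organizations below vs above the median baseline score
below = baseline < np.median(baseline)
_, p_below = stats.ttest_rel(followup[below], baseline[below])
_, p_above = stats.ttest_rel(followup[~below], baseline[~below])

# Responders vs nonresponders at baseline would use an independent test instead,
# e.g. stats.ttest_ind or stats.mannwhitneyu on the two groups.

# Linear regression of the 1-year change on, e.g., time spent on the intervention
time_spent = rng.uniform(0, 10, size=51)         # hypothetical hours of support
slope, intercept, r, p_reg, se = stats.linregress(time_spent, change)

print(p_paired, p_wilcoxon, p_below, p_above, p_reg)
```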
Furthermore, we calculated the Cohen d to quantify the effect size of the change in QM before and after the intervention. Cohen d is the difference between the means of the two measurements divided by the pooled SD.
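Written out, and assuming the pooled SD was taken over the baseline and follow-up measurements (two groups of equal size, so the simple pooled form applies):

$$ d = \frac{\bar{x}_{2013} - \bar{x}_{2012}}{s_{\text{pooled}}}, \qquad s_{\text{pooled}} = \sqrt{\frac{s_{2012}^{2} + s_{2013}^{2}}{2}} $$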
Analyses were performed using SPSS version 20.0 statistical software. All parameters were tested for normality, and the assumptions for regression analysis were checked. For all tests, P < 0.05 was considered significant.
Results
Care Groups
Participants
Of the initial 60 care groups that responded to the baseline measurement, 51 filled out the 1-year questionnaire (response rate 85%, representing 53% of all care groups across the Netherlands). Participating care groups were spread across the whole country. Of the nine care groups that did not respond, two had merged with another, larger participating care group (Fig. 1). The baseline QM score of the 51 responders (59.8%, 95% CI 57.0–62.6%) did not differ from that of the nine care groups that did not respond to the second questionnaire (58.3%, CI 52.0–64.5%, P = 0.67). The number of patients treated in the responding care groups (6,130, CI 4,638–7,627) did not differ from the number treated in the nine nonresponding care groups (5,690, CI 1,246–10,134, P = 0.82) (Table 1).
Table 1. Baseline characteristics of responders and nonresponders to the second questionnaire

| | Care groups: nonresponders | Care groups: responders | P value* | Outpatient clinics: nonresponders | Outpatient clinics: responders | P value* |
|---|---|---|---|---|---|---|
Number of organizations | 9 | 51 | — | 16† | 28§ | — |
Number of patients | 5,690 (1,246–10,134) | 6,130 (4,638–7,627) | 0.82 | 1,929 (1,335–2,523) | 1,962 (1,600–2,323) | 0.92 |
Overall level of QM | 58.3 (52.0–64.5) | 59.8 (57.0–62.6) | 0.67 | 56.6 (48.2–65.0) | 65.7 (60.3–71.1) | 0.055 |
QM score in organization of care | 76.4 (70.5–82.3) | 71.1 (67.7–74.5) | 0.21 | 71.1 (63.6–78.6) | 80.0 (75.7–84.4) | 0.03 |
QM score in multidisciplinary teamwork | 63.5 (53.4–73.6) | 67.8 (62.4–73.1) | 0.52 | 66.4 (54.7–78.1) | 74.5 (66.9–82.0) | 0.22 |
QM score in patient centeredness | 44.2 (35.3–53.1) | 47.1 (42.6–51.7) | 0.61 | 55.0 (46.6–63.5) | 66.7 (61.2–72.2) | 0.02 |
QM score in performance results | 65.1 (58.8–71.3) | 63.0 (60.8–65.2) | 0.47 | 41.6‡ (29.2–53.9) | 55.1‖ (47.3–62.9) | 0.052 |
QM score in quality improvement policy | 48.0 (37.2–58.8) | 53.5 (49.7–57.2) | 0.26 | 45.1 (34.0–56.3) | 54.4 (46.5–62.2) | 0.16 |
QM score in management strategies | 52.5 (37.0–67.9) | 56.6 (51.6–61.6) | 0.53 | 54.8 (43.1–66.5) | 61.5 (54.0–69.0) | 0.30 |
Data are % (95% CI) unless otherwise indicated.
*By one-way ANOVA.
†On behalf of 19 outpatient clinics.
‡n = 10.
§On behalf of 33 outpatient clinics.
‖n = 22.
Intervention
Of the 51 participating care groups, 27 wanted support and 24 were not interested in further support. Of those that wanted support, 17 received an elucidating telephone call of, on average, 0.8 h; eight received a visit by an experienced consultant of, on average, 8.1 h; and two could not be reached (Supplementary Data).
Quality managers were mainly interested in the domain patient centeredness, with a special focus on the subdomains self-management support and individual care plan; in the domain organization of care, especially the subdomain communication and information; and in the domain quality improvement policy, with a special focus on patient safety. Furthermore, they were interested in how to improve QM by means of a plan-do-check-act cycle, which was part of the subdomain structural policy within the domain management strategies. Of the nine care groups that did not fill out the follow-up questionnaire, three received a telephone call, two received a visit, and four were not interested in support (Fig. 1).
Change in Level of QM
After the intervention, care groups (n = 51) showed an overall mean QM score of 65.1%, an improvement of 5.3% (P < 0.0001) compared with their overall score in the previous year. Stratified analysis showed that care groups with a baseline QM score below the median score of 59.6% improved by 8.9% (6.2–11.5%, P < 0.0001), whereas the better performers did not improve (1.7%, −1.1 to 4.5%, P = 0.24). The improvements in the mean scores in each of the six QM domains were also statistically significant (Table 2).
Table 2. QM scores of care groups (n = 51) at baseline (2012) and after the intervention (2013), with the 1-year change

| Domain and subdomain | 2012 mean (%) | 2012 95% CI | 2013 mean (%) | 2013 95% CI | Change mean (%) | Change 95% CI | P value* |
|---|---|---|---|---|---|---|---|
Care program | 82.4 | 79.6–85.3 | 85.3 | 82.3–88.3 | 2.9 | −0.1–5.9 | 0.057 |
Continuity and coordination | 64.3 | 59.0–69.7 | 67.3 | 62.9–71.8 | 3.0 | −1.4–7.4 | 0.18 |
Communication and information | 65.5 | 57.9–73.1 | 74.1 | 67.4–80.9 | 8.6 | 2.8–14.4 | 0.004 |
Organization of care | 71.1 | 67.6–74.9 | 75.9 | 72.6–79.2 | 4.8 | 2.3–7.4 | 0.000 |
Work agreement | 62.8 | 58.9–66.7 | 69.1 | 65.1–73.0 | 6.3 | 1.8–10.7 | 0.007 |
Tasks and responsibilities | 71.5 | 63.8–79.2 | 78.1 | 71.2–85.0 | 6.6 | −3.0–16.3 | 0.17 |
Teamwork/consultation/shared education/guidelines | 75.1 | 68.3–81.8 | 81.5 | 76.9–86.0 | 6.4 | −0.8–13.6 | 0.08 |
Transfer and referral | 60.2 | 50.9–69.5 | 63.9 | 56.9–70.8 | 3.7 | −6.4–13.7 | 0.47 |
Multidisciplinary teamwork | 67.8 | 62.4–73.1 | 73.6 | 69.9–77.3 | 5.8 | 0.7–11.0 | 0.03 |
Self-management | 69.1 | 59.4–78.9 | 76.7 | 67.0–86.5 | 7.6 | −3.1–18.3 | 0.16 |
Individual care plan | 39.9 | 33.0–46.8 | 46.4 | 41.1–51.7 | 6.5 | −1.1–14.2 | 0.09 |
Policy on patient education | 56.9 | 49.0–64.7 | 58.6 | 50.1–67.1 | 1.7 | −7.5–10.9 | 0.71 |
Inspection of medical file | 42.0 | 33.8–50.2 | 49.4 | 40.7–58.0 | 7.4 | 0.2–14.5 | 0.04 |
Patient interests | 58.1 | 52.5–63.6 | 66.2 | 61.3–71.0 | 8.1 | 3.0–13.3 | 0.003 |
Patient involvement | 18.2 | 13.2–23.2 | 23.1 | 17.1–29.1 | 4.9 | −0.06–10.4 | 0.08 |
Patient centeredness | 47.1 | 42.6–51.7 | 53.3 | 49.1–57.6 | 6.2 | 2.4–10.0 | 0.002 |
Registering results | 60.1 | 54.3–65.8 | 68.4 | 61.6–75.3 | 8.4 | 1.1–15.7 | 0.02 |
Control of results | 29.8 | 24.5–35.1 | 35.1 | 28.0–42.2 | 5.3 | −2.3–12.9 | 0.17 |
Processing of results | 71.1 | 65.7–76.5 | 71.9 | 66.2–77.6 | 0.8 | −4.6–6.2 | 0.76 |
Analyzing results | 51.1 | 46.5–55.7 | 50.7 | 45.1–56.2 | −0.4 | −7.4–6.5 | 0.90 |
Contents of results | 98.0 | 95.3–100 | 98.0 | 95.3–100 | 0.0 | −4.0–4.0 | 1.00 |
Performance management | 63.0 | 60.8–65.2 | 66.3 | 63.4–69.2 | 3.3 | 3.5–6.3 | 0.03 |
Elements of quality improvement | 43.9 | 37.9–49.9 | 53.5 | 48.9–58.2 | 9.6 | 4.3–14.9 | 0.001 |
Feedback/benchmark | 72.5 | 67.8–77.2 | 77.8 | 72.3–83.2 | 5.3 | 0.2–10.5 | 0.04 |
Visitation | 43.6 | 35.5–51.7 | 49.5 | 43.0–56.0 | 5.9 | −2.9–14.7 | 0.19 |
Education | 72.9 | 67.5–78.3 | 76.7 | 72.0–81.5 | 3.8 | −1.7–9.3 | 0.17 |
Patient safety | 16.5 | 11.5–21.5 | 24.8 | 18.3–31.4 | 8.3 | 1.2–15.5 | 0.02 |
Subgroups | 35.8 | 27.7–43.9 | 34.5 | 26.8–42.2 | −1.3 | −9.9–7.3 | 0.76 |
Quality improvement policy | 53.5 | 49.7–57.2 | 58.5 | 55.4–61.6 | 5.0 | 1.5–8.6 | 0.007 |
Structural policy | 63.4 | 58.5–68.3 | 70.0 | 65.7–74.2 | 6.6 | 2.3–10.9 | 0.003 |
Quality system | 36.6 | 27.9–45.3 | 41.2 | 33.3–49.1 | 4.6 | −6.0–15.1 | 0.39 |
Quality documents | 56.3 | 48.8–63.7 | 64.1 | 57.8–70.4 | 7.8 | 0.1–15.6 | 0.049 |
Management strategies | 56.6 | 51.6–61.6 | 63.2 | 59.1–67.3 | 6.6 | 1.9–11.2 | 0.006 |
Mean total score | 59.8 | 57.0–62.6 | 65.1 | 62.8–67.5 | 5.3 | 3.2–7.4 | 0.000 |
The six domain scores and the mean total score are weighted averages of their subdomains, with weights set by an expert panel.
*Not all values were normally distributed; a check with the Wilcoxon signed rank sum test did not change the results.
Association Between the Steps of the Intervention and the Change in QM
Only the first step of the intervention, the feedback and benchmark by e-mail including access to the toolbox, was associated with an improvement in the total QM score (5.5%, P = 0.001); the tailored support was not (data not shown). Linear regression of the change in the total mean QM score on the time spent on the intervention revealed no significant associations. The Cohen d for care groups was 0.57, which implies a medium effect size.
Outpatient Clinics
Participants
Of the 44 initial responders on behalf of 52 outpatient clinics, 28, on behalf of 33 outpatient clinics from all regions of the country, responded to the 1-year questionnaire (response rate 63%) (Fig. 1). As a result, 32% of all outpatient clinics completed both questionnaires. The baseline QM scores of the 16 outpatient clinics not responding to the second questionnaire tended to be lower (56.6%, CI 48.2–65.0%) than those of the 28 responding outpatient clinics (65.7%, CI 60.3–71.1%, P = 0.055). The mean number of patients treated in the responding outpatient clinics (1,962, CI 1,600–2,323) did not differ from the number treated in the nonresponding outpatient clinics (1,929, CI 1,335–2,523, P = 0.92) (Table 1).
Intervention
Of the 28 responders, 17 were not interested in further support and 11 wanted more information on the study results. Of the latter, seven received an elucidating telephone call of, on average, 0.3 h; no outpatient clinic received a visit by an experienced consultant; and four managers of outpatient clinics could not be reached within the time specified. Of the 16 outpatient clinic responders who did not fill out the follow-up questionnaire, 4 had received a telephone call, 1 could not be reached, and 11 were not interested in further support (Fig. 1).
Change in Level of QM
On the second questionnaire, outpatient clinics (n = 28) showed an overall mean score in QM of 67.3%, which was only a small improvement of 1.6% (P = 0.30) compared with the baseline results. Stratified analysis showed that outpatient clinics with a baseline QM score below the median score of 67.2% improved by 4.8% (0.4–9.3%, P = 0.04), whereas the better performers did not improve (−1.5%, −5.8 to 2.7%; P = 0.44). Of their mean scores in the six QM domains, only the results in the domain multidisciplinary teamwork improved (P = 0.001) compared with the previous year. Further results in the subdomains are described in Table 3.
Table 3. QM scores of outpatient clinics (n = 28) at baseline (2012) and after the intervention (2013), with the 1-year change

| Domain and subdomain | 2012 mean (%) | 2012 95% CI | 2013 mean (%) | 2013 95% CI | Change mean (%) | Change 95% CI | P value† |
|---|---|---|---|---|---|---|---|
Care program | 80.8 | 75.9–85.6 | 81.5 | 76.8–86.1 | 0.7 | −5.5–7.0 | 0.81 |
Continuity and coordination | 78.4 | 72.6–84.2 | 76.8 | 70.7–83.0 | −1.6 | −8.1–4.9 | 0.62 |
Communication and information | 80.8 | 74.2–87.3 | 85.3 | 79.4–91.2 | 4.5 | −0.4–9.4 | 0.07 |
Organization of care | 80.0 | 75.7–84.4 | 81.4 | 77.5–85.3 | 1.4 | −2.3–5.0 | 0.44 |
Work agreement | 72.2 | 62.2–82.3 | 75.0 | 66.9–83.1 | 2.8 | −3.8–9.4 | 0.39 |
Tasks and responsibilities | 81.5 | 74.4–88.5 | 90.4 | 85.7–95.1 | 8.9 | 3.4–14.5 | 0.003 |
Teamwork/consultation/shared education/guidelines | 73.9 | 64.8–83.0 | 83.5 | 75.0–92.0 | 9.6 | 4.5–14.6 | 0.001 |
Transfer and referral | 56.4 | 44.6–68.2 | 77.5 | 67.6–87.4 | 21.1 | 8.2–34.1 | 0.002 |
Diabetic foot team | 85.7 | 72.9–98.5 | 88.9 | 77.9–99.9 | 3.2 | −7.3–13.7 | 0.54 |
Multidisciplinary teamwork | 74.5 | 66.9–82.0 | 83.2 | 76.6–89.8 | 8.7 | 3.7–13.7 | 0.001 |
Self-management | 89.7 | 81.5–98.0 | 82.1 | 71.6–92.7 | −7.6 | −18.4–3.2 | 0.16 |
Individual care plan | 53.0 | 40.4–65.5 | 53.6 | 42.8–64.3 | 0.6 | −11.9–13.1 | 0.92 |
Policy on patient education | 78.0 | 68.3–87.6 | 84.5 | 77.1–92.0 | 6.5 | −3.1–16.2 | 0.18 |
Inspection of medical file | 33.0 | 23.0–43.1 | 32.7 | 23.5–42.0 | −0.3 | −8.1–7.5 | 0.94 |
Patient interests | 80.5 | 74.1–86.9 | 77.7 | 69.7–85.7 | −2.8 | −10.0–4.3 | 0.43 |
Patient involvement | 31.0 | 20.4–41.6 | 33.9 | 21.6–46.2 | 2.9 | −11.8–17.6 | 0.69 |
Patient centeredness | 66.7 | 61.2–72.2 | 65.6 | 59.2–72.0 | −1.1 | −4.9–2.6 | 0.54 |
Registering results | 62.7* | 50.2–75.2* | 68.4 | 58.8–78.0 | 4.8* | −10.7–20.3* | 0.53* |
Control of results | 43.2* | 32.3–54.1* | 41.1 | 28.9–53.2 | 1.1* | −15.1–17.4* | 0.89* |
Processing of results | 49.2* | 39.2–59.3* | 45.5 | 37.2–53.9 | −2.7* | −9.4–4.1* | 0.43* |
Analyzing results | 38.6* | 27.3–50.0* | 39.4 | 31.0–47.8 | 2.7* | −4.6–10.0* | 0.45* |
Contents of results | 72.7* | 62.0–83.5* | 69.6 | 31.0–47.8 | 0.0* | −6.8–6.8* | 1.00* |
Performance management | 55.1* | 47.3–62.9* | 55.5 | 60.4–78.9 | 0.2* | −5.5–9.3* | 0.60* |
Elements of quality improvement | 60.9 | 49.6–72.2 | 44.6 | 39.1–50.2 | −16.3 | −26.6 to −5.9 | 0.003 |
Feedback/benchmark | 48.0 | 36.6–59.5 | 54.6 | 45.2–64.0 | 6.6 | −5.8–18.9 | 0.29 |
Visitation | 45.2 | 35.0–55.3 | 48.2 | 39.0–57.4 | 3.1 | −4.8–10.9 | 0.43 |
Education | 65.1 | 53.4–76.9 | 70.2 | 60.3–80.2 | 5.1 | −3.5–13.8 | 0.23 |
Patient safety | 68.5 | 59.8–77.1 | 71.4 | 63.9–79.0 | 3.0 | −6.5–12.5 | 0.53 |
Subgroups | 41.7 | 30.7–52.7 | 38.9 | 28.4–49.4 | −2.8 | −15.5–9.9 | 0.65 |
Quality improvement policy | 54.4 | 46.5–62.2 | 55.0 | 49.3–60.6 | 0.6 | −5.7–6.8 | 0.85 |
Structural policy | 51.8 | 44.9–58.7 | 56.3 | 49.9–62.8 | 4.5 | −1.6–10.7 | 0.14 |
Quality system | 67.9 | 49.4–86.3 | 60.7 | 41.4–80.0 | −7.1 | −28.1–13.8 | 0.49 |
Quality documents | 72.4 | 63.9–81.0 | 75.9 | 68.7–83.1 | 3.5 | −4.4–11.3 | 0.37 |
Management strategies | 61.5 | 54.0–69.0 | 63.1 | 55.6–70.6 | 1.6 | −5.5–8.7 | 0.65 |
Mean total score | 65.7 | 60.3–71.1 | 67.3 | 62.9–71.7 | 1.6 | −1.5–4.7 | 0.30 |
The six domain scores and the mean total score are weighted averages of their subdomains, with weights set by an expert panel.
†Not all values were normally distributed; a check with the Wilcoxon signed rank sum test did not change the results.
*n = 22.
Association Between the Steps of the Intervention and Change in QM
Neither the time spent on the intervention nor any step of the intervention was associated with the change in the total QM score. The Cohen d for outpatient clinics of 0.13 indicates a trivial effect size.
Visits to the Toolbox
Participating organizations contributed seven quality improvement instruments to the toolbox to share with other organizations. In 1 year, the website received 749 visits from 486 unique visitors; the mean number of Web pages viewed per visit was six. The identity of the visitors could not be traced; however, the number of visits rose after an e-mail about an update to the toolbox was sent.
Conclusions
In this nationwide study in the Netherlands, 53% of all care groups and 32% of all outpatient clinics completed QM questionnaires before and after a stepwise intervention. In care groups, the overall QM score and the scores in all six quality domains improved significantly. Mainly, the worst-performing organizations improved their overall QM. In outpatient clinics, only the domain multidisciplinary teamwork improved significantly; their overall QM did not change. The feedback and benchmark by radar diagram were associated with a significant improvement in QM in care groups; the tailored support was not.
There are several explanations for the achieved improvement in the level of QM in care groups. First, care groups have been established to facilitate continuous improvement of multidisciplinary diabetes care within the primary care field (21). Almost all of them employ quality managers tasked with improving the level of QM. In addition, health insurance companies, with whom care groups negotiate on the price and quality of the contracted diabetes care, have started to make organizational QM measurements obligatory for care groups (22). Mainly the worst-performing organizations improved their QM, which might be partly due to regression to the mean.
In outpatient clinics, the QM score before the intervention was already high and comparable to that of care groups after the intervention, and this overall level of QM did not improve. The response of the outpatient clinics to the second questionnaire was lower (63%), and it was mainly the clinics performing better on the baseline questionnaire that filled out both questionnaires. The responsible endocrinologists were hard to reach, had no time available for the tailored support, and seemed less interested in QM support. In addition, there is less external pressure on outpatient clinics from insurance companies. Until now, QM in diabetes outpatient clinics has been more a matter of internal accountability and only one part of the total (complex) hospital QM system, which might explain why there is less focus on diabetes QM as such. Although hospitals are increasingly obliged to provide more transparency on their quality of care (15), their performance indicators on diabetes care are not yet part of this public information; only one indicator, on wound care for the diabetic foot, is integrated into the basic set of hospital quality indicators (23).
All participating organizations received feedback and a benchmark, but only a limited number of them desired further support. A review showed that feedback reports are useful, especially when they are supported by an educational implementation strategy or the development of a quality improvement plan (24). We recommend future implementation of systematic feedback and a benchmark because these are relatively easy to perform. The number of organizations that received tailored support was too low to detect significant effects. A process evaluation could give more insight into whether QM support should be pursued and, if so, what type of support.
We tried to include all topics relevant to proper QM of diabetes care. Neither questionnaire contained a question on acute diabetes care because this does not appear to be a QM issue in the Netherlands. However, in daily practice, managers may decide which QM topic they want to focus on. For example, some organizations might prefer focusing on timeliness (Supplementary Data, question 5.1.5), whereas others might prioritize feedback or patient safety (Supplementary Data, questions 5.7–5.9).
The topics in QM are not static. For example, if patients must be referred to health-care providers chosen by their health insurance company, patients' free choice might become limited. Or, because the care group is both clinically and financially responsible for all patients assigned to the diabetes program, conflicts of interest might arise (13). Neither example would be in the patients' interest, and both could lead to a new weighing of the importance of the domains in the future.
In our opinion, focusing on QM is only justified if better QM leads to better patient outcomes. Until now, whether a good QM system results in better process and outcome measures has remained unclear. A meta-analysis of quality improvement strategies showed that interventions targeting the entire system of chronic disease management were associated with improvements in (surrogate) patient outcomes (5), whereas a systematic review found that the structure of diabetes care was not associated with (surrogate) patient outcomes (25). The present study is the first to provide some insight into QM at the organizational level, but future studies exploring the association between QM and patient outcomes are warranted.
To the best of our knowledge, this is the first time diabetes QM at the level of organizations has been measured in care groups and outpatient clinics. This measurement provided these organizations and groups with a benchmark on their QM and may add to the debate on QM within care groups and outpatient clinics. The Accountable Care Organizations in the U.S. are also focusing on improvement of their QM; they are determining how they can manufacture teamwork and are realizing that organizational learning is needed to accomplish good quality of care (14).
This study has some limitations. Selection bias existed in the outpatient clinic group in that mainly the better-performing clinics filled out the follow-up questionnaire. Apart from this intervention, much attention has been paid to improving diabetes care and QM through a broad 4-year national campaign by the Dutch Diabetes Federation (17), of which the present study was only one part. For that reason, we tried to include all organizations, but participation could only be voluntary. For the same reason, we had no control group, and we could not control for QM support given by other organizations or for trends that were already under way, such as better registration and extraction of quality indicators and a greater focus on self-management. Moreover, a limitation typical of research with self-assessment questionnaires is social desirability (26); to reduce this, participating managers were guaranteed that feedback would be delivered only to their personal e-mail address, giving them the opportunity to keep the feedback to themselves.
For future study, the questionnaire needs to be validated further. Face and content validity have already been established (27). Construct validity was based on the literature and a review of seven models for QM, resulting in the six domains for diabetes QM, but this still needs confirmatory factor analysis. Criterion-related validity could not be tested because no comparable instruments were available. An independent assessment was not feasible but might contribute to criterion validity testing in the future. The care group questionnaire showed an ability to measure change, but this ability could not be confirmed for the outpatient clinic questionnaire. Because we now have more insight into QM at the organizational level, it might be interesting to study whether higher levels of QM actually improve the care delivered to patients with type 2 diabetes and their subsequent diabetes indicators.
In conclusion, measuring QM and providing feedback and a benchmark improves the level of QM in care groups but not in outpatient clinics. The results should be interpreted cautiously because selection bias is likely to be present. Taking into account that mainly the better-performing organizations participated and that, among them, the relatively poorer performers improved most, we can argue that the level of QM in Dutch diabetes care organizations could improve to a larger extent than we could demonstrate. Our approach might also be a useful asset for other diabetes care groups, such as Accountable Care Organizations.
Article Information
Acknowledgments. The authors thank Jolanda Groothuis (Knowledge Center for Shared Care) for support in the study design, development of the questionnaires, and intervention and Klementine van Vuure (Knowledge Center for Shared Care) and Kees Gorter (Julius Center for Health Sciences and Primary Care) for support in the study design and development of the questionnaires.
Funding. The study is part of the National Diabetes Action Program and received a research grant from the Dutch Diabetes Federation (grant no. NAD 3.05).
Duality of Interest. No potential conflicts of interest relevant to this article were reported.
Author Contributions. M.J.C.-K. contributed to the coordination of the study, development of the questionnaires, data research and analysis, and writing and final approval of the manuscript. C.A.B. contributed to the research question, study design, funding acquisition, and drafting and final approval of the manuscript. L.C.L. contributed to the development of the questionnaires and drafting and final approval of the manuscript. G.E.R. contributed to the research question, study design, funding acquisition, development of the questionnaires, and drafting and final approval of the manuscript. G.E.R. is the guarantor of this work and, as such, had full access to all the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.